[Binary tar archive — content not representable as text. Recoverable member paths from the tar headers:
  var/home/core/zuul-output/
  var/home/core/zuul-output/logs/
  var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log)]
ӱcLw՟'7].;O^߀'CgQ>fL&):D0$ac1}Њs}M,|Mg@g0Fی:@ʤ̄payllEaXJT_O#Š*[n/˶(6|V|q3AȺYS:[ [k70%#h8(ei6UFnOuuͫG/S >,&,0x|Ha}D{4U>/~,_Z_2NZ'?~>_g?q,9u=\B"l<?IFjU+V+ReUAKbXN|| Ջ?ϷOOz{B9y'o|c?^fd::/_`)I J[$͍ i+wyߝ"dY^Us?K 1!'O7 DƸFv T:*DH}v{i~nbBd q2R2'x ז}~MO1?/EqF !5!](K0/7(͘1EԌ#X,+x,fUfԝW,a16̹\J+D+ZDP2>DI"+lqb\Iqb#ZNWwNoUo{S݂%?Нt fh͞U,VЕm@WV=帷""Lh >p]ZNX QRӘ k ]pm,th-o;]JAxGWHW\b+U4tpa jbNWю`}+1nX-y`%]$g*2Ο 6>a~v㤘ΒQ*'GU(h4K8Abhmyb8j>ҏ_ToY⊇;wT e}*Y3kmH 1_~?ȇIfﰓlBKQvs)e%"ͮ.Vu)+" 5 䢶I.Q1Bգ!q>:۴mC;%p=UipDHӮz#n.H{s Bb(_O7č{86oWhVVOr8KOr%vΐUB(5xgeJ%rpzDM? 5ڮ mQ|wӝ5#G/rʹqJ=`ahmc;v Dy|cjEEۂux)atL`Rc CC"s&AS"'47Hъ0@~m{(;dmf7qX__&j:Z/k&H1NGl?mt>P![)X{#՚!nC:pNz`sm=ܔT}pPƄp8'ɹ{ZDS>`%:-qMA{OUyU,:%"Ӭ}S2d pϊ8TuTAZj`ɷ;*?Yy $73{q_{v0`vLk42 umᜌ'vx9' ׂs:qqaVADfWQ؝[v@уa!hZӢ6Xq :Coѵ | rG.U,eg캹Z>:1 H$'ud,3J[@ڨ7r7F c/;}L#)(gR4A"gt<>_-G6FqGݥX1Rfi] *. <%iH)BXdAhQ4xгcʤ F c-bD3<&`t1K8O: с%*ǵ8YP{F9͕Ra85TG f{ctJK"ְ7r6ԳZ5u.AOKtE1W"J<ȅ(_5qeE UA +kK`&AGL&RqM0Xcœs2 kk"@7'^r"H MBT XL k[#B8#aj~o.ȧ8+&uK1i6-BTMYswoVVſ,eOuw"OpF ggW9ŏga@`}R1xʫjQ, Z\[~GA7Xx}وtB)ӱ\570dO#Nq qŝMlaN+!ǿǫ ɂK5Q}{<;;_kP3:v<+ei`ۦfR6/V ӓk(5os>ksZ]o/.m ̯T_xw1;]| n#bdE̙2rt[fyOx2wuH\t5 Fa1^ْOՒ)9*wdרśըU.K L>?+9`}LH1Wʷ>Ck?~,c/FW˽VE_%+)B<EXI~_?;t'~w` 2>>EB?lEކ??`h0rhC \qS/7ƕXk1g^B4`ꐷvZ$;}u4|e8&)u~h^ K0n3kc۔4VDQ֚ 8>[斦#Ba̜$ e$p}9F{#+WӰ5daFf.XVoivTӼaMz.gknȚ' cH4UZLrdꇼ®O̓z7D2oqyQ]c$~N3Ӄo5I`C@L.UI3߮0S)H!4^F~EeM}ގsk:Fcj58`ՔK~7*a lM?ꄇb/ oqQۮ*G>~Eb se켐m$[l}N9iZt4A,vj!=j8)4xZLy*/~˳P# Lzmz[7; B,7zzzz~2 [QyWc˻w?𦘧4}G-ݐ3OaZ_J ʐcŌ(ƺqQ4Ͽ!6ǖK4_v+t:.VjIwB1=5[ƀ]G P{Rm&biDR ܛ%c>sEi$GMПybW7Ǎ{~|vP1PoD>|JԧOh*f6:@b`Loejw+SioFKP1#uhmG0/[ < FڊhL,Lf=Xl.nȯӎp\@R$,pr+Ի.0\jX7ŝPdsFUg%e9U[Icұtf_iL&eYu Hgy1K("'┺'.)З`hM4XkF15Y6O9WC˨o_nyUjLJB{ %F\+%o!)_@(LʸAQQsgB[H鎡^ӝ5kktFF\_1}52,3 D\4M4Xm넸EvO`w_ [5[<#:L{NĈv!Ă2Hj9VHtjk6]mxY0->`:>.ߞ*?sNUmY{ޙ^N4T{JԗS7zCDrY#Ʌ:᭶&=ioGIiÀ8uI6';bNkIʶ&[R7/_ݥz~^6*GT KIwi7gT.c;Yl37qR;K[\.f3iLB2l~uv&.X̴Q*c e2+gI1BCğ6g7/x/lvd9ph$b0Ip,Sa)<)vBHbUEx9t۱\ (% L{NήyΪ@n;h!?eywyL_'Ѥbԥ!Q(&M$gάC@ea>z-m{}S,zP (B f+{m,5Rg_1˼r8zpǦ3nM(q](b[sWO~kt &wY2S#A遬̬3IVfT!:- ?~-mk 
bc&<~<̩ȔbTYjoQ]x EV@ 54&I:D¤soDӠV}F[eg< Vk;,h?YZYRN~" 8k`Wy4P1$yZe6P=ͥPe9q}y`"_3%|L2W'?Gdn[B~զ-AePɎ{﹖l{>ZadϻɩldϘqq|{]ݪE 'Cj;TEQZl8Y2;P>r3 @>p֮'>%IkH D?`E˹Ћ5Qu˧TGuړ*U:Xi fhv_\*q.`9Mw&Vi/9vf1eN)bxЄ:/!kC]mVkeTypb+Liw kC5E!_Ƨ٧mNQP3 `hf)1HWмu:]UK!>rW&,Yydcbh AL3N%0l\UL`OcWMmE9#U40f]o}f~B>ɏVCCfƪk\o eҷ8dGwN'RInPqs8#gpr+ݷ;φ\~i J"(%љy2)-N[E]os R-Iׇy(4;! .}-&Y[oGz/']OFzla|>hRe*}U*' Fc>z3S"7bțWwf>ƾãhc,Z_/#Me]8>~v_+--VW&%(=h<7W _g<=xLɂI mRL˞P(`JK7 yn=<6qYoCTg40E",cz3A9+iL k̟-jk}T}OU,;HpŸ^ӠĚ:I#;T Z:͘"(Fy#iJJpp^TPҠmPV# $0YfogrvGIMoLDK/*cǴоѵ.ͼ)\INU^DэuS;TV$8M&HRRjYa) 6pjVFn $1B@$sD$1f}O;9$TX`(F%!NcXHD] =%B# b!hQ`ԑ*"irrjY.W:ƨzTG%6 V0)Ptȸ&"AiӁy!ܠ"b;hR> u =}`wHe<8TIl$3h:˅@G8{佈՜妃cbI߸ӷ̮)$gGI3#S)D" iDԜȜS(I߽PmTno* O,UMu1ۻ^v|/rw{utFB8fR!0M$tDX0acy/:ik.#k,V"Rc oXLxW`s(C!aF1ф'H'͉+Z~t4!ϕѰt1 z?Iՙejq"Cِǣ7frEl"ܑHUleABWD!x e}f 6TqP߈y/31͢AHail}և,V?<}+g 'NZcer/S!z uZF@BS:Fq/myQ]؄ A2 xJ;f2R k zayִrv;|~L`/+Ȳܘ9XK?P4LFgMDFM05 k%=Ha"POv$|AWy+ߒu;"E Rr{4V"HD0 )$@2AvJQ,30uκsֳ~$ ?:Z&"&`ހ^5 xY0pA28)l0?yeŌ ^3 $"Cuڲ!8vŨYAzNj1.& 5Vgٹ?2._JO/ܳ/R-D? 
KGQSF:L{8&}g)*k܇4( θ딃Uzr)Ǟ"ȟϗ8CN`sOR"]` #x`]\KA)L&.&SPaٻo%P0 8+:mkr L ^ަCxݙ\̰}So6_^].>0 8hz9L.\QӧiXqAvil zJmo{~+޼_|w?=x蜭/#Y/|}SG.$Ն?OoopAaUAbRj mR FJ0ч(QdGC#m$JvGVA֔C𡔑}4vi K K|{IO}78wC}Rrwc&)%+)JL<=_Hp4'MIUm*S/>?,^\/oCwW﮾&^¿]LUKXsۋ^5'Mx%hX MM} _+`ln}Bs^pvZaqKE@[cC_ْ;b1P>iR`錾6:¯6ƾKaexP\YS!GJQ&Vw3F3Ư;#!PƸDZñ[BgR;lN#tB;;Rj#/8 wܒf& br*{f0:Y1zdǤ1nf>!`d6nC̏wO>Yc慑!J_=_v{|yqiŏxvʕ/\Y~~9nβ˖ݵۆPշ zdGcerAƕe_O Jc1K}2lP&qx.~>OTCejk/Shɣ7InŞa=YRxS6 턐c͞n/gwpf C[ܧi٩[k*<Ϣf6ϷqWnVJ9/B)nnk/``פ45^eR10>fjP#qy?9?uw7]nnwv_b+iq07hzLk!Sm3`huc9BUYhKҹ0?SB @d_"gy('V̀Wo~?އ_^ {|4@7uɿ1r@}UD&UtFo ݷg~zݍG_7Zz1]RK7Tg.yg L2o:h*`̑o (r#/ _EHi<r ~"u֠6h/!CQ ͩq553ʓyJ a.އw=;BofaXYT/N|*hfEDs1ɠX=9EqNm+` 9TX|4ӓFO0r `٬1GLGmB3kKOyBYV۬*ZZ$rͅ|Njp(_PKo12gk NEBm/kj)k dOű o<lrʋmճz%dH> oyָE;q*D/vn8d3A=U-e4`k 5n7Db-K <ibtYڪR &V {/Lz#نt9myv$K[0] rs?aZq RW:Cdž)[>޵״l0b)O;DBh DQ Ņ 1b_uAevʩ kW7Oߛ|->h f]f6bSҝ9݀xj'pj}PјԮ$ E7wmO> [k7 kl5<`ң'e[Ƣc\gn0)np0s[E‰uX+m#9e/t/6_[+8Z3J87 (&h^PuIǓ3qi5=[&ph< kx ޿-r\'EW+^'MBB} ns0w2 %nc-}f jtwy#k=uђ*/ָci>G#z!Uxd!R&R/5eDDL ` Xy$RDo=gC2aئBSXuGW:8w$m^x g=R5wh5Z`(3*h2:ڀmT 5`!ӆo1O\'7ܓU}59Bb"15U_/g.G/T lJK : pݹvB-lʙ4Ҝ3m?{U@Nrt\"Ax<ƬB`(XP‚t._jH'Ҥ} LJ)+Ƿ+tK?D:/ |h<\ (C0!B*Caj3jX$"﵌FM,MVHKDk)0 NtfɻǶ uҿ'u;՞80$5AZX,@_vG$$r~]`TZʩR ˽)׆7'WNrq쨚 𢡊Iidsw6*y{c*B}_B))&`[)Q[yW12#XP1g&bk%8dҺf=JV4m*õ*_>*$T4T xN0+D{-"#a:0- 9+ a9># \HL`@YGc9AQN@#=rl4\ d6lL+ jbKFb2$S, A9p0I%6la2[Zli'ma{0Y gl)O׎\ٝ'7LcZm̎6-/k[fTHCKK)FDoA\tVNw򫧻r7y XHǞ "k/%%\2N,s$+gw]9;U6ݮ6C+8QNt_*@6 k 7C؇LU^tj* KnXҐ}}_JC6%(9ܜ?bU7</{\m!K"(%$љyJ)ߌ mUbA+XQdE_=;ڀׄ a3ѰL| r2J[cu1fGdc>6LΗ5ϮHqd"ƮYJ -D( c3 AA;h/x}im}3z7jDpNG01\1-5u(GwDF t ͘"(e # BJ'0\0PҠmPV# $0Yf_\b kf׮|>sF  ( 21@z湂|Bc*E`H!/ADͩLː)͎1 Ŏnq^kS%ڽRTg-EMfoFۄtRCTHI0v8XacixLp}i|]l||b|۠X%dm,X&9-0vA⼑p5&1$ўyXփXGkrW-[AX6Krv|l<]^ticUH$RYȴ !ׂ4Ԃh-ouZ𔊣h-Շ|7bdRqL(Baf!o!w]Ю;S}.d5Q)F ֫h2jxYĽж],p) %[륆nkY๭lI<4̿磆ϲy V9XK?P4LFgMDFMca2=Sq2D?_uZxjT}*V=LQ+H@P tY J1uf8 {u8GQ>| WŅDJ ٓ&ϫ8}5Sl^4c͈GL*TѧՒuW{;Vvw%A af]LݤV 8lU RwfrQ}z1 pB p 7ל0[C;>t5_[G8Z3X87ד(&h^PuIǓ3 L?lQƃް@ۢb\׌kWgf[9YW6JT0 
%AR-7f̖gr3[nfl-7f̖r3[nfl9Ol-7f̖r3[nfl-7f̖kR+2[nfl-7f̖r3[n&l-7fl-7f̖r3[nfl-7f̖r3[nN2[nfl-7fܧ_$Y9xC_#R1L/Ux7d kQƝ5-,(&{巯!4f6ƚX3kfcl5>g ~p.QOI-zZ/3km /0@(- vw/.y:Ȓ+Nܢ=CR2%,QCҔ gwfn2NT '2gߧW({UI:~;,7@ֳD/ .q`Lko:h㬯:s Oԁi\aF&0˴$9 5`W 7 Eq$$ *1<5D~ʊ uI:UVJ֌!Ve-9?fŘTP3ex~I\}?L~նCD*gRbrXM"_As#C{-h|Tz'馶輍/<ǒE*L<ihDc0e8%PŜ(aHs{.-^uVy}4ejǐ. ̲DG3=.Iz1>v,V/&'_e-_~E9zXS2PR Ɉ? k`ʃ?o|(m*~m*ozȐJYDD]VKV84䒪7ˤO,!K'#k <~J_1f /cL!2 Œsc`( Ĝqry`sŎ\37X~U+9&r6TCFudXDcZ)"V"D4}Zv+>wJ` l*ONr1](Qk!A.u iʮ(B`z6e>;$暁6Qk,eTH_I4BAʫ, d&h"1*tcTa#Ё(;tG([>ȹϼv976|ܨ4<0TS!g er+gbaқuS"Z xd|, ^v[Ǟa=ha)<) R !9 ,n앧 3Ǭ9%,hLy ny OĚ?h# n,5i#[7oÕp8Xa]4끐X;9*G\e,:u~?@)np09sH[G‰uXT*P:.@!ZGK!{48P=x:FY^1Tk%i=j =5_e5뤥-`/oR4|jk #$UuR'8& 8spIqQTجweI{xr OvI-=l`Y_@ͩȳBjdi: ,\\M]AZA?.tB`nrt5-Ukȋey3I7YDV ޗ_+t_Vv~UPH󍚿JY]+NKxd߾1+ˢSYΠj*b7$c뭿|5fb)H{ 'T:/S%Kin$g؂%7ih 5(V8ddXm|I̾Efnf*qufar:xb7oѾzЛ1~(>p()_)_^ㄩo?્SŐq-'e5.qqB滾, #z!Uxd!R&R/5e$^ւFQ4`ʘuSy~DvZfdP 3?Y:7~lSKtJ嗏r9y-Nq`(3*h2:ڀmT 5!1'9-uO44\{iD#}Ou[T]vSj 2#SsPJgt&yJgt&y=Io7vVCm0rU'[-L / 0Fʙ%vV4gH/u ϧsT|7{p^8hÂWY a'O$_@h0$ -,@Q#`9N]`HZʩR ˽)7Nob Sny{NB}B))&[)Q[M*Ffd j0DVxkQ=w]KP|q*W_6p8<Lg (CW fQk %I.P!`V0/Sh) K5Et!"'U2` @e"\HLe@YGf9AQN&=rl4=;by籯Ǵ (QHLd$&F:łKsg H\J5ZsZiNsEϢNyv}op|ڍooǣ K{/C`A|L֓~l,:S]ɵO ~EY:7$Qm-Xө45\a%:zakN>ʩ]uRWGl4B'ܠbͶ=)@-Ovy'PۉbNr]ȉw{R˻" d"̶M)HonwD2xNvc ERu 2O0ξ8>ξ>f{8>ξ{y9紞8>ξx |cnFW(}ʧ y]0galKy3|%gS4akwGX/,i)|zϲƇt.5`k X;ĬW逰@^ wE%Kʾ&砤x 7_T` l}u71d}62إ?(A>iSLPolI|'9~4ƭ/OQ}ut2Gϭ[ jJ@ԂjܨFxg'U#GhQ nk,ڧ_mq#؉ RsY04\`D$+h{s6%~wːW!:)q2sՒ9,\MԟeRKV;r Y$(~rU^ӷ^500"nƁxx؄5:ߏ?}HptB'\?~wl1;la$[0.f&Q@8U)Tt ⪴./T<(L4x4œLx̾Uem_E^,˛FwO(uY%*{_>*~YUCqs8`Q<.'i)`?fƹ/2.+Ukf@~ji@lN1〭׌֏hzYwt>K ZVT9+[Li T2hF t7#gEg׊)swxxW7dYSlԊ`Řjo;Ga %E0t2AQ+1G+NcB)7j<[֑aYqG`2kZ bu:_k ܚ՗kvż* ,AHMI#D5CuDjn&Lz!ܠ*b;hR>|.?3{GXƃ3@FPA8#\d}Ts*GދA栖Oz\6 Bbb1"0K$RH T) G# 6fjS}"hS 6kZT){Axe:#T)@UA]t:"0mVBGF7 GDžOHUneģ뷃4+NE>uyT\=!uW)@hu At7KՔfI!3brMz y;Kkj4Sj)$=h "A3Aᖘ0Ohf욷iLҼC&?k Am.Me.~L࠹Ad۠n K?hH&@Z{iZ!^lVfbm2,g 4jo5^ynmY< ED7a*O w.C FA3Cuc[҈i;q5_p~q1o/i(uz">PWGޔ``j6>`v)3:ep4!Sԉz:iiEq!BkFC^"CuCH8vŨYEK.|c<\Ʌ' 
Oz~_q!HBupf/Z$KӎydÏ@VF,IuD[ާ2 ߜ=q-I6Nὺ1<Цz9=N+pP+t|.c3Kۿ>O^gϪ@0 H %,e*lPdAM 鹳ΆY0c,ͩФ-C ,ِ:2!Q (m^aNOO&W'j@4=n˪MSL2(1@R0.[|=`k]kebepvs0I烹4z_zAۥ"Od/`N~ٰdceKh{nnqf)}4p] .8K|^Z|櫯d@!8s&6Y4*m$m9ޠ|(W^H?|w^}&޼¯/"nK<~}hpz-Mu U]S@lj3W kZ 3w C  HY/[:GZE>u̖T>1c(١d J4}}1M}S5AAȧg)E|SY?2ëa?e;u$'pPf.GZy'H[f:JAszwOl|C*}%Q&BKkM B*pza"J[$hi!T#PO~mc2ԯQ[tvj¡uUlM"p &t%]ZI1l!PmetH^ɶӮm!tk]!H ]&6sXD2$=&%wEctU o(UVU/ a?~/Lʏ7EǸ{q@) Ŝ9s˭#:F >?a.`ȇ$[gE푦5gg=3J6gBN)nҭӈj.D;Bǝvg 5r!oLryå O=cF@1gz4‚Sמ6s@SHa.7"v,ݔR?^A6bь3ӆ,[vN_7oÏ4Mpuz7tRTפ朗\h~]~~X;`}}*'$X3<_y~!Cs^: *Efmu/No_S:$T.xW;B,UlW-'%&r0EN93X?p2ߥkۇrrȶcB)7[(rneZGEdQ0A[HH!x1E5,م ]D0(323ɒ_a 8,r85`V#޴ÿv1wrFlo| Xp @L%?m t(fЁ͘NPś4nvvk{.O1oQK7Tp)sE˽J蠍wmT"Lln 7jRg>Y^F,B0#BXYL^jHA1h#p|1[Lo0S(Mj0攋~TޥN.wFp^~M!oq1VYE 62( F刊`) NPJ2MFs>'._p`dGL*/~R /ܥ-6L0}x|R }>h' @b3m@ɢB6EVHPDOT,!|yσM AHRVa S띱Vc&ye4zl"Ej4BZ"oJ2o35uP( W׻w g8X{4HQ~s{e{pEfBvGӳxWM Nٳuhit{5>a_ ?yHv1v^2u{wvT#ԼqN.77Z^견-/]V^tb<8Ь ܋ŞFˡdKRf.W-t;@8<fR y?9ݲ:wgf,ęK/_|"PTfBQQyLzks='0* x)׵Vx: H_^p2mox!0.F&^2X"Z+Ù$[`9Dv`gs52HQh'+,&V C!fH Y@SJx->cm]cٽu#|qaMt۫|\(_(w?^Gܹ8vf%%(9/ed0?yM\#IQN)93Z1'Hk#|V髂4'78C8k ?uI=R UYGO.e|i]C7mB\ƥL/*6ﲱx;43Pθ%LAh5K 0w]I%Un"(+ MIw"_s ]dH>H7 ;X{ޝ]%gБ /(~G*iKuJ-COH%H (fZ`I+G0ή$ N2lTHTst=.czXr}g}bgvH~Lw^"3" bO%{1'I$YK( es"" >w"[gyz]i[f{jh6;-v?'J[b?^U%hI'^=5Cs{5c__ zGtۣѲ1mWlѠu\W1}N!MNWAmT;M=.(bFWf~Qv\|wyVfhxoaohڦ6ia:#\1cMo ݐv$03C@2=|MwuPj]Z{K) a"TP#e_0Vj.{ڏVqg7>ZQzn[*Y*]ϓ4f`6pͫ4A`c|S kZZq5GAohЋ}ggWmYd ͷG<&0l5p~QEb6޸OV;&PNg =Ax0bz@W2o,0U=`㌷%ݹ Y%'"!o2 =4e"G%ur=)Th} 1M Xޔ} J*g<6<^Gn)g7/8WE@;k"{' [xl5oGu g.+mGj4M lh u8/ttqѥ.%Mop >pכz=pq =L o1\qD,, .S8piE?{tե5\pG.~JzzANzX_7\=DZmlY+p_c^;B7y^d(~w.q͟^63ff̼9Z%7<)VR1I^r>Zѕ[? U L%Q]7 GB|˓\7>濞ϡ~ - o~^ժ7?ī>&s}Bg'¾7ּY$3bf10K\%B12iҭlt,JNjS,F01ͧ\TCʴ9,S<З's]J%|*[VXM)vPU)8] ĚjǥuN{ro9RmDf!tg%%񡆘SToKӼ"9$C6&)Fo2,6- ͔bR {*KTd3+ 5G_RTMd:HcBSHK-pP)hrbDnryڤXX݌=kdOKJBJs2Bn?J Lvf9x.ŶJH-j*&,!xҘŔYK0KX"&.YUR&-6!fPZe+M4Pi~٧7UisdNСhg w3`VפH pX1JFQ(5#;HqQdiq@ֆi/jS)ZW$ gyU\"#cecT0*R57ipa9|=%Vp߰itG}uX/ŠACRӭiG":?Qp^G9]nf1o^n",ն~"¨ @bA Eۃeis(̐V~̐ewSO,ГY.]έ>2. 
SoTd&v-I Qj)JF*J -Ud#FlF8A*Naćzo.7wk ryƿzuUb߹eݶ#ϭdTIp|N@Wδ3Y2mٶu(v]Z}ۮKݷmGFv޷3)k %pw?^McXw͈$:ZiJ)MTiMVtl6<''+,|襘ӫgCȚOE޽6OgKyߏp7w~u[/FI2z}v~53={WF U˳KE=ݱ3GE.A [gzWWo<1Տ]q WΖ:~m7Ż_#]zhxoWJo<恸w-\ce5χͺ?4}|ln_%}]|koz5~Zвف M餳 "sܵԯYbڷty. !*^îKg,qڭ$7E+g0x2:Xm5Bѧvh!EΫOA@x "m?_Y5/c1'3n#;\ej4$Mr8i0ʖH%S oUzk^YGKJHKۏFƀ}ݑvqwKbB\*0&}{Ml*iNQ-sAi)VFO)BbCPC)z{ Rk和ܜZ1\$*SpUSJ0- s]"X aei?)SVUB)F幰!4)yAߏq-Sօa6^*1w[gv*KšL,M4:cC")W,4XZn9S1g%uQ:pBbCZW#.@ /$z{JXb&;{9,Xs)m6Z:M))#!0D6XbKL KU_q0@@ РyI5ksBpC2)_`lo V{ĉ=HXU.PJ7b=V ޸TdtC0Q/ h쐰BRXܢXV PT@t[ 0{?`>jva!PP'5H[u>U bdunX"LՔ >!as:i4 ^ӑաVɴ}PqV- r5(r4zdRZJw!haY8. ٻZ(r-b&!a1.bR=!nAVqb0WKՄӤ(C22A`>&T$2 =@\”"X!h0%*u Hড়2b _p,&" @ `WNw|҅ՀT?!W c2o )egG5w p@BL8 ̑PհCy i3XCfi }'+Tpo*ASDC#)# VS`T8I#=7/:* vJ usvR1嬅Ucޓ.5 Nޏ]T~)Ql5R΃p@RX,k(y;+aTfxZ4@;w!4 l$ jPx##. P,*J H/ME@˥ wS@y#y1M*,(t]xO(x$2H&rZ!d^elIZa0=ӅaaFƁ^ / GtksW<om +>&,XOd*P?Q< sam,9xY#QxXXwֿ8;9;AUEǒswdXk#p`1w@c nx2Ť%تPf:"(Ȳ!w 3+!#.$ tA%v p)zxi5YY5 OW$Hw*YSEXØܢ`A#EBy\Mkqd2[ì\B,lo.`a\ᜏ f=a%/kCB'!-7,nι{WEMm^^;83Yh֘,ٹXh}1 +QUU8om.BXXeIr^N w~>^k{_J<08<^d4)!q+`X#/qRB\!7`@~!@VOg7,qcll'4zu2zzOߏYۅ糩v!ʤ &[tv>z7J P@~o {U_P$KP ܞUVzNtu t7ϞWñ_b!DoxX_P^ܛ/_-xv{Zgjyl#FiPk;@-9ΖPs}Je+aXbA,[Hl$aX˛q#yyۢ1Oss|}rE/0j#0`6鉫.oޏ2&Fj4]FI:Ė؂]i񤵟-Mkl6tNqcϸ0gn5 ^oX>zgq;imzw=v Ѯ?k7 Y|ï:\|c$'R@kuLЛz}-΁kOZmv;SUwu\d/>6洛b`2!iʈ(*¥hYW rX>lFBHvWa6j Lg֔\2̢Yhׂ>J#\z߇=b!#tւXǦ%iii!55J:mk^ۘYIlyWl\ó+:kMyT'N\DNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNTDNToƐ oyJ u: :O!9&!A"'!dȄֈlP\hݮ=<16;l-@;"}X߂WJVh2Vc2Cd&4M&h2A Ld&4M&h2A Ld&4M&h2A Ld&4M&h2A Ld&4M&h2A Ld&4M&hB!uN4@qd8[+ 4[&sp`6vB\ׅtBe/ }a8Tt"HK 柯:^F S~>p}㥀= pl^8\\ypqq?q~ճM OUBs57>m;`(p!zAOnʭ}GG_vZ|>FdC؆,q }d-۵oN6/^Ґ- ==n%"z "z "z "z "z "z "z "z "z "z "z "z "z "z "z "z "z "z "z "z "z "z "z8Ǖ]^2J5Z%yYQ/4NyPdc&[@ <.@lyH$oIȫFIӛi~=c~-W Vsyl|ݚa1ɌKB&eK( eM>l[qo< 4k ,tB@;uNٹv9c6I}⃺W|U`k#CJeogk|Ek7 sV0:\oͲkuhe;A s 'L6'L-0٣8zf2aSL-2)L6kd=h,~5=&+4$I0LXy.lUi~2FRW~ܩj듹PK`WGhrqѭiO#5jC(tZ>]80?Ka^!'E~;>NK/q}=ooi}[YuONǗ3Bug붐 Aqix> ]Q'>߭( ŭ+uَQal Qͫ :xRMtX=!%"0}F,x:fe 
kefh&)&))FfMڡWoXCdc}Wn]I=ɥ6 …OdO޶:EQO<#+wLVnc ,].  +&i^OiqM焏eajOa>֏hN֮=:\6w=RۅŅR0i8ݲ@vb`pMOy#bF=[7C{-7Z !,;Jk'GtBs~#J m|5QTM'Sޞ6Nzp bD DM"Lۛn$?q &=;҄vs)V[)ddU)!:U MJUه2r闃Ng^+P&AWХ}T9UVEacZ#(rY5h2M*Ч ~-0{7ȣJ諘ti ʕ`2k.cư*&%(/Ԋ1Iyo}KSдW9c? >!ei dy%se6Ö*F)pb8l7x,j#}tξt O9`2풙fT[X;k\;4X/>MD|g 'x YrlŢ3F&T R;!mɗ:.RT(Aџz"NE;zjY;̷ zSc~t(<\R4?;긴> $*dY@ F^ sF#~C`}b7SP)\ٔ ;m~Y~x2ݦt] t=u?Az^oN&TԩT$cż!˘dPftE*8(0Z%.,ϙ T%,ϺKb,44!XR}}MgONf's+!`WeX Q^I;[&2d 'B]o}䞪@^qso¿y^C*/̅!{7`Wf!e^)z!j9hmĿGrM(JRYS2Eg yȘ.CRiAA|E`oǛUkNj<L>73g.>]cj;Xa@Qx|X_&>4q<@r) }cH+oK|Kgȕ| nќjޮyi|%B?%RLRʆU )J88 KGOuj6ƾ)KaexP\YS"GJQ&_]7͟oc<@!j Ǟzn ŞiBHk$awB5`Ы#3Mxxf+$GN DV$ -s5 $Q[녥(m^tPlBU:qcyߵ [YdW$cZad(QHLd$&A:łKs&9bkB,1wIZ'iVKeBEmt_;P6ܼ(R6sr!̅`l<bjצR D`Tr uμ͙ DG/lq)ςoʓ8-eOeWlqx饪kE6W>Oj#V#tH>ˆ>uTxd!R&K)#ʁ0+Cʘtc K /۸OTTL[X;zi|-ߑ>FNn:ׯa׾~M=a=Dp7>p_\w'Hf'``.E֝7/,XPX(-y2Q}ô7^MdJj+ڴܷO&odCֶ dz^R;&-i64k-#Og!)2W?|-`tr4G3.3oq?QeNjO?=:<6<1g7܌?W"ͬ3HUf,f f'rz3~3FLI#MT=9>X 54*Ռu }U_wCʫ&ݎFO@.)asB,BN Jp1ރ38 L!l )g!?*|Fk46zє9֤/E )5~5˄8bD<Z_Hs(s.S:$ǬFc~,dUiAB^5=Lpz4 `V/?@>}J܎n~B;K J7ĺL d`.02u;Mƭ3C`ֲ73-pUw Ê~@. 80B7 ַ7,g|,G`/^{SUmgn< z E1>h`u9ȕt0fjf' ;JF :I-dX{֋U#w~Ts~Nj5pkK7(,vnwݻy?MI#~3 -o/a<+39y%͛ ۜԏJq. g)vym/ܵ{Kߧl2^@P t8 gܞ@~q`L6C~9d ~H_N u)B'%-f6)\ofrɴB0r?XCD9R{ޤvAE>-LjOŭIIңOIPn}Zb%Z7-7- T:^sM`qai.ˉ;МpՃVqN (D`|OPa 2H9/.ð2(,>@AON&x|r)#avzZ{f[eg͎S*ٟ.ې("ކd jMRSJ`?'W'3_eOj.4s[J=( kE8fX; ;ՓQQn~Ɇ'gf}e`ah;)}ᅡ= CA_hs;(=!Mzs0b.9L6Z !u%'3ñV@͢l-J#{Npn][':(Y}eINuN6&AH4$̩kO¯q` -Xx\Ix4w2)ָ \|ޯl|+kErcpppMQ(Hrcu% E&Xz'Ǝ!`4mQ8dhAQIHF'̊pzI=Af5 \ڒzS/w'zA$#Ȝ4'^l)\!P,!|cy+Od l !H&eEU&w= .na)LHaE S/aj5G *o{.G)ZOkpDl/ꞣt`bvRtge!Vq; ilzj?~?)m20%1 ܃v@'h9=v՚ h!-RW `EZ pu[ !r*Ax q{2$pIk2$h)?vud]YSTWHh2Ŧ S\ikzLFw2@hf]Vs-c1KI\>?G4MLr.dy-j:Rjsfȵ1'HGPV+w݁ϐ9y LqY̙C:<NcʠvQLjZ,_s.`9O7J:ݯ^ZG9j/9v3|%gS4aI%shf߮J,=+TkAVo%Eҝj!)D6Jj+h}Dﭠ5{nhKt+0U5 e5*A:+(Nz@@ ygZGcJiK""Ja͂JP(9k? 
RpdʴH; *XfQ&Dc&H" Tn>V캏Man=f$yAPH(~H[aLp#2"b̑b 1;bVcOJDXbagu p#y2b l'DOlTZCwžE1+ruh2-/q0!?OB"m'+;8]hՠ=cZ5fsDİ޿)x’yؠF" M4 tF fG5`ut<:jb1Ch:tçl(1yP@!L ˩\$ r3~0;^I%q,,g78.NIzcW؋`wCv*\Xu Ղmc5G6c5..,Nuԯ/_Wz?oc(nd^̯v~zr[2_^]Ɗ}b>\?viWE~aWDzwy:} ַEy˞NOa(!wO닋y|+ 4~lJEH&5bK=G#qM{ȺKJMlxuOښ >RB]0V}|\ә* Z4'3'3'y\rmᜤ +¸Rti),QWHZZ]1QRtŴ}){ƣ+K~W]YNpqiߠ”ƨ+>[9LЙ-] Ctw{Kx-lR@la N@2{$by5L-BI]Ba]zsWʒ|={;Dc؁i >v`Je0±Cze*ĸFBb]WLYffG+R6Im+u^]rJQw+ܰ$;j\WioIµCMŁRZQ斁tEW=tbtŸC' M>w]1Jv^p *rKm}5}{8^m5̰RټZsMqajɣ 3ntmq^W0xv/EV DӃDJGJVSgJ,Qz_En_ G bG}]2Ǒ5} L$鋸i!)/M>쭜 e=jsv< iTt5F]YM<:w_b\#FWLLbʠF+0xr3)ȓEo!rNNB%R) dܡc3ͨj31kw)٠zhJP[hc)EV0k3vNuz^6(D`ola\3_̴N>aJeD37Q4㢗HK So 2tYg\/FWSjWt5B] ^"0`+u(EWLBZ}ߏEoO8AxW0Ú}5vmijMЕ-ڶ+@\+%H ]QW( SImN`лjmmBSmbP'O6ͩmmJJzܮÞJR> //_zt٩PPQ[ǜrjXX|@e)>m#D F =0RNq 0\=`A"09ƵZCb`F+f6: )9+'EWLkaJEW#ԕ< U*RtŴPt5]Ql Cuw{B(rLP! N=%gŴ,FZT.)u$ytG0hGn¢.(uO@~RwB61{<)9B&W^))}5*S]j#쫅hO%iᛁIUJ>-j"IU&Mb̄EZ)} ~t6,zw58$`'h@t()o+Wtmу'+JW Lrtqx)bZuŔN]PWvȧPEWUsS]9VЀ:ۅXW|&=mrsGC] g> 8(1YhϽcJ,M>\"1b\imbA+$ȈUr&ָuŔXAǨ8 UvFZ1bZ~4Rz_t5F]9Ŭf&݌ӯEWdUT> CŢbmT֞􏯦x>Uo]~HI&lVbD2-QHAŖN881b\")SWt5B]2@J"+FWccZIJ1ѕ߰R6incK Q&0ЕOЕ/ڶAA$` ^@FA)bC]WL[NĢJ+F".*#EWLklbJ닮ƨ+p>n9UT/56Nϛ‚ia*_9ڦunbEzp>|^d6иTBm=X-UU m2VʘX) Vn~mj^^WUg궑(_/.|y] m_^]uǦQŪ~L.s|1t (4OX]oylO0Lz7n}6#"~ap?`5Eպ~bk=Gi>lo#Ae iIƵb:L8GJB8l4i-{}x]?Vc$L^,j"H r-S+lQHe?]1p%kҫ2:F]9MЋM2S)4R9u2ӄ i3V)9'ん~2~Ss<7^ISؒ뵘H;X$J*;TG5B"%1;T#m0*.;Tǩ+B$i25{rbtqɈ :U[Ѣt6,p08~0_I%oU -(u֙%w EW=Xϖ]1s쮫Hr Utu]iho΀]1.]EZ?Uâ1cz[յQ6JWjAnܢQU eA ,YBO6nm:3BoUoUGRGR#~O|'gjQm% 5ۡ*Y VҜ^&bz더^qaN :G\d*ߥq%St5]Y$銁㢘%u*w]uTt5B]91rL6<8 h_I̘TM?'dIU֪Pdɤk5JV{{ إIPy=zEF4 R`\9LK:MR HW ]1.))`+2_鿞O'7k[:Gb?v]&.m~݁w݄ gp? 8 U8 ֞i]=SvGi5'HW𑄍*)fXpIV]QW HW{Z()bZsS*RAl[_u(EWLK*RBnXH"E9sW]1P(*1+銁 'EW6 D(,SW`ojEVwˋnx}/A?W[_-3:>[Wӻ%Ϻ]^M'?~q?z4-\u|bO6Yݻ2זZg|_K$,cMVC|?/ϛ[O޴۟M(cO'zFOexsv=6ܒehX:.wբԣ쩪pvW;udQŗwg,^Zٚ+=|r/`y|<`! 
9`&?܅Ko-^$\x2y~ދv]~y>kznߴ4ܼgmiqryO~A)޵^ВqCWz#&ie&l}5{6Fl`ZrSAj[]ANMF5d\r1vɔ>|L>[tMx5tFd`"1JVcbJEW#ԕ IW Ery9ٳ0鴺tGĻ{rﳛZ|^ ~o0竷\/]5snryu˛kmުٗoedڴ|=k^׺Ɲ!rjWpjۍVes^hZ.Ϫz'lg3t|oWg6֫ykk) ?q+㍻UO |cguMKDuhg d|٣ Жw'/neke|Y5Lvy]V뷛䗾dDj9yk snB~J{}^z]C;c[{ʴfec*F 5fjc)@Kyr*oIAYyDe~'mQ㡺!|`^i/n7>ZN^/'WqMr2 PN;p~D\$4`ocKt-VD?W"*]Ôٚ+⬎޴7 N1.2UߖXo׸ڞ+V_ٛgzI_6fxf 6bF!t")xLj(-R]Ed)L7fni&-FhTMo_:SdӜu&#=}_k|9xcWW -EaLKn6fqvLD_;ϖ?iOb~4G~4G{-M#(BS:DMsenK+$9SIRXe!Y(+KiiZo}Eyw?%S/~,F:49Dq:KEi;o껢җ~}w/?Yd(Fo/z|`o<5 va}/ʆ3 5)D8` Tb ~"iSf@+yFǰr:'w*6[E8v#b:3$i^e(1CÜZ{2'Ca07#&e? ]nrkM=FЭ@x#&n#/eL)FX"y![V0dccG.y8FeUT׼dhdJ,M.LIk?] ʖL<|L!h&S4:h',4%yx2zIRu_"SDM%*J94,r0Zwq2} 7 y>~!g: gwNԟ]21++zE@X^nJFl*AbDm! -Sw TKF]^vXFo^uVo(!?@:&ؼNv{qq]҈VCOU5 G.tۏ[(EDWO WBҧa^ϩϿ1 g,y!PDgױ>}^\tD{jQ@@ȧԱN0ow,bН||2KwX)H_3fUm^_aLFƄ$O<;V^X>36~ /21V )yW&& 9*vBFd9);l{q!12)$uSf,e4X.p(R&f8Ŭۣ}N-|iz9 o|N-A@Rfjc桻̲NlI?՚'cTZrdؼ8B:GHLSJDBC0̄z96$_0aEu5E1-;)6WאY QJОMʽr:b-Ua%LqMt2D/MnwFZw?zv#.,RYpkt:@f%TM˟J[vƝ4Jg= Ӌv y?6B]Gp͢BIkA;601LJzSOt#)QABG\Q1@#NL%&-0-WXzv7EDynS`9{`OǑ6Lm騛ѝPBfXo\7#n0̘@[e1J ]_eK32 oQtGǽ`TQQ `xg4Dqd>yfZ-?diSmKZRo6b)a`Ω +ۉ:MƧˍQ)H夑[{JoJxG yj& d[0r2`Cv)F8z9 Io^SUnx"YKX=F @,(^ Yl+!.(5mzuZ\ 2B3xWZ Rj7c2kqժ8So(f ֦aŁ,c7.5c⑖Q!#t;7,i+FO)^cXM[~j <!1kŽ 6{>'u3-ӔWVk 3;ǡiX_Sq.Գ$?M|>) I꤫lfu$Jg%QFR$VS,H_cŻ E8wa1G}ASuIaulhW>b0n !F0y<\TyqבNjCi =\8Yt"xIPfq.,TO(5M,:8 J"/cGo5Y»y+9{+_S['EI}0/06T'Qf(ՙ R?p_\:HՃ׻bɢT`P쵆"giW"UrNI5jeiiA;nNDU \NPt‘qs/04d`b%~q14df3U]Vܶ˩M!1\Q0 @\pH-°#ʹja`]I:Ryr8}+Q[sܚ.|!Ö&f`lėGT n;?bUKHjꎒJ)s)j`_QMT}A0 jvF RLzһU??eJ.'`i-4BIP8y&sxz\;/Ҽ1v)Nɬ,> HaO1{ Ӟkt$z?gd2Lh'!&@I]k)b3zƜ0NJ{ T,5L+p`5Y Zafri >E8XJSQG)@i;ܞ{ʂ6M<NqIPsY^Ψ0^(xc8>LFp6C({(&Ysx.C4+UBNdFNdv z^^Nkia$FUbjZf<%ɮT0aXq#KUd2eK|].؞p{\)<:qձ?g1NjM繋Uwk={A ̨fAzx-"ς)(l}'l>DqE+@+A "ošO+9V?}trunXrls)ՂgYaEzǓŋ_v㛓I+)b$K$2E;Uۓn< ZP wp) |ck-(g!p 53,DYsM.9@rNք&[ef~fF`gώ"0f=YzJQ,bߴz呧wMc+sg C3"@Xuп&5)o2{Sګ[?2PU4'qǰѯ? 
ĕ/R2*ި\xrEݨ,_`gijUfԨ<74'(H.&$9pXNs'-c_~6 syr%ŨJMʫJB:O1+ C 'U٘W-1W(pefsU5YPItD@\"hd׮-kNq@/AE-!5W3pcf2c ̋,%j@9GTJ0ǩt+,(̤T^ͦe9M& h:@IaFoW#'?}~U3vm@ /9QG|6;15Ϟeb;+owp_?>.^fT mB3@w 4d_[H9x U|$+O tG)|Ui F@D٫&Rx`p׃tEBW1 1 l~:2\KeB0$E L/S՜ U`gyƜ8~z@A~z+:a~ֆqi8eRsMݪ6m ڒqb;@[aO:WӜ:`fK#ۧ`r×{R֔fpQei` @8 ;;;]HZOa֡iuf,K522eTkRW"HHcu,J}O 1X/Hj1VgO b(VS?!B * ?_'F:(%ا2 //A#u9,}+14p0~\$X;]q<$OvZ%=HEim?7_⨿V.fY 6fMp9pVC97X1 sk dw~¬k\ YEGh\6S:.CJT-؉5C(hYԽKָpE%р_ XHȮr$%xq5>][SK.SX>RhhE5FH!9`E8ո%TP%Оr[o@'~ wů>,IxZ{+ͯ Jm!- >RK˻FR jq"rz67xƿS:T "QS|r@-4Ryl(a*BɹO:;ݱ76d#X_ xLPVbҹf4(6uɊ,Cjs"vk)UblQ(P;Pb\vzs3T2dHyg?&7({;MHZNfv.Xm:dE4G0`)ݎf @\NA"6h}NfR3ťC'"e~EcorZZnx[Ps|Z (E!SIb2OIL.G*Yx)np0+|N3ijN2H&J/CU 0oaT2Z\քf'rȊPk|F"] ٧ThkYH[V|4*bBe2L4Fnq̈n {@p7%!ҏ ގwS XW^]M nT?ql/R6En c23F3"CLrqͲ@W1dlJ&)1(٨z}q Οz y{yYV櫿]hGVI2?W/cIܖ@gg$4)7j0Mw#]*%ysM׮nqxf7*-&\gF/r98BUꗞ_D=kl"i DH@0{_/{@(,P(/åD5/G]rF◜c|q~S\aZ٭_ayDžQ ܪ,k/Ib{Tn#Dr$R>KM[<&՜)/9KQ1Cv&<,ok!?T#QSe3bju̫S2VTodiɉ搷)󼴌qbyc:#q*-/βyTh̼J\TkE: i{h'Ii{xf>'7YC j= 8 Tfy("+uFhqוBJX3'}r_bjV{4qs95UB I hb*(ɕU4`l?pIeʛ2ZX=&߹fwGOlB5ʋǷ' WzxQ6q>9ˤy8pNVUSJBTQVJ듶(bR Yx0Øֆ1mҹS^B"#Ko6uC Z]?R0%{lo튙F+t]16}ݧs I/+EDK1 r&SH=mrcK0Z&]<|RY2(HYyw`z> I U*x7bъ{ o= /i*j8w0R?4‘=Ё]*B k5\=wڂi"sJ)%M oȷ^FĘP3!Ź` FPSD ּPa -S/`…gUwqAO&%gP`ekX*h NVЀڤ u(5ct\?ɮ zŻ[r,.X3p?$8Lo&%w 05;,]c_ڊeD{Ak-kMղD{4j{/ROJ?mK# d:AJ/_S.hWi,%ƅtVspȳ2/4`2t!j* MTjGm|i"z\˞p;/ =b0ܵQIy0)/)*p| q4 {?kg/7SŽrHFLpҀx@?bHp8y96djt5z&]$ꌙ,,#pp@VoRj挕!X/4uj&!]ל&`q9~xO)',*7֣Sf4YPc[N>y'.R 3ԚԒz4ay8\.D?5c6#D@딼LbaFS0?.ee%VѨ}S"'8 ~/~|SJI@M(lCib-crj R*g(p5K"`Ė8 3Q0NIuWpFBQ: $†YE {#PI˔lۺ)jCN׬{A#3s Yph9l&$p5Tpy(=i qF|hK]tO;fǡsS`O'\~2ٯTgY,/vT0) Mv`U.A ឺ\r?7JĿ|1M>N~Xy #) E%0O\wAB,I0^$ĮwృgzwbL^͢j jՀudX%+nxH9;PI/X} 5ioAܙ쪟S5Sބ #-;JtߊTT^Y}^TE UPdONM4`tWݒrd $:' Kf s*YlB*y0#'/;bVwOfT Yy_u%;ߧaCԛJ^jw4} )i[LkG>j3fHAMY[!SGtʌe 뿼0hL@ͪQ*pA;Wx?ڡ)Z׽)R9gjVOJ4LzA*qiXT9 {rcP7rPK~2F|=$l3 D6`t?WQS5@UiWߣ\Cɿ>o]ޔ^9z74ƍpg6=w22^ARe%w1:SƘ/QijwyFZ]msĬܔ4kA)_\@ԴM#j1 ڨFn\/Aԋ']'-2=g 0ԯ  =6\D "^mܵS!c6`tF54cm0OjI^lӅkCی6VRT07YpjU{:QpL AF h)LO*@5TxS;\!)!XL3MRݯSFP 
Xh8C+%UM>YD:ǜo*mX^v̊;xK y 5,؟[߼A`AR)]Ae~ZKb͈\q_&Axq@l1.bEt+2/?zV3_ܼ%f?zu^hoI[Fxp( =1E |®0 )?ׂ19Kڋ(`3.f2MfvN췿nxYՓ>{v}jQpsfq?oV1Կ/6{Ţ_2੘ưc4 X* ƽ}ыI[q 0E秼oF3Uj' BQ7`|^鏊Xa~ȆeNQسng8TlFf=65l kC3?*O@u᯾Ҝ$}TcәLoc9ׅX+MhH,R)X]2ÐsƨygeyY@'1H9'1j+c Mֽ9MN\wc5 LKK|sXP +ɿ9\K^ˎΜdHn4JhC[f .{~hv/ubFH9;ocT"?>jmn'h;׍|gn*>- YApp]l0#N2' 6Zrf0+y s@~JJJsrdǤ&*klc\ K6Uٔ_u+=ۦj?/_w<|{uqצ8+ 6P Df@"/0d{7.O?1)C_ydvs.₟5D';XK!JUl. CR;'T>1.6vnybmv" ́^\"f\""q^s//~ʕ𞔣ǗîXwo8Mg2 '|-)B-R_ʄzGL7Ku$UvΗ=K(fkSPm0tA|)*Uɘ;c/?hcK% j`౯8MLIL-GdXpA -ܵ`؄_ u i,GL.2Ӏq1Q\uUkə;MZRI BŊlXZBȨu7ҫб\<$ǡ1R{[4)6Lq]_ ^m4`tdHl *31RuC΢4Me뇗62rPE 1sd3 ,Rӂ4`l{&VR f; mǁiF9I2&;W#` <HJ?õ=x}g^|o UDE!h 5y=KZybNn%k_v;[bF@=@(+- F˫$ΊP—yR>aO)%{Kh"}:záжEcP} 7A³$/klHѠ\7#@(:" jŕQ#R74ǘ0!*Ր7@K0ٖĞ2߀ڹZyNFq펡WČG* ?#lh_DiRma]$5-.d4} )wmmX;;`%@?v 6ݘlbɊuq$Q2.VIX.RKCw+vPme[/Fe2ܠbk32,_hIP\EAiƜT&T P` ]N8#)B4{Vv:R|*ހXގ`VShS؉$NҠ?s@}AIh~BǮyAɠ6dzMB4BF]̬.ÝU6k&R/j'=]rԣRˑMPp0cqB ̂\`vr*Ԧ,`~<2^PvFV"h,/6h ˅lU@ϳ1>> "wucr *b j!0c .HʲX!#.䭄uF4x}U:]B~ռ<>f- .iqML8r&Y>ď8$a ܏>:sxXIBVRƌj qFJ0xRq4ysr1Ϋw1HO`9yCAt-TxaqiсM_'R^zxԉm׀Ɔ#C,$'}^9O gYoƎ)<.)Ibk¨2yd$Q8Z Kž9sH+@:z}0㷉tzןw|efCX9GNhjӝUIx8X~r.r=|6{2V+(tX Wj`Ga/ǢaՋdJ/YD%xAN5(a48vFodc6da<LqtBgľ-Gx~2&*X@-BSBFfQ(Ké-Vtow%&e1|ui6ד".WI[ˤxI$[L Lg5NWsb:9W&ӹ;eߙ[kڧq骽o1uq*E'̈́=nwauWlfƋjT!Td7rϛFTKPJ)+~:0#Õo5Omݫ+OD:g &~SYWN*Eˣw,dqy/!#.|+NaBnM>M$Zh w@%%OD⳼v|nVx/‰c8YIJ׽i>l|t rN!`zI)zr#dN ~s?BaH$.qt+ =#uX%^R@'}xw=[8uaYcm i^Uu5T$ިT%q*J])خcVLΦSt{(zRqҐ\:= A3@ 9Dɪ6f\$*pRS 'ifFg-ާXL :upnzpYw"vkJ;ܶWXu8A(^oАz|? UHZNg~4и9> ً8ZG&i't ùNV"i[Z;zA$[xɂ1j`p/(>؂d)&^ER\Ć4Ah~?0}%`bzg'C_#h!a3 w090^ln hu;>w]n8lYcD` Rr MOi*rUS#X gGq+09)?NT܉qe{M hT\^zݝ8v{LcsHOvɩJ.5bNıJs7gwvU?No ߺpze2-!3]8aژ8cab6\r r#2vzu . Jcg\hnAJ4S=aD%r#)A:wz")bv"o ƫUxD\K

\%ޏ 0vĶSԹ# ( PHYA_q(`i:֬g '^VR6a B: 45VwMtBSAUU{U/ OFq=Xd`& WFm Dϝџu`J!C[VVcٜ|YƊ*̝ksU(A5 t\óT#;/zBc'1U}`t v}hQ%{|[PHsB5~2FO9/ Y<+V6:?)Ts Xo(G wy {Ɵ?/z3Sk=uC' lz ͇+WV9XPpD㔬_cE))xGwt>fGy;5@"HMnG[=tyj ʯP ?uK!xPׂ øe`þ\qnxܢ7G wsmW @M/E (;>WxuNݡ N T޲La1,mc Vh1PS ; -,V*6t%Rɸ4x}Et;TFEFBE@VHa),ּ>TRqU m5zBnrZ7)T/agr96m`_B}x9a{R _?-2+\Ǔ. s 4މ6)(ŨXK V@aSZZ hfo@&;BdRkMNB(Q*'q{`UH"Tb9ßEpS^9 ɕL7,|MBODž9]80Rf9 Hq×%9HjFsH~(}UC.l}M0d'!)8U)yTEn@I5T"B5sf #hSmn=9FkI ڏ=M˥!LQ`aBFrm.R&~W[@wD&";V*T`9237onj""S- *z k] Fx"At{ACm^}dKFð*lٯ]ON(qCдaP1} ƶ޺"i9~V OX8(Ϊ_?;؄hY~GV4BЯ2vJ=}J󎔝 H4eb܃]PJ?o#=C_^{eifus?~ͦ>_<:ٰU=0e9%Htamy/!1[vS?.{f~)ოeٹ/N[ÉdI\ϛΦe֩vh׋λ3Fe S۔~]}^rաxJrh\,r8&w| v=7]Alry %"gC 9P\XTkh2nUB3 jj'N5:E1Pb_X-.):X[^F:d=-PՐZ c  r0\9Te0?d50BVNo u *HΦ8WA$gS>@s) Sh` hk>&`o^\.te~lg@/>v70خBӖmղ}! ƆDxHPR5Է)J i߯|>ܒ(`3Bo191@2vMX0G߀V#Ir NJuI3{;24{H>Rc7zsB^K!oH<X I>-PK 4zg7d22H"$DrRTȕ3f.zm—3%豫Y}~TB4ݪmϜ-`!`rk!.0fox1|=0S# %V煐 NK: ( `2a]GU0 F\zL%jlԦwpRJkV_efA(f>)>.]J+[4Mhѫ 4؉k >Ws@gba:P^F@?%z OLgȃ$ p o;;+;#G0 $+x a^T Ur#U bdZYo$"Fdc!NZ)Q{dK$YSUlZp2\Ο@KĤ$K;eA5oRèZ^Rq@DU/mf_?۽@V p{?9ml6֝u*J{ j~G;|,6 p ;rP(shKe@9c ʵe\2RVcm1:Ė+ׇuqVWؙ(zX}\hgUdR"+jFOm:\fâ3iJVzTUQq#vfj?{ؒħiS׫4wfj4a3Є$4eBC@ոvls/F00ҝ˟w)^ٴQfc~6ZE}r4s'7 !V?Gӝ]<9qy4]8Pl?'Wd-Pijj$9 ! m&;F;!co/N#8=x4ɠd Nr1"hts!ע~^?oYҳ/XB2*_UJ̈́U= |1ƒR#0_,&ɒ 0xq(~oFȏЄhAxAEURImg@ٽ6RQwΫV4o@'(G⌐((&ض%u@猲#'K$P*O8vEy;[x3%.9ѩCqXj8Q Kʚ%C}s<8xgFZ|n*+yRjTq16ت;h|XUJWdKg3zUD(F K7-@ewy#5^c՛(kЌK iK=g0{6rˑƦu51)ɠ?~[ %WcLZүz\\JOuZ‰4N*eEGa"1 =YQ^߈V͜؛*a~<1 qg[+pV  -LKK[.>auɛ=.5jz.]u`uM7y=.JZcY( &t* ୌ*޿x><{ d6.s%TUXK!9kxo\h'is&#l< jOS.p }6K(R*}1qAiB G/Ęo}R({~>>lvB)G$̚jGwnvLf^钆.;z.##?ۤVk;hbZCYïY>(g~J;VZ |)C%`89t'EbWWiW+S3d_}j I2#uÉYďXf5j<:ԂJ1f<eg i#917Bg:PK*PK'WqCiO1Q72_eqamvXMd]Q {#ɛ-XmvFLؿx 龓z̧. Л$޺@J o|cIc~;g7+1㋫f@ELjlﲪ RPw?g%فhC)/R?rf o̒2u9DaxV[5)-zlgn8F?,< ˜H@`~55ڈ #Ƙ` GlZF YOCC*J˸r AL+FRaXkUf d+UZD+-So-1zp Z-BU& RLJ2)ȤY܊%ULDwnRS<)\H9 $5w4&/:,|.֢qQ5Xl^e$JW~z))KI^JRҜ^UBHbPޚ3>iTP+' p XuJ޽_KWE[|d9-y<\hdXوa c` TW2Vjk˩`{ mWg.>~[J.8Yl*KtEBWD2K TUTI ? 
n b3UWrQ0D{1^֥U:O̿-19ϧ_8_ܗF;zhG?tT,NI&~V`*s&7^ս҄?[;("߾\iT_O,r; 0 :W٠{[RoIA%4or^H>G|Ĩ32n\c6U#h;rbf|UVVO;h|b'F6&lcV M&W 8##^MT-C0< hYԢ2M+.9EyscڹNo EÈ7EJ0T$;h< /ѤQ]:)lkC_#2u" Y'0: .3qe"2Apmt`0QJfbZJϮo5ԛ=[|`^hA}+zPf0`7df .75;-悴A6P1T@}1̓.oUi8UFՍ+v9^b$66zkGG"5O' `1)O'm P}xwEeY=^Y񚜛.%}!Wm`$jlEݔfߏ}n<,fh4qqFe.f5؀BR3h|?Rָƕc zs)3c^jiTZ<^5xfȼieElz4<ل2d}H#}-QyAð}HT Rw/%`dC*}v!=n>ܨ c (#xgp7-Ci *2?(*2KT–kgP[d4卋 an˶Yc3qkč6uarLPKމ^~KRQe MZk ] l̥F,LZ#_ ~%̿R*P0_B%A _f# ՏoTDO{6r$/w=&|ɪp=f}X ni+eǒa۲ղZڭv#nU,֛U=ʿ'MP 6j9@PPY*ER& -`%o` dC,lJwwu7ondAԊЪ0A߉\1ƃ,9k鵚dqr0'yFH?@3I:6zBV{/vK^@Z\tl=κ%W'pv֧-v>6n,Ҡ-<>U1:IE7"5K@coƊZH9I);Rc1}a.E,BDˇ+[3í}J=Ȕ WZv Ln^IИf%.Ŭn^-tS"%OP@R_g&rzmB2ْ"2[lG}2#ڿXeYh;֩`^Vb*]4|aOd#=m66- o3>Þ/.مG=X=V|.{oksOףR`qvh>mkYզ 3ѯJY+7npZ_SHUxM^^3 EF ,1DbXH ?=<ϣ\}c~%0n}6 @ޫ׽u.?YOqc 1LE᰼N&[_.u_UTuw%U])8p>JǨ vo}GNK2sJVGut>,FfocCIVHux־kv݉=l k8Xٯ];wX9f ܁z}9 ;NJ"if^e`6 SLfwqċd&BSS0MCmm<~C4VSbk _u/`a@{ OS-j^"/ rOG3qh@cV!Xc|K{pr6  8&.]d(I DN4!Uf?wΘ5&0k6+evڸۦ!lWw;Nu>1bAecŖ9n7O,+DSle-YzD=6F} }nh=ki1 ún i Cs,P9915ƠvևF1\}:ȒH`&[YX#}-}Cޗwywv;ܬM8w[OA19_];5 Ys7/ݰ߯SGρo;bl2?%~5yk[r"wl{:)ceL|s OPh=<7vXz__aPpdYIgMO V޳CRu"749})=aVka׷7RVKn9m+X\J:VYs,~o[R, T@(6HLAUA'b6Aţ%I[]ԅ5<{PJ}>4u?8"HUgy ˉBe$mrȰ G}(bJU~ %KETbvE)E/4z1}_֑Z&E`BRbIe,vA5\J 5*!ƦJ~DeɈLEF?"c&2 vx%" :cKRAjVG޴d8`XTg3>=$4lMN(wl( +û ͉,s %[x<&D桕H' oOrL: ]e'I HdPIQ*AdqΠՊiOE02[fޚ+]1{Q"l~&lގ]4kcMbUrcqvNI|$'=| MV4kztҏNXFסsfyoZҠ[",/gkL}k+-(8WȒ7C搀I YÍg kxŇX_j[+IG_;]|R _y72#Kɱ{!80( VAjhU/O[L|bWCǼY:H3R-I9xǖ1=jCD|h<Kx)WF$3Gʻ0Z:߼DwK5c9D1 ?1l"2\݈w4~D/!҂@WBZin/^=oFCG7yȗdh3SݒK^Ze44CzPɻ=m2Wyd(lxY'[e"[tRsO#4Yn-х>L{B`?} *?{r=.c튌~ڗiIUL'_`vhve:h}ϓl2}k-ZB/r [ʛz[d{uYWߗO,Edt`MVo8O<_!,EqX\,_}ü8qw'\yc3=_dfp~74s}l8.WӇl#kEJ_q+vt{h vzE[Y =}\߫9Oqt$îwFo3v`ͭMKX1Fm+0ћx \?=\ݤչL^ms.7i|,'a$jfcIh rsL ߁=6%b`[\c%erA= ~ ٛrw&<؝-6Zd81O[=y0YigcsbK_ ƌݨڵ_mK=ܺНoe?\(XR2̎$4oM`+"Į\0=9??}}Wl5uuIhLr3d錩gRM'' &E^Xf %ҋ(ոWwD<١& W ]OyRXwbkeٗh #溏ε9Wg[-X>[:ΖXg:,BL;&_53(lb dEPUSFYd$u3Dn7&4j2ĹAFIޑJ4>pF37H=6Gr 6X/QV)=FS kΦE|(si[5%GңM(CR[GULf?(7x>bLU:={8(cE J>l"1xMO(WӥS!'t̮B 
tegeT6WѦ'%tƪX24Ld<)6EԘu]c[p$~8}v}8fZ1 XA&cU)9jc0ee{nבjX ,0%R(0TX* ՕbZLVA/A{dX誨mAr*4nba]`tD?#RC%Wn \0)Tʸ2kfQl0d%RA*H4A/'gp+6 PX>3L^MXzTaHHp;hB*JQBzr;.KGY&,Deu2a΋$bfQP(D*~DT~U-/>{'>ܚ7*R;}Bd˘0 >]nJ9弍ƽ w}BXɶI }壻獳\~J&^yʁG~B=7E![-CgzȖ1fkN╌%Mu HIq>șcź~>^/5I@)ӪalB8gK#qc~SCnA'4V悥X$uZeݸ$q[E~)4^M6761]['P[#Ei]+8;8֐%o_7~ol#>'aavr8~?_c+MˠQa3F&Ƌ?ɣ`^IcQ[i=MYЮ IЯ0*,܉caGn%ͣkvE=G몂Ae6DhQ;0[#Ft#yhI;r1?G>rN=YI0l /ژ4_{B.)oG3rNkN T [n;l/Qv:j4v鏩6rCld@cY}%%ꭝ%n^Jb Mh*-a0q6oSW{O(l@XLD(C;}"7WbŸ`9%E/P7\-hbhUMo@)yƙ =mjZn۱`X:X<\IqUWh0o%uҽ3̻䴟HOgˢh8ǯÎ8i5"lJDD"x';wL1`Nչ6O]o1}tg>g?Y~ R5ﯖdo1Sz6ghǪ\ >xoM-hqm/[@̫#sSʏFoo\1hAK^-Nq&4¥OZ~AO.5Xk>}K(YҤ{U߶c7}0݇#Un`Bdm/5 .`_4y =y$*=J)?ٗ~?ra9㑣s|QoP?|kjQ,V6BDpؑ-cΚJuwh?h~rG ^-}*A[#Q;ꫧD+rUz˶ \?pOf~LG1D'1%L·oʓ ?zj|?n \=tϤ`ueVV:zT\jӳ$l$;Bb:K,lege^EʼWf򷁺,]DNQ x _T0Y8 ?mPx20n?SׯBXbtTU{X}a cXeFK>Ɗq<W:K,9uHnB"v }xfZv;Zdr>mG;ʍid=.Pm\tzq];B:<Că0.`]:U`œRȏ.KH;>n{W)\4SɞֵDetXPnTyCE]HEsw֎ZK~툠3rx ⏓hkR 3-;%ŔfEp`_kf9ٔC#0ZKY;>hv,}JC6M@!Κ@*^=;kRl-䏞Mi`l\/`[;B |&8dl|!NOdU3 You3_vn}}킥?Q? ~$"@L V/vSUǿE~N_W<9>}}D=&$zꕉB6g08{9Y~tfvFs,۪!:c=l] VtQ;l6$28ʐ\ީbmMQ\L5AS-KVl JRVڇVrjzлX)wܽFc[Iwfw 5LOEwiHpJ#wHRi).>4QB&}x"{aeH#.IvW;4ʇkx o2jcŧcJbm36;)%c? ;%d9ÚRaRuPQ[&Ska% cN92pW0>`ڣ[#*T" >ilh#δGnV&t-¹Pv\NfT[,8$,ӎG[BpVơ6x5`yDk&6z-jy6b' *GЌg[ ^dk~ND\ {U3J@E6pSkec Wxo榿fnM]ĸt|:vMb:<oDRzdOT!znojߟcN|Isadjv.~'#÷noD($@⺶ rI4yh7qVt[Of n6Lwֲ )mW.23#|_Ln_Nr|Ll^mrјKjl4,ywbL3 3X(&MNu(i'ܐ7̓-sr0We Z2 <|nKM)ՙ:N>UGCN:90|CMa7[(ӘL[/O(:*u'ժ]u"h~]ua#L\Dc&`"pL<^TP;Q3̀n8@R # XNvq#1֙7ػ0Km{`ihSPv#3 #Z0&C|ZBAdh^1x[|K\5&]Poo&ş׽{&7hVszHm rf4ܤ&z]K&F+D,r1iN&pP#3 Dp lf\W)<+c!G d{ OaOWŧ=ړ?4^ɾx+78qr(^v׹:}7oӣ_/Y|"h~ٻgk?gN>y(>2'/z>9{>{O껣k}ٗo&Shgobho?["rlj깾5ѯ:刻g?m%^ ]O{=:Gjz}&ly7]eb"CJo^a9 d ne&Q}vѠD^j-)_ ݙXF5 1"s(7xz-NIc~픹O(sP! 
ľʲ0βF/_lW{USm3 #h.b66VSeFct]NZ2r;@$˰'(mz.2ug: aSV$P;K h qIqLMZ= DHɠ_Ңymꀻ?t\ l9I| ʛWƠr?fp٘_ǸɸŵѰ7>y ^sPPj5vp-n mqSG ,F-IL){1 MIx58Ov& cZ.`}L9Cu( B4If5@66u{=Ovy'~0j l,J*na&huP-_:ww5}@G7<|f1aJKx|m3K.'0Ŕ#pUt;}b}T&*NT(<㈥X/Rj ]$i̘ K#zK1aMZ<s-$tgIg: y,ƠqdJFcNMfC-_7[9}Jb/eW:LA|½vgRa9]2ixڧ #eן9]Ē42ԼU?)3tR;ըQ7#EW}`e9J 6ʸ2-:nЧK02.E%Ї(P^DC./]tuz{l=~84]Yu:0 I펿8f#^G.S6u#u_~ۦBQB_y_E`Kclx~X@v~1'+V7HySı:|R8 V;|_+S)^dUgUtٝKVE \ Io&}%qMQey߾[*0YF~~]:xo拾a_;;Mjw).3v[nʛw$}xk1>^Uwx I}hBBO op΋sxg ZB?|QM }mCnT?s,XE( JF-^)Ou}nߟ}q3>BAEu7O:i`B[(N[x3RȬqǶ5a%LUV%eЗcv2~#'n*v/z]bey%{IN=WD,$aPp$s_iře&OidHO(-+V5,d7mWAmѯׯ_?A̴1c5Ga"Žo3uv p킞Pi3$F}/%0E)Pk%^"ZoVvZ aWuv%_G+S;I/l($:6j Jز"J8#vF٭DX:$MҲ*&UmJY܄L 85#Ƭ6=5:֍؄IK _BM~i,*HM \Uυ?w`+)WAz[@sK8J0zA0So.'L-i|曕 0fӓuО3Q"HT>;;$IX=Rwdzr%U^(5Ѳ‹uzXYl!)US(ǚ`zetf^X|#k*N[MPSjN#hTgf$$,'+X} d޽㩌|!,e LZ=8c= h&B;6KP_Bzfj+2˻ބ ^BOd5|eiHGpȪR)$,T; w_7dR@+_@zmbAFg4Z٢s[ȼRdZUenETJb,eB׫}'[:p+o1+tu'r_ʫǕW+WuX_bάJ3 Eir%Jd-5jr]25ZR)I4w^+'\e3P믒ڝdր|5H|jbnd [{/|Y!Hʉo(+ ,k^x<\VB3Eo=Di5FYw7qE-e)a>">2JsմaOw=CdjNSRIgε$*2 L2otmTniF,!H<봇HVسў{fKx$9fE  q9bRcFc(yJxg?+*`L=d%Ӱ;(@sb/L·EmbS *+Gx#h1W 6:U>BpgU]X=xEsK]\lj!0Vc% 56^qJ o P5EEV ^RR\Ԍ:Я-Vz;ƠkqM҂%ՌCkbl[;|9j\_p_AZNNzN?/??($l??_>e?$١uOCTw:pRb@q8Z )H od&2s% g9rEX` H3FE@XV?5/l볕g>{STe]2 r_m[vL) MiA],eE%m ;ɋIZc aj,DCE@C@wz:糰uwuͳA>]lm?hN$E5^5FÚ."2<G*+:Hl YmͽͿ|#b `cB*-8IMB5dI6@ |3Yͻ  j(`< >OYx,9K1/F "&$6N=4 ZBMB q|>`&C9r;r̠(ۄbĹ]:zl&:{1 NeBؐzZirx\Eh,3ѝ I] 1Jť< sP8oL"Qb|0^j1h-ƀt( (ƂU vy' EL&lN: ` a^7Mt+Dq$i][t}㦗ج ZV^Y~ uN8H~ ٨:pk,+XІq\a ]΢jVi&֠u ҄+.})VBe%ZCm\׃^:_W&9N1n+tW ᧎婧·ƨ4A'!YrƼ)BbRݾ\3(} T^;aO\mRޥp MI@In.V9$z.6>ȄZM}j^7؋3T͠%+nc!\DbdqBRyo?{ i㛷Y[J"+K +ے49?QNp: g5р|BJ]nOgUgPBXd F֤JWZ(masӣ !`2ȁr$4c,(ae8ZQ" ۿw}-m [!7@m^ "JSQJ=%$&6mkN*),,D"I/fG5#dFcntaKc\>\ L) Ȓ8@qo}0h߾M)IEVV(NΚM~t]qG榎]eO.>j*6XJY3j[O%U` vx&a͕~Ce|{xL|E58 AU *69'k-HE3kfl5&dAO GK9. 
뻏VEdJ QkKJ02CQ[]5B!`]I܋=0 Εbdu 9 q9AƙӦ1oA1?b̈GFAT6 η+߫]tp1cnFr-^G.1[OoӨ j| 9A] o۶+v]5h :IEW-9mCvl%Nb%C9B 9@UBwy1[RlO%^wv MV;X='re} XyQYZ+B)V`F7XI\J69/t I;h%vXMZ4[^F1Er[lbq5.|"ڄ"8ch*y-ox(: 0B`Ӎ%߇oy\yyjCR@ ݳT9;~a3eK;I<; 6;b'FHB d[Z=F ck[SX ]4厛.֯5>i" PٕX"`rI;qՂsLoKcwvK 9bs狚C]MxŽm9f}rFGb" Chݍ.CǼ$H]c xRAߘB}$XPEbѰWjy-E2.aa&j\6^ha JbP!}dk*P"Q{}U]I!ETD#X!H8OB("(qǠX#(E"UHh`8uf^VQ A2:~݁f[eЪR$ AS=myU+8WE=Jsa+(?h|F,`̍`#@rJ-5 _XF:uABׇnYk@{/ÏȺY{6Q.rS $!c=u@J;O=_y5Ni{ eФqxL/Sy4h9}gJ;3qrq fݼiQ νfQWr~J{wkDvwOz|NݷMJ^|}7G 6.(\n?7А뤯^Z "|k.=by y Z{=0 qҤl׽sHD8}k(Ӽ5>zk>@k/NVL󎲼Ta˻F_f퟾zx/W׽fRF<4-;F [Њc^]c%\ڬrlwü-P9Wrޜ$nDbQo{ѩ3;Z 8>סּ76Ȍ@̼JEx]ԋH3l"ԝC[70NYT34MDAGW= Ty^ LA"(cRml6(KP@(|)6bg?W:3uzQMnX۰{8~!F*SuzE1N6g3qu۽O[́P6K‡j׽weԺxG5:gkˁsCT}Q-eCKy~io_m&ye\Ce쮦rT=Ȉ| )կ%WbO`tqcz Vژ2J+՚$PELEZ ,c"e'! P$)W,D!_BH ap`"-7?MdxЌnS;Ttygv;:qC*tlq~Mor)PTzؕ}}ܲtUIoWT}sttxo?~u/_G7y׻2YR͋ǣ2JgesfIZ޸)W^ sO}P|oJ'Yy&+c5r%r%أ/#ѻTT4Q)qT2Tʔ&"4NIp9~]rP2HIRÕPTҔ쭜7#6@({)_GEmb){F'j͛@BܭFa%f^r^olpY]֜TS|d S\Hm7(y뛅6jϲ&hh;atǵCJUgR}x~._H?hUO.mT_hnM~rx$nDnвC֫G2Q;/L;xg;u ރwlYOO :Dߧn&p`Ia `fk)sl ttl?ʞD)Ϥ3|Y3l$O4|HcˆƊ3vUHT. SQԻ!KB1%iΝI ޝM!, -_\E>*Tzҷ=sm2xBt`x9s})e}J,)76W<'Wǃ<,2,-~G]-OÈ͖p/DXzT0]ݠ%qO_rXM‹xsr61PٷT3mPa"WI|"P QcuŃgMۃƵ5qa.4Ӹ լ \'%L{ўhcnQI"^=X\p{zIUU%6 Fu9S5ʈ3cAt$1Ep(2C5R` H f4HEBQ?w6nj4g:=0m580ek,UMĵZ4@if84u+Nlk!?Xq_М.x0srѿ*&Ad,OM*nP{*230´9@kA@ÍD ] ɫ/j}ZVx4BFDRQĈU X$P4JbCcxU1 Ej 5R5bMJBuF[L-xʬ<Oݼ=w#n8:K(xTj_CNˋjNӥC1}̇Nl?$B򵉇p R%mGZpؿ  W*+K#nU:$`Vj,>7szG|BO\yqPaN}Mc22RER.RSdJJ8e1 ;e {O )U\űAHv$Iiqym=(Kt 79ZV 4Udr5Nʈt ռklq*;' ϺJեJ)de"/W G<>a ,I`x",%2_ŒI-UNhrx7AVX?Z['q˗ǽsݿ-Cϡ!ɩC6 ]ܜcWGNWGNWGNWGN_ՠYϗ AĔ.$A,U@$a3PӀ$-\X X{)*muKh{D<0QF]w;ez!xv,pUX"h l%l`#p. X!,h&i 4O(H. 
b橂 ďǩ,'mM6!1:&?113A.fZ+H '$J142JdaA.{DpNS"@W+gTq lqU*;[&iRhv2'ܐ]$XOμg(';go ߙ;=\s&tnO32A>8brI~6DYh'{T95Ac:ItU+2&•Y`&uuS>t$(y Z&8mbmILuKкL՚pP,}K>>|(iI<@#Ю<9)< |- $SF2s4y!J: r$x)|^;yc880qpх \89[!/=S)jfUHXr3K՘Xv`ՍCښ.Yv<{4 //܌yEog?FPEacT(daq\9aG eRkb wݾz} fpx>.,z0+T*W]擟> G|(8#c>r6w 1G}}!_#\Q5U⸓[9+x#UWN1IY,pjX N3 > Bm2RUBWFjǺު肜!4|Ӽ5,(-|*^[SA-Ye* jd](N@ JFK\H1o "䵧KBd86!Z=hOIu]*uWͽBr6=`}8#&@@Sh B3Z 0 ,3sAiu,qz"gͱu:C Z$7.{׭nu[5u {Ḳ 8kefw xbZ|II)L.OVԤ<-s$CTdA AaI^/E%2f,J2I_8`HVcWlYʪ:\o?g"x"χ{s]Q|ٗ}^ު_,C˗a˛—73hPIV:AFv؀X^a0RK! O=D(Mh eЖQF)w.R] 3QHgRQ[CJt('YMIKǵSV'aJS7nz㽎 J @y*>>c`G|!0"=-XI̓ vK/h[#(lF?p7e1[ -zn=һ-Ԕ(yuE/b+zT#GX]- Ԋɗu4h-Q2^gcESZRs|"P#>d"áeUăhӢ6Dr"MJx]C&!&Lsb@2u $xIuFkM*BY.5X3;'cr^Yroh=W$-󎖄$A"ZB.ѦE7{=<6j:6Vi3FQYA&P:Ht fÙa|K \\hDJۓ-ߍ$r[EH<#_DkoA`c!/7t}#SfTвr['[ 94n$,ȥWRTW@)4 ̒ +XQ`3v̐'mQȌ,wi44f7ՌC.{cQ4¶pDΒ7ZhwIXLL$8zK4Wڐ-],q$xdTDn]FVj&r]臜| ̠8]< <Pɵ :p&gQ A_fA#>n8@Vi䪜!ϙ=4:^EJ91z/ETlQW{~lA洕<9D , ;6XM'ѠbΦ}kuqF8;7X v7h=KђVEiQr2"-> *";U8e5@ogG㴒-7Ռm"`Qq2i2)\B3ZJsswrTwLu2E.!Sʘ.B,KJKS]֛ZCȤ8>{Vqa)ZڡԂTAumf)8otN H-'b-FQ'֪xrvxnJq*=MwmwӪ̚f`L3 R R ?lN@ެŖfdH" 5k֛bEb=o~v8Z2Srr 1^<891&y/r!VL&F)d \RX)Ɉ4'$9K(2k'U'XȲy6ROEnރt y'rڂ~m~:ǘwḊ[/|8)m9]b[D嗗+9A1t̑HCPfh4W#~8KK@(%*˅,Fiv*qiz}lޭGä75?ز[jD~KŰ\9Aig/͛QbB[~zӐg#N~)0qs_ճ5|dʟzf3NQ-=⦕Q7+%௘ K,Fve2)ہćgۇ-{S>l$>kH3WTœv2 x=C!tW&XL86b׸V&[~G1E`# g&,#1n@\=H?jsIŏ=hبGZ'UsL']VnD&2]i.\D(DM/BA1]qSx!S%濡cc}14vSN,xp,.sʁ`dh x0qW=޵=Ӱ%7WYlT'P]^P(_a}l!DS[698DY?npާO̰[~m%³'4֨|ڛ"x3 J|)wɹ| Z:K?Xj?(I2Q\Uڒ-ܟwKjѲ-f->1Fu`/`d٠-7YqX@G6T&Foee+AvnhH>瑰IuL<Z:b^ |Vյik$j7A9TVfa]"T7cH{${Q y8A> wň|4>bG^j/\~ƀa6}y{|>eߊe1|.ӻ;O]]+{ ~ፆoҨ`ĞNgi2ő)]jiZiܗPJJtˉbU~䋦#*î!#i9q9)bHvw`$ؼ \C\}X⾊w~ [0fx[=vfo_XIւl rÚH21縵,ryɷv>;X{4ۂEnb}%0?rZg1i9]ffVJ$]|tbRsq0cZ~:6\Z^ b|8zr1}>{tFsQbȥ?3+1xX?18df)Pv$xR57n~ &nH`d_Uf y2?4?&~?Z>&ݟTCCܐ*7d ٯ WwW2HbVrVc4ɃY+ٻ涍%WP-$O>XڲZ'RIvR[K0dҶ}{W^ κ]DIe zi6T za 1ҽ>9X6^ڀ07~7s`MČRP8Vprd*PI\5ܸvnm\\7.s\a a) bF DŽ2j8M8CHsn5Ks):WmFsզݍAhoLo|gbtM)riQ@4>S)"DJ kaMD7(8 K$@$d9 Z-2'+tzOesa(sdu`W@>=p}SJ:]6WĎ:Z` :[R#W!&2D*bz+وD*.4I_rIi.ZNV%T`̟p,cĐ'cP(FnR"G*Pғu`ɸ/M_cljROvwx+3op" y(0 
T"<pKginv\Xzt1ӞSɆ|Vc=yN29Fz~*\r("p q ;Nj+<-@LfGRa-LbM#ZH#PN5Ep " Q%] -^%xYBI1&QsRrPQ,87XYqJbnޞV+_x]s|%x&xœSb #Vj#w$&:JTpC&IW8R&,&&R()]sh~k^=E@ ܧ#>AεSk|0aDj Zq$"mP,(N( R! HqOrz}r=~QX~m_2-.\/SY3* V -OcS+/l?'x墾O'%CXMqa105Y1%BL4ΫBX8`{ʳ<^Tپ|4Zcm۾ 86nĠw?4T7[[\cS6h7Y i?rGғ, ib9ިit#ihM o{ il| r:8>`ut6nت.5Ɏ-l~1{޺ѻ[hw?fi$TV6aTcXکΩl-uۭjfnTjK'QuiMZj-tO\ӓ~w|Ϙ ˱:u.o3PaƭGjhL$V֚5i4*\hVM,}߂/M?v/J eGTѭ]@Ko܅Gg/p-~y:ͻ?ȇNꂛDy+4u_\YI]B w+Du4| sUk5Gr%_doi.mR{q9~3//f)ϵ/spju hf2G+o%^%nZݣlR~3px5 ]k׶]-k葓7?IQC&zij0[ %vo-[PEd-Cc͓/Fb㻸iw? B9rs[&@8%lwwɯm6^˧[7fFn"kյ?^ȼE$V}?{hw5٫2eĝC(kunj?173 [^68lf$l6-_F;i&x}ہrN X;<&a;z߆F݌5|(cNeݝzՃ[h[|` .ܧr}3^:\3Zne]^φM֐oZA9"Lqj3˺y̧W[~ǁOsAms\ov{`99hHO0+]0 yJ^BwM<ې]nəoejǕH tBPzi9P3O,њlcm)ó'E9;%Rу;j%[wLbvSOBr◒f؊mU`XJމ;\'\MR$C]9N!TFnB([񄖂BTaUJ"U"EZI"e9 f R G*$0pei)qR3"| B*Y76i*dĖH!!7}1"eqb5: C#PIDYj"abG bM9@XsmC|k|bK_Ȧsic0 Ę4f/dT3mG`/۵h@206ܩPnykkȟ!΢\d>m-,w,~nA\ñuAqH1Vo RQ>@SDj?Ov>oFI4o\5A3AgOH?Nk5r KQ84];C޵3];Cuސ7!6U8JۅQZD:T09ă A*# "Px`o9imyMW>5^s}&Lh6^)If sD>vc1M%:VG1T(6ZG22&V $&Je 32^ǥ:ߧ0{X%*VVr5֑3 /hMje!jbe*d:%$Q:8 ԞYi6b\3XO3ARp|6 #PkB,N8(%bBD%`P%Uwd*8ބ]P .m8Th [Gٻ7rW;E]xRIQ$lFo<9;Of$hVJ ԡér~<W -)1GCuͻ:vblojܖg)"ՈqN GyJjç~ÏOxKZ~aO޾S哖W%dɗ\la=,H%BΖg>_}%vHmjOF-1ޮliC5D:,?r?|8׼$h2W5_j?OGV5kΊ+$CSvRbIƴXJ\+)$vFA^)_W$Elab#6ZAZk޽ea̔:`W>X)$޷#YÙ$޷VD귖xo fV{$dNgjX}eRcBi$_U=_/&z;iK%߿ߐ8 ;%2'o=Ut}{LQ$|Uٯ!}+ɻKg*D'/gZxvPzdI~H?x.`URT7+{1'Yxپ)Oooر=1ݼ?:J__Sl!?{vÆBl9 @32S>8^SfoAB(`t/ ,;c(> ?a,y^|UlF,ןߗ*e_È=x3T":'dhEӚ;S^ xv/]V*s%4ay Lo;aefHvL|Q-(n H+X QDӰﴑҝq o"FBf*8*U,ii!CKe',Gt;k gܯ[^9As'ܸ$ڂjl9* vJ;+fͽٔTJkD~d'RbB)O|Vʂ,*H=BJE 0o*(JF")]_)˷fD.-rG  Ae"xVH}%/3;et3[\U?1o;[_ph㉶ϴUV.z2D Kßt"itQVq"[h,.e2)+!tEx E;vfN2nm1s⬌#M;"A]Kra+soKD8#^ KRV;T͞Q3y(QPX,9 xxֈj4c-i8}쵊ɤ|;NKjn%dRd,^Tk'v@99sbpHɱp3Ko֍G$dpPQJtʟ*) Rۧ+w]*S@%!^|(/o\~uA7`ަmXֈ p Fd^ :e8PUz|-ײvp+H꽼*<^;8 4vo#Xʟ=<}PUeުFb'Mbq&&6zsiښƭj\BvQ*},mw+ 2$kiS~׃i2ZU, VX:D@h*BTN D< !|~ZyuK+ئҢgʪ3u1RY$%mA`S66-߰7.ֿcH/= 4Lm}]pMпcʦz,Deu!ٟcۆJUZݟ2D9;zѵ$-H()TTlNJ]¹3pX !f ђ3㽯3P[=o`XFX,,,}HUƮ-L&RaF;. 
0nmUFj&bΔVy0WƪP0ڑ;A3ŭkn0n=M sҩr$^>5LK]Q(c 8He*yWBk ).@};Ieبl❇S:B*}m/v29[˛j:pDf1RsSΎ-%W\ ʊK_}^Z$ޑe}MhrK0>C} "I:VÒQki7x7͋~$NڻX|"$zXJrYQ;$D2pmCWEh$Ls#HvNzhMFJ=뻯4;{:}ĉyJĉyJ'>LEUXXچRX]Ӑ-,!Uє*GuGˇ5>|<9.̓2O˼ ɈH`APC4Ee&eKՅ OEˏyVůOւFvY^޼uBZ+J[Δ0S`FP@Uvj4;S}yl8 T'ϸimmoJN΅'Дh IVFR&]?Ѫ*\,ATd}ů.cgVIEU8t( ep7H GkHW.t?wMrY;&v_ u/qeY%pX-H LGP/t]-NСvQY7yY/<ѨXHeD8(**^E&a?ʞ!LoZ "HT9=vl^II _jziq{w-Hz`%:t+`P脲p=@ !;Qp|=z[cF; ZpqPæfXVpnHy>hTFyHEChK#$}u(4XɑW">a ?޾Y~6x[,##:vqS,~wїx5zל fy>L~p8@X'i^/1%7 *wSeYcIA@l#/*e}c|JK '+HX= CXvZeƖfw.D (-)B*6]U{DU+si9R4)dbշj Cg~1`mMʁ`+:Ɋ!oIO{/1ٳoҩ2GVEB|9ڋ(1%")9' $'؜"9Ya-%5ُPR: 3nFII\LJ('φ?Xnl ?~)I ɌU Ii`γ_a|?EG w箳wuUty[>=tʠ*+G!%v⒋Z [].Z' y>_ʞU bĶGfv觶Yq9sFmt*-_걗{{}skx9ϨA7+!_Т3B7cRMPd x:l;!GtR͎w't:f nt)ydXxh¼^kf 6 4[I 'hM?$F^Wae_F1s5G9=>9j L18 M0 S2 p1GJ٢r&h_"-A ZE[SiIcj7tVjQ2 f)KLu:1ܸ3M!?O]ڛ E"ғZ[,"!7(UtΕ$ے^t尌bZ]Vh*4%%ҿiН/;V|jgn3/K(LdwVz*% $=PN(J?:a{9NJd _/9cEA9լzr:3"c@`7n?yH> 3/#|MEKm_<.sgyHFօx.N믢 )}VSUl<#H`13Gc5Ri=a]e f!*:T԰+쎟kZyV,lN^]z*ˠ$b_*]}s(Dcxwׂut:TLUXf:Qqdjw\*L`;;'jl'9j)|}#@hԴ"ꒅv; yUGoGl룷mяZP7 Cѣ3s?}9?Knu2:$ejAʪn5-AyC 1Z4D*wvv{xvj" 4bsJb||r.Q(N=UtohyM ڧNrTA~6]34I;NA;Jv=ml6/fIc'{%LKj$CFx2l}dH|'q;O4GVut^|Q'z?kέ6\:=[E4Dwn^E޵9j)=S\ն+΀]^S0y3P ېMV卽8%Of:k$tV;M٨nbpI%0y6\ބE(c"_~pDLWoJ ES"ZRtP:a)R%EJ EP0Cjj7R|umҥvI^YnfXi !w C_MQي9 _Ct4LbػڒmM#P5o{>/D8vrORȶJ #{sq=ך<qE?*Q'w_gk֗3ݔ8|lQhn\s4^q"QPI u0"E 蕔[ފwlNs~g!i"c#} ʖ6N##Q,rVr]WU]0q )j~>3a\2KR];~ڙ@outlܧ務{&9AI|(t">~4/)|Ld _VMe 8 X2Aȓ߅TRXA6㙳To?昿&}=z7͓~Qۆ^>c8Q{aZ?mfBoW8|fHPϨr6C9I-bJd&ƄP{ɴyvf@;+fanMV)SqnaPyXp02\20\h9`Ct 5SQ\Ly\,|!R4y?vgbJН69!\289&TJ6cp!Q0_肗,E1V sAQ Z%!$*L W ^#ID&,f7_ i1 aRLHmr#/xa2->CEHoja抹ػQ~G&r4IQ,Y^ӳ&rG{_M)~晷8Y|B}WKweLj4s޸8FXǰs ٵvTzԸr5]7Y\7H6rݟ[=ɋѮuEMG 5L w-[s6K.r6˻r6EfLvVK s/2LQ_#Qg^QDwmqswKh{22\bp'O0#3wy}9{ 8:j⳼K0 ]\ G[Wt!=Ows#>Ap2O + !.Mb21QcXWQ'ww-^=FpTr .b6sYcnҦmcM:ښ&RV#4m:dda͎oЎǠ%dNս 9;2㐚%?/ ZHΫ']l7[ R'-=wV:Ǽ.o `\!\Q+XPQl[k̶R Q6D`p,ZN1+H5?l3'_ֽk9Tpݣ.muQYϔ4G2-Ħ +-?y-,=|͋~ 3cJ|*X 鼖 ?nysp.ˮJO~ۋQLgK5_ݼDk;m驽>X ըWZ2WWnoԱ} dz}<_#-Qٕ>[S)#TnŀJZ,-^Ǣt4@_rkVzrZџ4IkoY\mi &0T԰3;١YɉACpLg.₁.7:kCiqn~[j 
Gxi4^xFeu4V#U(JQR2HȨY#5+i 02ّ*6*REƇzLTwx k(覜Qm-#:pgI{P8[%7ʕEGtn3o#5?}ێ6uâ֝ѹdj@\y:2bψ &'[%6NiBT!яY)#FFS^C_lggO;/Y`"ݞPVXHOQpC)tA*6\N6Z-0"wkhHc+R޳9CZ)%DZVoZۣ4@V Țzع>OC{ei%̐,Cɰd> 9Ĵ씃%{֛)\4@ 蘃 ݊UMc #g5 {l=tyь(zt4S~:&FU ڼ!6/"+BsFVU e1M"J#9%!*̈́WyiDK%9,-DS#`zY8νuyw*U4pyU1ӊa6 bШpa`##/E8- Ǽ s@kr$0{5FeZג˴0Yِ0 )4rps-JNfk(=t^]X$mh ` FglYƪDu?|G\/KeեkI{eTЄX,Jg2Gh4 cIꄇɏԶV6R3zPGo)SYEYuusX4'ރ\&e`^1'%5җe0ު~%#pA" >FoݻV(OtB&Yzu<[ˊ rUh5 T[+l)] ~XQĎIZ@e))-DpL1R^B+ JJ΄+QR0a5Pj^ fy2Juq &wIpCҡ3DJy-,T\AVm+KmJȩå&8+ЌkdYj %`@!UbkbVj5mFh^r`:u{QT&yqLM0$}_s*%C˶I^9yd.雱Aq-YZe Y\Kya[B)Jq"z.K/qQ34ʜ78T _+_pjA/g*4ciՔ0`ضwC. Qr!$XJnI/P"OD5#$tX=AX[)Z\q>L28j]#bub=+!)n+bhep_ph&KF{H89!("N?IXiiC kU&)?.^PLS#VJ5Ţɐ*EpJ%j4T&*Q4L3M ⥣\j47q%Rn%<DURʬ e(q(k 5'[>JP82m)h ӔU{l4KR[T-w{~I[`606um \y}ۀvytv܈ 1«{Ta1 ={@ WC.\eZz1:zՀQKS{TO@O=DnNf@JSN/'w᫕8o<,Fb%DbhHJ\^g5ou\!]2~h'x/EZU4;Ϋ^(0l KOCwӷ&2% ~gLYĆ7be oMO$GwO󿧁,8W1yM#xWWeӯRVW˶<dm^%^1=sٻ޶r$W,0R Π陧k8Nb%>\$ ҝ8::,~U$EW\ۨ"i")$ DyTn Q)XK¹~22Ȕzxk)CZӻU睈VB&jYFW9SF<'ѳʉb$!md 9 .hWx\t"iuy)PKo ]t2Ƞ2N' Pt-xMud0 ́ϕEDP' SۿZaH[ cUl^RJ3\ldQ2k%+z4J6(YFu$t4 Bp%3R!׎hݻVF:崼+: cp*RsQhÔw>RĢ,!{&l Չ0lɸޔ0\T`!+׷+qs^m3j#p!Uj0L)NlrT8A4^J4L좂 Ŝ筢 R1o1hbIGe.+H@*ЊWZPQkT:JGJް)VM%< \JZi%_yJ"PA)WnsV,j-` wߧF#n!R2"ܣ5YC'C0)%"z˶ H`=$@.ݟK S\MeE@`0B9h,>G崔Y% $+4t:ݔ1d)Fh@'.E.BF sC{Q!TXlju/o: ;R៕ʠ AU>}4oݲjy>UJhv7 ]Mpxeei j#W7ی?7?>>>v(@hx4o^OK <;m{ 7~d[ON'trd Ƹ5{RÎ!:![Q "qY\*j뒾rr"64=P0uwe:(@_¬"8Kfs]F) sp\ҜHO.jw=j}ȩǻoR|hx?zdև>PL[<{z9YgN=\<&`]QM:7|_ZIok5%n4#eXy(.Ց)o^kx{jʚ{%Պ͜0WCГ  9߃a|tl1^W8vŽ¡EG$_{IQ.qK.wKf[beP !f_J~۷&3P#K ]5 C .?kTU1M6 rKmcI8(גa_hyUϓ4Dsy'9ޅO⺕ 3R/da/d"SOJM׸ٛy\YV|?}dgՓ/^x9229rF¾ߏ&r DJ X)JD*U8q[?  
޿fH%$/ neC' {{Ť(J=VMK7?LSh1@DYYR,hCFjCKٵjմU6mu)E۹>(Jca }` 7$3pg='_WJȎ"޻+0Sާ[QdDj =;?( ogсG;00zSFdӭtmA0*Pְ,bKfnS6hv'4?:5:'( anp\Iճ|ۙrM!2_ |'(SA̷6fo\[m>I[&SqmԆ*|Ik%YeսO>}>봿М*F:܇t>9u Pi6~vrm@H3țn!T0#ﵗޛJ| eJ=sJ\ m̕Gsq뙭vsyZԎa[iKz2\Jw )a^ru(Վm΃~yy01qUNo殣i D** V)iHP!_mhciuS3fzQejv 1"0j [.;4!DO8GW`@CIߥ  C 8s+H ǓRׄPJyY+:Hj95t.kPgxS{ymF Ps ]D %99}9긠# xv:gu$'L, (ynni[ n:et_Ks)ӠT_ 9E7v?Γ]#':"S_||rq< 8I??\eޟsy?7t 1^2JǼģ&x\N a`@ L\+T ^]G+KB x?`||ř!\ikh5y1k>|֍oP @QKWp 2sOR. -8SZq6np1$ZQq[U2)3T)pe:@zRȴǘc)f\l- z RܨbR +\tgE|!H Q3j:n؉Ƃ2&CD ZޤJ晹Vą\32 7@vNh6 (Co@<1MWRE*p: MmCj ? m@ J#`dh";ݗ;9/\)<8B wL4’"`bqmC@mVɧlYe;aA%igEs+m ڗ!H77iDܒCI)M44yj9uV]%p-4\}u}cK5y)+2sAJ>$rDL0>Y@S^`pWiqNcc7L_P=Ajפ?\ZMкuZj 5jg=C8s⌮'+.VpWMi&pH5 !)*LTf(qscfd "j3d&, \0MZ,rdqu΋L4pa%R3yfܿt#pe)JԶVq2sТl&'wD%Ja GPF+PMG2sA,Cq$6+`ߵп`ܛ)$5JiulQ$I(d TqȢBRizkR1dMX*N1+L Lb<0Xf<fq#tjM*-7Sh:qsT no~-vÇٽf+.(yH%+|jrۺ-_k$zt6Ì3K[x;9a^'&"o%lpH[1"t5cTtFeYRr")a()쁘XMqpo:V`$Qh=:Ur]~+g-Z¦֣rZh4FY.c+{}pD-ݱZN 4rU{RCU{L]=4|+! V`9"-ꡉXPZa/eJ*lC W&&tpgӷt[KQaZGmKHmkߧT֪~>l{)FeN7n8SJU.42-g4g |eWUJJQEwrڶZ Veۯn-mgiV|~])9* bH1$H FQl=0 W^Cͧ{j-##RZ+VT0jUfn5yi(KD<+|iPnh?lG@ rSս%T04U%m ~Zͫ#l13r?LKo͝[$H۹̝y&!zyKҝ/U?OkR ~Nzxa-{%>j։ctO_>h'ȋq8 }7FX7|c+&D00a3s&liN Wabª](gLM)spuՆ$I{UNˇIJ0x:>ϛ { RɆLڌu{!OĉuKapvh7p`>%/8tކ/4Cg|y`glol':TۀvclƈJ`9KY@E38E @Z(,qy$#%ru7^XKs~ڵFjx44V,A0_,-UPsJ^srreZ:yʈ̥TJǠnR2Y-r,%JNR[AgqƿNv.r~ _nɼ&:9^gwO'aGn:P7㯳7ݫQ5m+莎^{(mjҦF[[a+Ochq$n!>VsW/- uBNp5- vO3o3y]Õem݉ ~GY#%wHP>4 'H{x`(V /#tji:kǎ,P(7L$E1ZU"Y* ShPK9ŌJ-Bq)B|[t7% )8.4!vVkƕl(H@;}ʴ}1^&l`CY?9Jxe6N\J q?Jߧӏ9Lzvx1ԐRI t@J,b#BbK"B]&.7z2x5a&jZ"-+.ykqfENptJ-dgQVI3͎#PT"&6ٲQ.,2bEUS݈i(ʁ""(RȖRCuKQ`D j@[3 e嬆W¶ UrZO]UJHj=iQ K;nn葘8CkH\)kc={W Uo:2Ggivi'kXy>}2DZJoJ GF,>Di%C3rw{Rq></]kkYmtg|{xS7ˤt4G~\?E{tTb|h##gʫ+9츗T_>zJh)OmdL;7=lJ!b6f|5+(ī~.jQg&TPvϩPX1HkF̛rT΄ /NJܤwqo;$GiOEi1vGSEȬu&Qr#qiЏ#Iw£׾ѴT0yy-5`BReFLԦ7 yH&:FҜ 16^0#e.3d~ GP2 F}Sq<廖 iMzcRi8=Kfϝ+g-#ЊM괌H7'zUk+K*[斕j1RA!J;l|8W0˞VmӖ's1'{d\]+OO ¼>YeLIP$10`!+լܲ1[Z@i&8dI[p_d61_Q:d#Ee+A O>z -MZ|S}妉NY.;[ uݫn%s1_ A\o 
f-d-Vl֙D`$}K$7H!p*rFiW}4ly#kmsCE!FNG67!1_kO?l22api6L|5! /ށαFڡԭrIw⹾벲Iٽu`R %"RDyDhBQJ(j_5V&'SXɈ`CK+ٵF1y 4RW QkU~S;g;KkwlC){vzY.Q*%\wI҉%h7|c|@*XF@WS_&)roɞX:*ox7r &3eY/b.X}.ߤ#<Er̓r&\i\_Z-op`‡3Zh#81 nC@~&_+{+[PRVv 7N&CIz[8':)R_'sĒSzOD}s*%5Wsp&-m-eerFq?4/.i`ZOI$ĶZ1\1bK*ȫN@P_A+Bg듍ؚHb[qzTYhnO4j* R*ӴcoM&u/h? !c0{x7ǁS[3 EH~o` ԁ6)"ʤO0S*\ U$"Q&ލid >WdnF^^ 7K3B{Hně:~Y_R}y1&] JTj8\ Ӏ8F39&)L潦&ˡ$־/;\zh]0I_N,N5;srmX1?:laErl˚, BBJQ } y `jDEMzξSVi=(}ӆyikuu?@Q\nvT<E} 6* -f\cW5u""/M̍ Nm%dQ.;N \𦕄 1Qdek|ziӔMt2&YZ:x$fxl65j'6g5g`"bڶ^:IRQBJ}"yM w\Ȝ" f"X$Y1Z$VVtJ$H1bP&h8wmm|9!=,,o6$ ng%J )xzHQCr\HZ YRUuuUuWZ/VNÝҿ]ag r֥-rbNϳC2<Vm~ ,Q+-9١A,tF-g/0Pkݯ"Op>;ZSD]ŨcUٷrdqGrn$;Xk*ϻ%G O&`NxʳT6ERM]׻ Si鳧V L#yW*bZö4ze!"1F-oĜPq^RVY-8S[󤊬(UZ|:6 tsd淡YG8ӯ6}(hKQO)Ĝ ZP,y\TuYL SXU}~?^/2 ^' gx$6aB[P'X5~\<\;}EpT8HInhNuxM6P 3V*Ǹw3-1Xj@L!EBuvO*aV(8..I^,#JE u)r+EWJ$r %Ii/KȘSlM"b+891ryƼ6 T}-/ijG\Q`0@a4M?9`?^RYMĺIX7i& u/H' 2k "%$Ta)2KÄ?o4{s+TTRhO)6zEN WU𔠼A"q[X5geaup?hF½1bMt%c8UDaU0tKbSäMTM<>cVة"-q}7o^8*,pa ǟ=}ESd,?K^?\0Mfӏ~yxU "زlGzE/oL?ގƟ)z]ܚ%= >D( s J25paV-]jUlx,Sx,,Z<{! =)0S2Ԡϑ2]] x~՝y(CTJPk?M=>Lm?A6kئOatO=jX9i/0{ ]lp6 nM;Q6UUǛL+ _\ hRMws)%ax{o? ߛ?Q}+0OJLf^f^!dlz}Qԋs&yM'1<;FܭX{mdn{lj[8\?~H?:jWۦޯ>Gnb4&Fnb4&ʺ0)@3a <>IĂr8pX`cZ`ء 4j8u /{k@ F86G-Ghܻw >kLGP-/,=2//dΙx!OGƤcyvξ<"Kʰ&'ax/>}uZW4=~'i81=u=O3}?O&0>0;٭!Ym8fo%kr@.PmtnQm+@J@r<}|dD42'#PJ'[a=>(fWґz1JN>Җ<9) ܜ0f&XeW"crVSCB" U>Y:g4~rc;SL4;?Sp?'DTsۨSϷ$Tc}JWlmYX,iV. . 
g@+'61;rZ`SjlR6$Xm&a+QBV>⬐2FXe 9M**WՌlִr]m\U"qDqo~ہy̴}vfk +dђd!Zj|;R Iv\ϐcӂC V%h3JWOFz */sY}p#$"IcӉ$ |Z&L3X?nV|083J׸L kF!M2h:h Z3d4}:+j 'tZPr$IXhψz@[- BP6ƒޥ< :"䌬0~/ 0)4k_ 4Si{܂c]s[X$im\-CŤuC#p[(e17B38F %-T覹QbTkv)1=GRt:Jhk ڹQ\+cO E݋̢ReoݖVbJш{zG/WtΨW[iQ;cEObBsHzd-ȹ?ؤq\fجhƽ;]>Ww|2Q(a2YmLYi8R5aCC,LKy\,\FvT_scn$eĭZ!6.ot<gwnͼph% U h;* i7&Er DO[AAm9-E9u8j?5bE6|{2!bZ.3H5ȵ~,YlFi^@)\FG+?LŌ!tAJcѬR7jQ~ ɓ># jL\7TT ; cxmjt5^ؤv$J-jO:T T*mK غđժk'Oc~,UTO%@˛KoNILΈ|A}MOh?m2߬(&Fk .QOm.7wnQE:&~`0NOP0o0Cbу )=+QɈlQ|(Tji\[eN#{Ed:cQHrn( ʍ \mhm88Un|,751tn~Lc˽0۸39x!/"8^l%hdm]XŠ3ɡZ sЮ:sj~)|.nl4džj<`sG7Q (skq0yy[!h/@{Gs4DX|)'p0w<&Sm#S6NÙ~NV?S98}VqXwP3էdD+˓g;&T?SVjFf2O'{)ɚ*!L/r9s ˯1Bb}Yqv9j Q/ ė3u[C9!Q[6M(Ks>g>f?pg+ϻr NL$)i1Mn-{$&BE0in R]Bq\Eᶋ\nt^UՕF(^ِ> ',IH/+g#oD\kL$ (1O] *Wn/lpa/Hp4vzaߦ;GWip#]س+N+]_ V ?T QiMKJFdM97w= }gk)RkQ~YȎ@y{PH֙/>5 ٝl5n$ZS RI+ц'ѩh;2nbhQ_*DS z4G i. ?RDrԸL_D31(ֆGB;vJ1yv\"o KdʦK>y8n)퓻P u&c vz D՝W ړ>M2HҮ.|t֥yOVv|#Mp\)8MOF\ \K):GO MOB5e*0qo=l4 Y ?gِQ3v")X.m^;݉1\Xy3Z[m 8N`bڲHr!]F$!SM㢝 y&\r IP$EfbWCOR,\6{Gz4Ҋv(G!7:]VcԑAp];)onfءO!`7T!,}1HkZK/ ]7!\B-E@&M uz(Hx#o;~/GɘU?ڊz,s,ʣ,5*5}>dTiKγ+'k:7QKrc/slVk`ץZ4yR{2w $+ }rCkenӂ s;_ ҎOS>TY p&$  !&ǖ ,<qX$x R-c$SIH0LXBxf&I&gG F׺޳o/;`\O |7+f=?K@b?g} UQۑt?\XRvaCdܥP%{no&!vȻc86gG*G&=(7X9į y"ZIDGj(! 
[U NgZu*VyEg:$䑋hwv[Z'=xƝkT|"'jx8b; FJǐK(cMxL=O"eV` u-_1fHTx5% [o^^]fҏ-a$RdDzccuVdvGYώ+#;g^qhow"&aon&-~V 䝣s+?T >;[iݜbDoCsۯi/F@4 OxiߑL'`zrtu_L"ȴO[ڄw%0z4A 8>~'W/q~7_ ~}?\}o"ٯ_Ϸk}8~`™W+xᅦu_\Ń0l6] r_?q3] qu>֟Mnq.B\7Iz}d2tg_7db:}gx2N]AXbˤr{` [\5>vaN/i()-SܻB/'K6W+ɫc9BS5ښ]q ^Nr wjΙNH|{tzit*!q .:i%W7s}{&׃aA3P}> k |` T fy(V _8\ fѯ' 0dc<{5|=e- O/oFrla/AMg.SGϯF_MaR3f<'cdRo;2SCd'ףIڏ3㫩>5ݟOlS*m@mEt\P[}&8=TƼ]: m;VTtubi#Ljdc)#fTi\DGFFqP`-hSބ٭Uirw anU7($J @qoz9Ѫ(eGѺ pyc/+lKj.![,-}%dlK>JP IV*!@biLP ޷FsːծR+N{+<a͕<}$(`'Edu<>-R7Z|>R(BKoZk;5DlT<Ӑ+%~65 .ʩhlͨ.:4R4+_~,O2܂VTްI7ɇOWnw%V6\z^Y 5.Z.#p%8)k@Jפ 8hEh⾓{P=}h .ä=cLleJ֙=0f;̼a1 ̼មhט7rmquv->qtt׻g+mÿ"K)^Aew9 V8ZXlQTV`*0؊Uעcm1`:w>r-K (f<;~xVƢ8KNmC+t s+xE,/˝`̧ygGTm@ %n:ssNL\/yp^"^xރ FJ,1%N q1 $zCEF8@(b0#'b,,3 SSWACl% <6m=VÙwcΑBx F0L)2D'08iI9 ePm{[r,EBL"lL JF1N(e>Qcb[(_+}W-4$$exקLpȴe@*yw TTp0, Nb?ԉBE80Md_NHSHlj_sN5fZ PmUB֭Q(fụXxoaSE^TcP,8h$<:xO1Lj,80G߿_U+EX;XfY`D5 nU6dApb>gL}y)].t[I4)k4Пd< U6 t7 6.0*F*WA΅K`#7'4 RcyW U=Քhvսz`L7FB L99$ r˜kP|9(P?/)BMBp7:.B*Rp[^e$O(XU tU"JHYCPtEstLCi^f#Psp9 qcHJ*Jp0hNaQVp `DLKIbJA F Zf)fQvtP&"H1XzÉMwmMnpv72U󐲝SM֩xpr:V"HHD (劭l|h4#, LL}qy%hQC蕊^vy5@ XQ܎t&zߪ%nǵ[W-MXLQ F*CB_+ ߌ#8wNm%=Kr:*U)WgҖ2yəYbdW*Z5̃!^7* a^MEvCMո%_rXlH'aRW-ʜ&_Oʊrﴺd-m~3 i|"LjҒ:-}wVc<]y.hOk5S3ѝZ^8lŋ iRX/[`}\YXQvQR?bJy z@$NnWNυ6IAi(ph%ݨBj7TϺQ;nAGwd Pr)8 :: OHر{g8=5<|>m*_k-wrzwN"~S:*U/ d9L$[G-{rk^Oo)zHVsd2>$,%4rx3?Ƴi%r&"Gqi(MugEa~UyBp(u5[N4'"  "ȘIs+潟!Z@c5AԐtq5c8nijGX-1Ġ*RH<@Ĺ*@2Ƃ뵝' +S(Gvhѝn5ODqeE5mZL[|aKNyl3=Vn(E,4]QU,ku*@ MLLc& gD0Ƙ䱐*!"@)LEr`?ŠlMm0j d_([?4)[ b-j򎌆kjM[+v"RMHLԖWIs199QH=>;?`R2,>HRFDi:cMw314KR j)8-_WY)ȏGmcDָ{yO?%!8Av?mP7RP Kvy+AJCL-oyZr67䮐p葘nB2Nr"L@Y!MhwޜT{Ecr߽՚8bU.6n "$ K:倜]|F /˴/gg%о`^ HLx L/LAXKw 5׎'AAjEWaO0( ^r8P _K7 d7kR>5q (l?t Z?Tc}T+[rjȫo&d'=Y>>Y;ay82B Y(GL@LAΉҌL#58 EUPGyW ͨ$*̧6V݌Q#2գIiIW-O♣lk~xw2$sFIf5QM~.omA+Ax|F CD썓 N>ybw3B{x?ߨ7dh{I"#s $d@{-9 0&_O3ۼy8K(b`+zy3Ū^0գ]hK@wڲ 3u j+'0Y^0Gd=06&2*b"2(Ba9V-F+)+ILҟG]$2HQPDT*H;Rrd1X@zfk=\t?ꠈ!pq`H'a-|.},%w@s0 4IW#`$cĤ)[ . 
@l}Xjb|`$,Ia$j`C$b38LcHBI!'T)DkW2HM 91jW`s]j6*Խ{ uZ0 [)womYn22 i)LZEjӛXB.q-vMC&Kv:SA\ԒY1-t@{U8-Z{)GD9oB̜}"1:q6j@oҰSfl<%WƧ"I>4졠FTjS&ѵ|-,VR 3HU4-Jx :U|UQ(j>Zf }8ʋR.ٸ/IYOlt;ꛂc>͖<<Ғz^ulηfFnAglHo5!t:mZ`xC 76mpP~UyvԿSqi=N*˾,䅛h/qƻa\{?2IyBiTa%YRA1S,HP WUPS0<ߎur9}n=zI\Ks? b\$)\n~ЅJɗz\rNTRj᫫GcYui1}oKˆ? z:,JYOz 6h1F?xOUG~xͦwc-F3o*>y^]'º_!& "Cˆ9ooZ!$ -_y@mYbSR'K!B+ w@߃ q8^DA"p@FVu=Z :I3HvAys 'T5ǰF 0muO[Xh FlڰSs6b+mݣ}jAMuW{W~1\5(OmM|zaښX@!dPNծwGw +gv'mwOOAH眃""8OЭ(`NQoP sރy`0,i$|X+#Ly5"-8=,됣pO#,YbJoz6G_k7ճ^R@) %b#x`! -,d"@tFu&l#)\]BJ('8ł׹C+H1K I!@4O&y%DICjQbr;^+<v#ZN>Ctd %F#%<#xEx& aoSqz%CrR#-x3ۼ6);1Cb]vwbHO%t:2:^*x1APzwk.:V|_݅t}9uo*ܟêk|gbη(6UR#eneBR76Z7a_"g䍯kV˺v(j<{nlǥlDL[u W1B1$Cض js!DfVx%x)[M|!m*kqH+09'A)@ `9F dgh3υ }7дz4-t[F:`8*_~mhMuy93wtW9EJ]ۗI 7HC:zt|5/,?!1qn6\^;^_K"KqV *N~m1Gw}PXЧt>P^>"H)x<<$H%2}z`z~{ntµމ|EiiAh rzF"db6Kvgh$zn9rMoF_s6OJ0ߠ4KȜ6d&*^ÛXimZ(z-[g\"9dprJ3%E+R7[-Z?=֫kn4G0m פ^畱w/ZXB^?ԪZBֻ\ v4^E?Pk\IizՂf'_҈npsC?9G%IFP~iObс |}J  j䑡迢൶ZK\Զp+Ύ swT@RĂvVtD]o)yʬU|)=MnDn<^X37*ܙ,o3N#sy񦃲BAә5҉؀$2tA 晡|8QKnYŗq}Ռ7)F-Q3ꓚk; s5 x%×!v-i*2iqNC1jr{˨3z>t 0^v<,m=VÒ2Q2R\mKrf,R$`[ߐ<="E+f/K5v uX#lݜC0p˄ŝ94걔"M_Na-@]<>`J]#:D(='vfٰ Uq .k/V,B:X_}R**y*iQ= Yn(leFH&&Ai^ּ̓FĐbzf7Uh8-i_w=h]JRceuڠgN:&_R21ED!dTf : EJ9H ֣%86A[=פ"-~$8w)"X!S૛O7 ˏ<`j--φ!0Τ՟4ۍӓqzR6NOIZ-M:%7Bq+W\  6J"n(4l>?Mz7xuE,G!z7(^5Y8Hӧ6m2}`)=pi0A?[Sݾ~斜ܒ[rRrKNsKaFPH!9bn"B\d2,Bj,=Z?>~t}͜O]d |4IsR?'}hK'^(!IF3,$o&j$ %O.)!QћzMn/dd̂R:932sIc,'?ŝx1>J3qzxE+2p"y7*Z00{9C(F [c,1ã:i=}8OV*yg3uN>At:\c=m#Wa)s6|&&ZKΤ(0lp9)C|;|V$$'g f$*ĹHSJ2Zc֘ KftXӦjv;BXIxfMzp ]0y%5$d.M[̦ /Fq lYr! 
n|yA<@9L?JuO%$)قq!3פ!c @L;ОCBZ SfJEA-2RuK39^Q'ӕ˓R7 L7?+a8-zg<-]πw?9s?n\`ш/.%c-Z~KnJ^,3˜#AkۧJTM)M.b0zZ[#EN(j܉}yφ4)nu8;&&arqU\Y%sXi &|\8Q17*lshl{v*V<5PɳۥW1D~g!{Px:uy}hPPeV1dBv9Db'^r[*$lE3Uo-˾P?8/wAěl ZZ7jͷu-M n"* !dhGJ۝(nQ[ v:"H#xϽ C3)⻔_S/{)=bߋ 1J:JC,g$LgbƤ1(|ʺ,ui!ͷ}0sdydFhq 0@Ʋȃ yp9R \7݇,Md:;&\ۜS)9JN11UJ3*:F :m_ {"FpW5S+J(<(Ya%KDhh:B@!k#8n3C ,FdoAY"RI&9KdU{Y7d Pe{TrZӽS'hˤ|!cZLj&@2K 9h+Rie@XI--L,2.i9v;&޵\"Km_sE.v -!9c+%W'YgDɢ.H"aHm"3sns.-ƭ<Ƅ-f(Fgf,D(HlvYkhy am&hcbp4ΫJQsLjTc#DV0R3YcA.*#U#x*ȕ޹./>HE,xu훷dBPEsjV}[jefڷ[-|sK6YٷTj8OL1w\}e7@P{6cM5Q}+|FG7 #Y ~uk7;ƕ>"iJpQ* LƙO"L6˥獳la6qhW8Fܹqij?a|,l}&iAmZ1DFʛ:5Kj1Wm)s[2$)-f 85Aۗ.շXtәqGIV锯˸_0G1?ں26ɫ޿OD&EHg|vskLdXǯޚWx`YۼbQ6W?u>%lJ0vA8ߔYb:kx1ŕkQ6]7W+' ;Ěu_f:#qX& Žw^NCĒ3WMd^_)/!*SJ*aX[,@f;͕!'ZE<\ 9ڀ$(ŘV*!kT 5éF{# "$iFC3d~xE&GDWd̟߼9FryNU>.6efdKaF?3ebmhYZ[Д'>7yNwz= !JQy7$+:;+YxPn5Ǔi=_iKv79ei{4\ 逗STڇ@7K]}UMqCƫ`^*PuRw-] 5}-E֢F̯`қžN!cG!\L&NHRadG x#̾=MY%J7G4tAY]N}M >Gߛ wNCϳ\wꈌi|qP*AHe'T~^;UJl aDk\xE1T%SNh^/gEML4hby-Nԅhr͟ޔsHo?/ LV%IU' UKk<}xq5V[ HjUAHU:MS]y5){E@=g8 2x%6ȵz^M:K?)̚ы`Q# +ZAHÖЩd[EYݖ/UP$Pm"SuMw,6&+BM/V`&w݃3ϸ|F VNrqSzMXdO : 5$*ƺ Uٝـ+ETŀwZ$ .lӗLS`Br m}ּ-ExA0(@ RH06_mLg:~CFs.o)%Cwa .ﬣQ|rt)"gd_}O67Wu il6C3 )E(D0kfm+{=oMyK;/xF(yIVV!B]lufTРT҆TuVôHq[2_F)왽/b ɪT-G&mzrt?Kp KBRcR-wPc?ש(L3qNqanFRj2 )2iVNN7-%Scp1 u0{71c{U}= (cC0 ( bE`F(v&lC5kQĵ 15R: B-4֟ /^xtE㝁:,Y ˜#dDaM !f#jJ|V k8g*>(5]5Pd|$Cb_irdto/~w/Da"Z#V-Yq?O: @j..(9mH&t4_7'T_^؋'WyLzko]%^^!PSTJiG2 5՘$ JZ3,ARx3n(@25(i:UEد_?&s{y=ʹ6a,k>7GH tY⬝}LR-DZߖ('1SL,Fȸ(ǒ*VQEϐN+MtȃY \A"FqE+12TT Nvh%`` KHB$t(R0RJUAj!JGf +ts yԎ8Y@I.6̳ѡ)`Y"aDv@q0zg3F`hNh6%T%X;c)9GțQ "A!R qLʡ'9$,uVf  # IBHd @VvK*E' hd85Y2X [䔐CdSjAX*4Gv>G鶁F[=VjiJ,u%V-o\ 1lCJ`Z.DKDl4d X--5R-nPxEXzx'ﷳpxj$ϓ3.^_VwITo7ϯR$ G ,< "DyW "&ْ狣+zZLg_/0zO>o?=p7 J98ZW3}7eQI%4˹xX!Pe8l  ΧE :r *U8HD1JՊKO ЈA t&8 0Tj"$ `G! 
* D).WQ ~̌EH *9v"J:2ZrqHː)(T _ jgSS3 1_ˬ?!J^*c5믘IHH(f ,:e- S3VYܗGjuLմ{?'q*;}4y&T}B`!+mロ*  @.qN~|ovrMJrM4eO\S &xe%Fմ\[qkwZR*10?J!ʉs'Fb!y:g;\PJH#p7k&o_GSkPy7I6JKsF!]Sh,2`f*WK pe J~D-'2yC\LK.z%}ms+aG> FM5gJG8^,n5ԄGf#w+HDu%eq:r,r`8:`T@*tH z(AכֿL|?҃G'b0Pxp(pv.0u |\x%Q(nAN) ˜ 0CЁqɨMsa 4(b*qH)uƟf~Iپ=~Й/1|]g {_k`ZN9&)dQ3m=_ur{eCsZ NA,hbePT咤S10Kc,<Bzە=tBp&ufe=7at2q^ʖKQ!mzJsbNDTb­C*OS-J~23yE`n| JmLKFN#;+ORdzVWcq3EkZ鬀B6XQ]xH 9RV'\Sb?4,oi;r8&b TLq`&;^Yl<k ؀#B7Ǒ q`g,0RSF ;` }m*8 '4n+#舽 75>b^OzY8:O}q{$b'%riCS*\ɱbbh41V1%eʹNL_k8_ywcϸX& : ʵ2q;IwR:(Uy4a Heu("=O+ ~וr$E#/j݅z.j]]\8Gh[&H*~[ڌ.< b򁲆JKnuQӹI]d]H3N+σL: ]( DA U3uvyJVg_R\boE9Q֯j't.G`WVhNڲ2BzSnhV\Tnhv!,4T)y DNE>Z\L v{ )7(z\˫b邳b{wK>|8t'x>{~>XW$}'ї?۬û =mG |dz|U90O%nEx jl sw.hSd=kTӬS6XQU80PnYKW[%mӳ&"vIjCQ^9hW(_:v: "jwK]iOfDbY)vf?nn^ : i9\]+5εOβ!?wp4_<΢B2tr&i78\o˽x!ޫzfojI8;@KaC1B3 "Tqpb+D1'Hq>r߀?žr 'u7ٶ@YQ1_3yt"ߵɃ/)wxVPI,8ʠRh}odVMTfǓSKe٦>yA!χ!-R5hXA)xl30u=)_ֵ}î&]}yA&'#9T9iVv?ܫp%Q9xl56մ[)"Hgz- ,NKǶέ<6K+weP_JΖ`E<H0P$O)d.wjaJȒ$%s8jBN$8H! $s O&=AYLQ!aChP%I9 w5ϝp9DH'D `Ud=˩Y4"S/&"zY=o8V8 `4 R&eKL2ηy%Ao`*yJkBQ<*}wB ;L \%keK --g.o~&Z  ҋ4,%mWw>b ERӢBܽtxʋ)4OFL3" Nˌg!BTɐkBMSAq9 Yw'I<`t±@x0RSLۣkLD E^'rJĊxiJUi Cb93 Lܹ8 >| ?>@O*fDv^^~4bmh>^פRM=.^i8RB@ԗ犜E/Tr< FUH ^|V^W̆nAHlsB+|9keP%D1ʣ VjAݪ9,D NX!2S8_OjlMzAUH+$1uhvKŠ褎qv;];֭SJj*$䕋hLpnIVAeU(ٝ}2I!A'\!Ah_J"F]%h ?dnRq \e+i&yGS\l.P y xa^Wn@ fA{JIkaf6<>鍳W)ߛ~ hٹ@-诿3ؤR?6n^rs(2o^ Ěg6?ЬY+ZE!nU;OB–1ޠa2l+ҜpY<}{cϿN٠T5Q2.Y:#( >R`3%ά@H G2hޝ75Iލ{&zLonn F_k!yt7y a/wÜ:3̟Dop’oOs? qN pDB&q np冀B쵆بd_,&x_i b7Fs,Hư6E|>_Y.. 
lR)wqw.XM^=%W巊 0^pk4^4s_TkuHcy1@\T +֯H@+ًE(dW%B2;YՖ%2QX"(S54L%U+mI!EIR5bBRQ]y,+)^|zZo\{yB=Hn#;jD /B*NꮺŀCjhS[bH}} p-\s`j8vQhy!$L  dk~5𽰁y r]8 R, 83p9奸;7`~'y/o͖1Ε=M&Wbv8"̑jv7Wɍ)'$MgW/#`cO>Q&42bfCP_NQOd.#- ziv?w| #zlpL;3K^^ᯮ;Wzy{O?2ūw?kRd {lO(V e3:֭[5ӆGShإHT(a[(7m;܌cَ` Xs%.5mUn26?]j&8C\ĚfK`,UWz ,5(5  t gb"q2&'D d|P}іy[]~7_Ϯ?y}K*$>}Yf78DL2#,eؔjWoc7W ZLvHk1) CTJj`Ae 2)9W G];*R9 Ljp%F`ːU(VH^o2'HBB %h W"WEi3۫8&+չ*:iqBYÜk8f|U.n;~tӥyDu01X<@[ X\sm^!!qPVph Ô:[wrt-HASM{.R맷e) Lv՘㳾PpRhXc+Kg85\3氕BⰫz&rR.AOjK?c I49bf:4g:#BL@3su9sDb -: #C)IXS@`!BW)eк̑XO3 /uv!^/Qyؿu|t>Om`ξ߾{-)<ݿ}X]/~CɅWze@b7U؞GLp?z۫-uxOme3sn-0Ż_wn1לs6 :y|Ws7Tp8Ip u/O TqC m`k>n-ݪW.+2%KʔF IXBʃEՅGb2eKk+LƈcVymG Bg Ќx-%Z8Z@;&ia,}×#, F<.#ļgHԐ"I!,VK' -L \,,gI1B)UZ!G >_pR6qn]ϩ<;1=؉l^|V^W̆Tf6$1Vbk -ԅnjWP T_}c s+6X/kbwIܼ´mAf{xrOSv!j<~?x93$BD1(V~,ؑϸ5SESI]\@V_ 0 YIXHXevOڌM~ƒywwy$;5G{F< &Qf4I!2Dj%v,JRьqQU1J0to/הOe'SϬ$#@L;3ⵓ @=/WYTfջA %d# 0{$BC k(. I!FQf)H`VPJ¸gK < ) 2t!~eK#eAmh ST OSFd sNbFXe@pzgDP^@d=2ƎXld¼łb 9 *ٻFn%WyY,ζ;̾d oq֖=LoQԭf/& 8ZdWźXz'ku#W]9A5f6X"(P~wc{KzT_~/l]X3a PHɶL +<S$=hQ>|v"B=e\Õv8Jt@^Wg$rloڋ+*s%Ӂ;_hNd!$u KJzIfmU 176i5LGR*K\_`AE_)OPmUcqҊw~V8a.+l89X4Fh@!1Vx#w_Kq:P+wqAW2[{=;hQ_PKV1 l:bA  N S)QG:ӡ*'we?[EcQO4lFOuXRZĘz@FOCS$H)48r YKajp2W7vcbio g1tFUi_kj" %ca<@7F9N;JC:!,QW'R#s:mIID h)uTJ$%`E}tB#؀FSlRd8?m5 ܖB'UEVy@߆KcO!["QaTR dGwAJv5JCg$Wp83SO3Pt-KyõB{o8;#_I|"R-K $>W]4DU Nzk,Htm5EF ;+F)!ࠇuu6& 2(!cE5Vc]C/憥mfĠEn#x.d B!BK&5@-3I ";NB$3` PVZI-N[7(Rbw/ZuqR:A{ߢ}Y% %4@ )/_qcG-A :yLJKTq)Z9[*TE -{'AӓGz_DgV%/.RC1)&g(*ߌ, -NJÐ>mSjv:ANQ%Cq5O?,ׄ# /%S"; \W iα+~Xkcp"6U CLކœE"d B7p4D.RSvmaEXR[:()z<_gr 2U0#JIJӮy IQ5[7ad䮏 7&ߧ#-?MEz {?~Ǵup[|@lӣi\2/%='|.Ix{wzzy d‰X}DŸpmo6-E c+a>3Z0]A_v%轧xP 0[A| # yӆfQۉrGiTTRWn*ﳰ:M_vON ѣ/>a!@Y PoHfb'/Q2oӥF 5dۍɻF( {j!!qڽ]: rkhN +_Ce2Q.vBOHYΩgI$B3eʯRB tq/C=x9^_{h&IЉ*-׶*0ggTn(ƍ4KPe? e 6r;'Q-2fQ:=nHHk-Dڤ 04Nȁe֦Y =hު3^FH3^oeFg/5[jN0"N8+mL uTzН5OͭZKppQ}=W/t T)Ƶ . e,~_j'8Tl4}8hj yonH3)x_OᄃR ^T;R !rS/cy1*k}M"jiA%jh,>H7Sa7y~!] 
*-ס06Ђ{R s4)JO4>Ûd2h5]p[{ǛO-)yG#&CZT/  R~83A.87ӗuʅ*c()  h/P*6Z֙1QZةD&oǤ_ē=I$X/F!v,BIe 4E;MUx3~S0{,4lgM(9)keY<9qs `etb>s.yKQ-Y9q oږ7EЅdCR[_Ǜ%h9ص웅5FRGv )LWs}$af y:q v鄇gB E HABN='˫9v;0)gLjvc9vSSFTtJKIst?E2_I  a|9U:kHF򜣘NH؜8;ƽW q4gkedDMy|Osא:jA%'uÁq+AE5ڬ-!WYE93Nl4\L8h29RiBr!֚rDRGn33h?R+Fv$Z2×b t[Tt٭rŠuM;iz1a8s>Jxv3-yvgSB ޼̡!9HU0r2VM.zV7Ufr) fRc % *K%`< P1* <15mh&^is*3s[3SlԒB,Ya<[&jGZ1Ԋg]3'ue N.:E4I0NjQQ@.8:8ծpvu ΫjCz ņ)w4AD?>>j_ެ=3-˂*l&2./y>SʻX{OਔȑY?$7 ڼ>W9vw]X^98'd ӯaf;d56f)/WOJe F~oȠWB'XFh?|?K’X7j`DNJ6xsMV_zeثHTe/GNj=ݬE|4N;^}HSGCzyi1o~+<ʸG -We;tؼˁa\F;3FLj-#f4l= }W%m}z }swX_ "<&ieͣ3&Hz.>{4D7zuϷHWv솉`1$8* />i pTp?8#9}uMHk -~ͣ[&Xh^r>tU|xQFN׻{g}|jUYQ-˼Ob poVN \,JMu!|%^cIeQ%qJ:>mzKYdY07qĐSBH_9FLP%F ijwmn?SO´GT„#\ q%ofP=;Ql2E(6Y2W5%ݞaā-т 0FȰM 2ST(Eޘ&t @r/ߨdY6xj*y-Iv5ɹXO ~Tr;tj֕;ON5wHZJ0rDw^zRK SHWˑ9*Zy=[  X&҇`3ami"\X-ZZӔ5`heIVeJμR;(P|RkD&B{ʸV8KT|x!j{C{k$S'x ڔ%!J{[$z(e$FtXZP0bPVCLڷP"}ѬV%TQ-8e xҒ\prQy`KeN"jb|!( <9ZЖ1r:ac )l]oAo6YS9jt)G' /ow|vIAEZۗQHUXԛ?6a Š良 ._]ó;k G+퇿S;0R4}7v׳ŧ).:2hŻgTxp^ざԡk< Ɉb8ޯ; 5cR *N_ dw,Hkc'/N)tɠd1Ӄj ZU8]Yo$7+^vشx iܻX`;w^v ʖlZVU`Ifz) [aotH'N!h9śri\f7T-"D <0jA!U7b6.X뼆 Y]{sNi=s {-eRXYpQ30LyRVǭAFw̧0z쮛>Ci' /Fqu2rV0٢Zܥ/oCVӏ퓕;6K.#]i{F8&N/7Í~HK1Ϯ$w(;z>CcCAAX݋; ݆ 9 r! kchӌ51ͅ4j9,>kD9 MODuF3$PRD(#O+BqF)̥DNGo4T@omɎ8eZ9یpF%Yǘjߨߘjvd6= ݇k|K:cBbZk/JFJ:!m]{&;n(Cc K,NpB4ѬZSF?Xp'e,KqU0[vx[ħB-J@K„RS\'M%'l晫v17fܤϷ[#QW7mX5VgFJ.R0ǔ!oܯ",ϛOdA J<6Ș4\`[ymFz-s"luTH_ )xFIp$4dt$Bj5CI(^1CpS:T'53tHVeV$-'AHh6emlD{\pDb$𩾚8@Q%x =;bJVCdb;r۽ZNXLrOI{IRqj0rhqΣ$2hPy͑ic$! "% :ҼFyý3GLѺ뙏MQ}|>}fwwEM$qgo*픪(s:h%ƥCʯ4j}.PD'esOӂ{:')G^V$旂;G-69 }dd1؛k&;I۞(7_Nf LC[$ˉInST >Di}6n>iPDyEuDYpW}JdyOSgфĹƘ4uaۿCЅ q7؃ԅ?-uK-bHh:1>T k>\3İ(S/^6>{d-g֒>F{ 7yL%65ʲZ훻{eJ~5xݭ,B` eg18|9sET2ռ#/bٔ⧻}r:ǝcuLVW\Z_ C ,8 E 9Tᎁ)UH>1ƪx,y/I+㴵=C4'9Eup)NkFr㮓_ \ ɬNN|(DnuK^aW1 cF.<+@֞X2SɬI$i֪iXDq*A8 7DHIcc*׳E? [dqoW3lHЋ_z;Hi<*ؒmc0v;l*;C q27,ǯqD33G)1y*cIF6FQ 7&Í:2\!kz U˔y) TR\xOWG5&ʧ8 L+)Q WX\Ia1qA? 
[binary data: gzip-compressed log archive content (kubelet.log.gz), not recoverable as text]
˘0{)WᏴQ]Mf~#% \|Y&<+Dkps^WqvA5fj޷ja6iVTMϽ~jl@3CT)!j4OU?_CA[|s^}nK_i_%^6alnmQ;U竄W }^ KM=!*hG1 ѮDc2$1g^dc_bW j~+7|L@Ӹ2j3>Y9;c>߉azdQ4cNbi[,\ muJ #$Nzt(@J9tW1A{:8L%F 1HuO[ ,ˋfUn0̏#i{$O]s{$bL#EUUML#iҧL4f*cLSbE4V(WSHr OFnbHJuxF^VyQG`bA1'F L K S ?ލQNc"ӝo;`Lr׿EVΐB WAC5\;QfcK.m{OKpђd>K,X?ּϳ` ,gղis7oݒqLSV/c|8s$.Z`08ɾ qĜɾBKJbu=iL0!iJLk5׏fc.0߯_-IR̨P ZUR%ď? 'AÔ9P_8~HfCrٶ.*^il6ҕiʨ)4ynSg۠`DzS cz} %iq3V gx$6F8f =+9!o սR4֔)WUxǫhR!8NP )%L)/Sv(ǔ˩T2r5E(Wc_>ЌFw:|>b5*mNfˋ/b8"Dß:̇zk8Թ]:ga5vm݌|h)͔ho##4-Qc602~%/ұ5P@T넣88oH #Dx4w eesP p!sx!"Uz6v[y-F| ֓TGSܛ'_A 96t 8KOyՎP!BiOIF j8d?hSV?94Hfӹg7~1shj&89{rY㠣*CG|4Cs=<^G`ݨ5>]*fvo I@$XeR1#5 )iiJ,jeFM>KfgAݽg:>3s>]Wsy}>:q1:C:M,2hKTZ:w1ooqڍ#T.xVÕ*ORżEj*f=*bL*L 䳩RY'KϜ _9PjV=Ya_W$Bs|&?Fc$?G7npch)ҳ=â_>zm ,3їrH_juH"wLF!;/eHX^m @n)6dȆb%9*:A@U*KsIMTWFi, Y'p0D:2@YZB<+8R"su=T8FJ-#5ϔn=չd;>J֔52չL\؇#ke&/CQ1F ![C~xww7[Q*QAQRnBvY6!""uEUq*$w7~7Uŋu?7@،uNm nۥۼ"`.^o/VcxZW!$}5!lRγ?ll7@RJTDnsح_^ܾz>~"9jW'?PӋj_?5ݚ_o;U/.[r{$ez?"0g98O/_ޙ%<[bN&;%U'Ѹ?t0OZL?7ڝJ M>Y7~QFeP'^S3ra1 R{ChLPQL(gZ2+=eYc6[k>VR_>Z6aæ'鹍`Riv(}ZˮJJ\Qz(<'{ 'ҧJW5JY(e"R)I)/R+8ZQz(R)OOI}ّhT(=GrRJIĒ(}JGRkƥRbUW Giϊ_lz̻v 8-In .4EI/{3S_3͉̰Ԛ.Z~.Fzù|ZI,ek|Y'K>z/Hc~+K?%r,V4qdvXCu37(t6J(pފGE$gdXB&d;Ƹ Wr*G`d|^J6䣳N:R|$TkySRy(:V6y2wo^@޼,JD癢%եRRǟT`\@DT9]zeJDW ~ͭJjdsfiXQГ@)ʒ<=ܹh+s*w,&Z|4뵍uO1Jz-4{"cE)<2bL9"`L<>WR}OGFã1d,mtiR=*. 
AweqI42t+#$bbv׆=Ƽ̂ʃ"<}#b%fC/RUqjqS֌0gAC@#QA'X0nDWĂ0%ƽ~Ml$#m$1Vǫ^Jd2Ix>{IAHzW1H5=tOo}0LEZuWQLWy,Gm 8=T 7J,]B Đf9"{xqݓ-zG 5GIP܃ MpQECIE`RVXC=#U&j,%īQ1K+<2DC(rEiE w(!3(ЊpJ-p,:$2#/`5FsWt&:5ՠ_j WVd"R1>`9⥃2'a9 5#>]h)(Ts~;_/m%ޤ} ۧgv\vWOݫk4*w?5+jV+YWJr?~:;9!_oNШ,WD~}< c-4Jy2w[CK T aD}7[w%}>E$Wv}/UDPn,Ry%TE!D|2>;Ii;V(tv:; .#^.&ur<"r6AԼyh CAK88-PQ3H{sם9rtNj'('+-9Ax!wxw%b:޿#Ѡ= l⨮ZuĶ*Ae=Vu>}§"iI|RF;mRO@lв18ߎjsKqW$HJ𥈓:WeL #7;,e̢*A0J8QLsz1nŇ" >k9o[{NE֥yznƌPŷ }𣹫ԇ>i}K Y7 wgE,I\P" LFHK( "r8uH 21\!M1lʪ3 Fʷb!׀90"EITsN9\(berʜ<>>ԊuElh?k(AW’%oaI(EPַח _.>~}"LLZrDw cTrΤ7H)`r鷥[&dFfY,Hƫ3E@Y12̃$o|kSrt\/gLf yHSrl^s_&vWIYRsUו9)JLہZD s,q4Hp)4ڏhJjƹoSl$旿8 lCrp5:@YBv`'5h |k&Ȏ^]$ hya8M\HƩm%a!lN\?&Ċ'LP!Hr-U\ R-u9ӊiwZ"_}^o<&iwE^Mh4L__IdV۴:6qX]]"wWK\UhygΓ=:O<٣=ھ-/o*++ 7[mec@;r6轒)t~[M~tSSc GWw}LIȡZ?ongK;a2,$.F;>Mâ4/0|UYZV:e#R!ceⱪғHZ{B yfa`B-0&s-pU!(qBuFM 3ʜ`:XN9#Goq4;mL`A^AN͆` ѴnČZ[&ܝtrVΛg+͜+P@G#7K WVX5e^וuRd> L497}?+;+4`ARiz4cw&Gg)DMML&ȕ7Uʧ(aBЈrA4C1C"|TӗS4lur_6 P4:W_ c(& JB/R`M#O kR}"ۧВwOS8ƑZ Z~w})Lc 1DeY~èA@3 e{p *pH\!/040,FudOmzd=B̻7:3_}X E`Bˍ2W=c>WրGO>2 ľ#TkhI}z+!^&Fw%p`ZrER)h+ȗJ?QcB= ZyAC0 x^1'iv0nu!PTTX?234s A̞yz;97~z8wA&xI1 ['],%~Q %ȗr ~X|;NbE^ۗ'ݸ3ifdvyӟVaj'D-d&srT0Í~Bt ēj`etGU&j!9jМKd 1ye-+kiTF O yށȔd>OG)8}`K*dVZcKeU^J;% 0^?65 = [oyb6J)'j) hHihtu[zPL4Pһe rE=@.NFy"r` (Le ǀTq4%Uq7ԑc6A |zYRC g.'ݹGdK//7N0C탺I3f!t_=3?`A9Hf~Zˆg،dpFi GM*Z:5ÔcPiO-cz4h5aV`7U Wj o#]w#Z}^۫ެۮz?/?}[$-\|:i& ez()t*?~zr)'c" J&LgKtVt?KTR>zoInsKqmBksd;bOMgW@~)7V 6k_VN9 $}`!@(?N Ghpt='G%IFGiAu<]oKq߳X^۳.ϻUܡ&ǰZ˫e8ܮ?ݮ=?Nt$\g۩cSAVzGwYbfvQYSw4%R޳f=X7nlJ87[b[WNw4nc"dso֙\և|&fSpHԠ瞱y0zUZJݢ醕 TCw}̍['L$_ά֚XK6}|ISMEfr+DPl`2g< d QblZgV=Wr|ZNFO=hY $h(=6ߟp3!Cw4+ґ^H_ 42ɸ%c +'%vK6]jL'Mƹ(uuD+m>:I8Da!4`ҬI FFmB@{;XZbt4')[+:5(uhҊ81cwbJ{T?2*FuSC2{ `!߸v)ћM ;Qχ.CS|hvχ,7N6E3 &9w*hT6*Br5\DgFzoDl*\+*p-'T=kA!̈́k/O~!&Skl76fB6 H; 6s.>,qXվDel?$G &VW9;\h :j`7ԆG),hP;/8} _?.ζEZ}.:Ө?/v8V`ֺ@i(O:a;Lכ0H@$tw@]$ 5hhDžC*}&}Q0Aѐk//1raϤM煃~0KK$tyɣ2s@Ap9@<8MI삕Q9#5;RIU͚Pb907ᤔ!TC\\;~)HKtW@+ jשպ sYr`IƗë'+!ڭZP oc=go~由 O: [TRk>JŸ)"(%c( j1-VUNJQ)E%LVHd,b$(P&Vt aѳύ:5OkާexG= ĆZ 
AVK.0nTJ.ʰZFU+`c~e{{q^AÇ~g`>b̤ŁOt^OVǖ\ nʣRfLP4/a1b?;30V1f"b`r~ 273!i]+䍗-i ogoi"]v|foʪX'yM v;j>~WA ?w3}w.+SIjcY7/y`b#m"~6n.O׫7is? ̑TEYH3^22m6b;wfNfmwvAHs2ުG:騝&l9$_n8/Jyb4)F6<`^u?嵋;mgzc92h㌳\@>`pd+a{CѨKt)ŖNbQI xqȔ(ÙD 2@[EFF (˒XIpO_ Gʹ%'6o}~[|X֏l 4X>箽df?.((z\Ŝƹs꛻0W>6aBLuw9o^ŋB%?}Aܶfb =_];(XZ^, YOza05.: f $:'N4 *뽉栓AfI "NfB_Gę1Hzor峋G5}̮kVgEg`3k2`>]mnlk֒`IH$.Ḑ$(8_EYz_ARi>]`m)>.}-{y=}i0xR3Q>sZ:C}!+ Tvͫm,d]؝m{M2L],:*p:|so&ds2<s T{"/n5X[1J .w Augd AG9\&POjHC5%V*y`f+&v•,P1eSߢ*m &)0JUҢ1,Six fm/;um ]!>ꤣv:23ꤛVAvty(QцtƓog 1-U$1Rh)O+mZ}? Lw4 \r]=pə:9X2(bN@AFRsF5}OKMLR5nb -%Z_@笊I\kJy/ХiG1 D˥ۅi,szvYvBĔxve\4EFB~ ȵ;^hZ$5."Ys"s2zA4(r4t4A4A4 {D ]']8LVhR< -"X5 8p0Hw%J^KXAB( +JA:U`/8g"9O؊H}]4>n&dêr1B(n8>MYw-{D}Loygx38_EZ>ZIx ټVe-EOgqgo/l⧚nnOv7wvYQ[)<+%miOQ}F50"s#Rxų0ոJY´i76'_zV*(JU?+}u`Sm| (< 0a- )c4S$3 .JVy!FPl{S( 1>V<<1 N23Đ֮:S6vZhkt1V7\UfXU /V7+jj`@o:Y{&GWޘX-$Аh7F,oFB0pk@%i|w{="•LiT b@nbQPp}b;-.^HV:6ŝ] Ϫd.?馝n4/=_!'+=J+ *-ӝhW #=e \LIaV4Z!!:0kTȣg|ec-ys)ܢ$ÈJJ)f4\ KƊˋgw (H 5]P1^_]S&qɁV]S%ZvN],8 70':0&bvRֹy} ;~8( oטRK7ML]Ɗ3*ЇPh'Ba1A!b'OK<-[qTc6im>in߀ |;ߠ=zw+$Ȕ |KG'؟ 6STL?ͪԣS`G7҃\rt} Z׳dL&dOH"ݠ \?:EdYFe@, c,XLV 0Ivj4P7vR9}MY$X-nX 8JK+&&Pz/RkqjȖ-Ȯm43`O-RzLRU$\se%(9|vDc _!Xpb_|ʿ g KW5P_ PlyN7 9wIN`סjXdMFh=gݿɦl2۸n9M?ҭ-#n6<`>pcFC\~$x7G'sRK 2x.+B+ ɵ|:j`_ՙwNc͂ dqcDWtiQJDc4QyeyeЂSU-qZm%-Y`~Ir@ovv#-0e/ר֍oh'yZ  3PطSR3kF 8swd viݙf:zf/LM31Hi+4LG],m+ufRz.Y0/ܿnDl6G"e(C\ +|Z܂`RmÃ75`7L$ZXJ.0^b6]ɐ9#lLQuknZ*utα2:fRR[;TD2%%kkh1QaD0}, ՐI5ږ6 ^rj~krkKe:iu#"eWgXPmAÛJMMM1\6L ՒxBiu)TN: h0KJixuɣ *uZ[KcA,$6]i0%Wl/TRvV h1'Tz^ :@v̀=UZx`[!0`&e{NSs v;j]גi҇˺w]%3 0bo?mvFm;m jm"";b{2Wx?ph+=婿 WVHo;KKH&Utލڶț_dSOY纥}L WKn2˕̼UB!C^/-zb> Ř+nK,(oK8huJD8Z`)g̋{kD .4 ]Q*έXw쬴]pq弮Kmo$G&7&# alF[ݸ j V3}eEJ/B1BR#R4&xe*AeZU>"J#pK oo̦.O;e fYZ͘XD 髧/' $Cu_4SJtG0 &}#";_;'az?[LJk2iC>#0f&6nN]ٟ XNRM. 
&y5W\ssLAW#baYpJ0 ga|z?\Ϧɕk~zU^{'%훥5F(s$V0Tڶɜ3Gz<ߋ\PGdy&J]lz/wQM9k#=v܎+[v&02[q,W{UdYZEvl6v]MK'!y;"$m!ϯ~iƻ/?ZIKiǻnYO.d tFl#k8ߊ*F*H->:rݵcK&h';O4ĺ2gteo 5\ROkqK,;ލ"_ݮzY᨝MA YxY"چm0Uz?}+I°0߬ST0pPߚURJt@&F!Sw7+' @Gh I%v>l$Ѧۣ^3ƫe\I'P`9% g\1SU>SΘIr[EklC[%E^%YU"Лa l3w \C%(oQQq<Ͳ}2xTD%21*˔h18nWCϽUeV*0 TXqAC*wK('+Jxh2yh& fx([{mY+4fOԖ C6vqH\ӃCt""DŜ%XBt*LJ[M u %HA`}x% pW7C2.-P$ AX;-!&gpڔWPN[e>Ir;'XjA#0\ i5 OQw .͊3>bB`5\-?0o^ OdqOɆ?nko- d5Q?<]aJߺ}={>cLbs'pCS~EjE<%G^ݜgEL e<,UkC>jfh@ygAXuݠűy"308QVcA7 v +3xa&57yiρ^,䝓?VZv6A]#ˍf6l(쬝IۦgY6JZ+8;`w˒;~zt)n&cv5V2s]zg%k!{.{c yd'x|/p'6aٛBk#=Qs۵Q+(ԯh@`'[2trnPzRQб]mwSs-uc6*-~'VYݬ%n:SGoB.רFs5}ymj-LߓP3|*K?=zͺiA}Gv8d*p'.s>~"*zO*LL6|Ȁ ǍYhUS'm+ 0*|ʒ܎h/ZƑw$S4 Y(vrdQ?7J)n{f)BBXUfKg};\VI S imd_N'䷑A$ tCd#ck4 .X ËDBBr J >|L/icLR]J&!!UJO~y|[LbwKD߻"ܦBB_}(f_?|ҋpۀ@mAwٻHnV"E}:nk'l'uKꪶX$TQ$Eԏk:PTnQvJfww8 1Q`4}zP15kP‡mC]]8 P†d^B\ 7;`slDu;h|$D3{.e3L6CFe)66W 'Ve9ToUy~:pTe/,cp]yxlZ/)Oy:u~PwvM.l2hdiСUjkpotb6b*y}X_R?|RDfIi./֗2!&Lef~'oCD*XHބج$y\@P$iJtGћSI f[vLpJdS;ݴ &֍}]YcPMr6;zđLROvV/E(CVOVN<[kfY锒rr8+'WO+L㺢*cTNI5+'', 5V̱h{8qF\H^h.pqsK<$[e-N>NUR,v˄ S 0eu[g<;v-8e :`JrT%h#5 mEZ^Em]a@Mvj02@!^g@M&ɶsc}Ԍ DF ZϧֱD usuk((g*vH(HG5Gj f6`0قmzm& -5}FL te,IhP;vk*GKkWLԫͲqJ8Q:+kVhzmuh.ܒ*Y(|SmPJ# m'm8 O>XU8GffFlXy"VbglcBY'(Dχ}43HDq4Hhw֭ -HhuF)AUeWMK{Mjضll %V3Jrdɨh,K"nPT jh*0e+Y#*m6wK~\"ʔBOBof#kjvL ]3S@砚`D а#9:)"I3aMM#Z@`J,ḿ{HE>6Xk)|˽5VlDcQPz/E m+.kX -|Ud㚛 A6f ˓!2&#Ҫ##ϲ$i!V a}`Oy:u~v҂R3KF#,4?n(UQDBWe!s?[Lc,TH͊Ɂrq3|IK  * Z*\r#j09͒=JPV([V9|4!/s50>GxZC"g@y '.5[Ef!&G̀\8%l("12jZBLH 3 _c]/2~iFJ\:DmޥCs6-Yal3&#(ZqH&,E[-`2 U\Ȕa gWNo31zz 3kñ$j V',)o.x'H5>N"}/>_} R Vcy (*]! 
+[[TPZ R; av1[xA bVt^g[9R+ ʜ>]Sl;Ȓϩ!f^ђ$!O%*ZRe`抖l8TrXW FFwUPͅ /_O:>~X>kq7o{c<ݶ˜O<<] `g~[F/1z?̠|`ݗ,@wW4nM*Gi« =`, 62)WM$L_<, {֘Q$BVv&2XY6 ЎW6J̅4wNdE*|lijd(G09cILJ8"`u Iu,E5?WKZb>7%IdF=QIYFd8rIY6dc(ymYn0҆F-:15׵X U;t`ivV6FVVF P^I|Th'z9LՖι7zH-"d+sMRBӂnTY5X*Y cDTciۭݞGŒsϜ]?}~.,@)*~N{:V^)gޠP-g?Yz]B( ||+Y{J({V%wwTJV^)Rh8PE~5wJemKѭG2~I:=AR%hfDht0H|n8VOL|.ϥIԲrI7KJ\sdVg(9a(^]Xwq*`dsԪnm xb8֑8>lV5(n7䊒M-%uN.y9[Pd8d!T[#,{2qZ6hSmٚi-T$lK'%<c*KF[DT)w/-hMI8YRV7mjy]"*9.YRmXz{#J栥ܩSKQ}էz7oXK\RZ0O8kRqZ*pA͞KG伦<~ U4r~j3:?LUX:@m;@sPmňs`fX>vq?| ʀ軵G Έ.~LhU-)YWQ!%=ס ׽MO{عN@ˋW弝Go^Tӗ}ND^2yiFNA,,TqV-U6FJl†*u$yU #W9p8r88+]w>Sdo;HZtB2Q{л ǘʓZ,d TҒMj6`iQWeJESm"RH,=G4۪,Q"0VFFP@mm[A`Qc(bd\v.)F=WP@IMijU$BDsD7eU, Y j`KObEvyQ {PlnW#Hn쾩tT|9@+4w × 4u: i$|rJ-*U!梔5H{EMx5_y[_(1ڈЎ4%è&8oD84W7T'?dEUJ+vvZ#esq>^ma}Mn@ViP(L2D-. aP#MHlDDA]").X5T$_x&plO}YbOu}{ q=V1&UA\K2R>Vf:'ճr,$-̜S;@=rw_`-=c&+nW_\tZVчE,noXRCa5YJXڴ7?כr?ty*v!o I?6iX7u{a",qnc޵62@8@p?MlmmQ &ck$hy&ud3 hx,rjrwbqesGEUU,Kmeٺ*]:5[AC\[7i,J̾HEhŻ؋uL(o.i ]ZN{w4]@@GEݲh9C\pC#a鸹5!k n6GNNQj$<+Pͽ=^ڢ۝-zHh)0aMGުpo,wǦN:j\p|yxr՝Ս%O/uU_ʼVk}|b6[⣱y‚dUϟ^YMP]3uj'?}}_>}7_덟$- M;OO.d2 M &95R21aJȜH&~~#kx( dP ]Cͽg),ރ]/C:Iʟ]X9jlnUp]%Sm~1wF|Xu"qη3_H(${_w-'C~Wl}91o4 /ysx2)v{5-誝ЦM~t˜pK# >5QnnFҐ\EtJٓ?I[!-16weYm U4Kǹm In<:3EXӉ"e7aFА\EFaAؕUzG/>u/?fc*Ы̾}C7*Cm{`G&-}xip“_@@ ;nq<Y<ǼagXؚ`,"Xdi:ℿ(XKfW&0]3w;NubM5A7C'RXn2"`C.k5H5a@Ƹx hA $-Ma@h ɥ0 h_fBk:))"?-@_ԹJKn%2g ue@|;,LAbAr~*a>@h" 5b~*cwIcؓY|S,Q" ;( *-T'IXb ڢ1}QUܳ f?UCrէZN0=#(1o`; ?14+WB?|-16d)?uiА\E3tʀ8Q|0,n{۔Jl췼췜6dLwiF |TP߬:9s2$+e< + $/VzVEmznLZ.H8/d\ ;v&P ry;ڛULCydEKqe)c"p{bR>gt~˙OZZ> y=X]}O?eLn8K^9R֨A߃Jz!k:d.V (yI,sqO'֨~ҳZQJji0[-})A)})H>sNY}-^z?J\ʋRf>I^7֍a)\wpWEdT*շ)ata=<橣 k?cf/Sp7o@O\Qݧ|x_L:WٶaY F~eP6;z=vklTum `TB]k߆/)\2_$*+M墇Zp4b,5ZRMZBm2Qdиc+|c2r[8832[=[675+q3R}-|iznVddt;UԶX9Z)B⬨ 8 +Eȳ҄Z ݘ.VzV*9Jl/Jγ҄ZXY[)<+%AnC}Z dfљY6iMbSvJYo@kEF-3!0]+Z6p Ѱ󣤼>Nx JuuB-k%ځր0\|0V~ӲPY#eP(IUޱt`!Fb)$[4PEG==pE7{!왯Cl܋=da/dNMfIF2R C+JY9k .gDJ] ;Lf݉՝!>v7mq1VV4Nee;:*DL6MX"T*MB쩈~T֝kE@@y:D\v2SrdhF1WxÌM]IJHK4-;0o~%Pr/Ӌ<׸(z8@lVEfAKVQڒ_Ō t]qcMrz)gvG- 
`7XV6-Y+XID߼e ,D< Śj+n/iam鍹y3H96{R5mYoY)YJ+RyV:C=XY[̌HRD{&^T^)PSE\r 4,fbלGǕY'G VPSffwR(uc olN -٬E}/J9s)&>/nCAK,-e 9 0eA!,msuUK ʐ1SFw4rY!Ӳאu$AS;:)tb;қU*]9- &2Zt!<j }2߁螐|va{w ;wj1}Kvd/Eu;U8K踩vua$"D%csrkT46堖{OcY*WT$zhXJJ08P*ak@H+[!~R/RϐzJ~~ h Y8e2djä;m!0TU+nֲg䒨JԨ}U!h"⟫IƓB?E;iRvfo_}|Rڎ1 \ݹo ﴥ= Kh\mϤw)5~!?}mk/JBU4ڋB4F!.77nIqRlpuq3 OVTb-ɽi\-zjE!vmd"4#,q6xԟosvԤX8FP_F~ `Fswe[\m^7,:Us3712X9HpH7qZ36 ͓>o*wEUr]uT8(MlܴY:[,[+Fl,Fl쀞-*B=qݽϏ\fd;S9J TW|p?%O/͞ PYu!+m..h&4HZ:~rə}0~m >!ŏoрdc[(#~#k3® .2U!ڙ24pP˺ΰT$ǺʕZC Ƅ6W 벞7`%ɂ*υ /AA2v p --÷טbbҜxkE%֚E@FyeIQ J" J3ZM !EGS\w#BMv c82{HEg1]jl_ 0$O(*,#kO]V}"t'V_x)@*[H- b&uJj@5ĆpiƕEYۛo$ҙ8z&QqMDvy!ӇM?|c\YM%$-gtqAډ y&r HsDԌ(M3۩|yQԕsBd%+GѼrt\L_fgTzx7s=jm񲂃w) VaS3iIZX-Yoe}~6*bm*tXp/mWۉ.m@~|6x=^il;F_lc|6=WYisЃNsQI8Vu`k얂#B,j?A߇-Fl؝{שsNiO][P {KnK#vq67`QRwƶ TGӅ[ eu?ҷAF3۱z!QUrͣa>TDbeeCAURp%VsYPUY“ZyJJ6i{';!(E1y :%g@3"DOբ`| k;}W{'̺l7vG/'3y0])n7x5ބBm*,I17@뇻O6#0ejռH [hX˥x6Z`4c;dUW7`S:^/|Ő E+(ğL\*3ug"l,rޑY43bvmΗ5x&v,$Iߗaʒ*JRlp䔋\"'LzFL)E5l%k5 AvolDBbǥC",3hվ=4`:&WW"nA۾ Մh[S8F)&>5l=8wQ G%%Rˬ=Uƌpx&]5_;"P0aJDW Y6pKs_g+@pʻNF0;hnZ:K~-i;&f;<nݡ#zjxe,TTt, Ξoj%"wgPZi rirVFriUg sH,'(!!\DdғAM:K]v;xpRC[ y"%SLAPk~DK]v;bx:vJH+цLD17,\ z~4 KA5AWύ E=?ӡOՕzO*3Vai F?~>B -ƵV86@db];[p>,a[7vϠ:&3'iُ?.{ϿT>-N[9 %YЩ}-lOGY$%cYؙ 1X,d.n,,5Y:ԉe19s0SJ*/1vJIŰ8Uf0 !jk3AĔ1 O)f66Gr+,a;*Sq$f !wάB!ŏ|n0t~qf"8ѱlj}yuLUv'i1p ]q1﬒Q#3gMEMt2=s dǯ?ʔ2JA̺{g[WԕFU~?u sfa hu?*%-Qy5{loz^.ZWih0TZ\!7IpgC vc,eō6 .TQ]w[j9}!Ryt&V _68j{-kWUv1yf#(vrmF<xےvٞ5&Ǖ'mdQt2*[הbpxネ )G"!F'gBM\ gl#fJ=3yM9{X*̞_7pxMwfRK0wx0 $|9LfpS!G?)\f1t IFͮq%B납ΐ)n@hW@z9:ή!I +'"CG$PlCX]YU'zIFmi<k\2mTbG̡lh阒I(v`p{Jv%HI  "Kʞ'' ӧQMGayܼ4筽CrLdo$:9Ecs$1E9;sU4t93Ὕ3o/l99mw v F~'%S7G;l~jpm/ۖOY^gꓛvD`.C^!A~:ݽG$^03)9CzMψ]?6Y#]ry$=2DbRpzk0o۳s $\\OhA$h"th)KϽ(n*` AV`_nv=NP"ɀ}-9$HJnk$j0DCd)P[YT@ P l!y о漧o6El4"P:TpJXYL+l_ٝ֫{mt'brnc*8NS8BeD Z~ɠ]@Ԃ8Rz38ZvG]r6<>p:/ I5K3h0 jz*$/NwUYlm%UK!y9IyÞ41,)؈7xT* 8#"D;)F6@) W yCfݡPA͘Z. Ozr0q0n|/gjh.Sw1y?߅pvl /x-Wkۋ~bE}f1 ^(j+=_u'V!+ ֍M5v 7IXQXc`UD_ MeL:Ϻ| qoM]=~v|PbW= tk\rbr"? 
OvlYaDy&ņLlx|txkh6:GAe޾\ $CZ͏؁% |BTw0&#v(e-y\i)WR0RqCS@g}poy-7Q>o|D f:Te"ᏡX9@Jꀊ8g3ȵbbo_\2Zڊ/rX#4Eajk䒈΀(MY)7 ZVPk,ĂbP8asoU:) e4&2j9Ƕv.Ɲ+ە}RUkEPB՗;$K0oHE+A۝kEf̭<C՞#@%Cie5 2XU&n]bi+pd}@i[0e ßvϦls8]԰5 ! C.MoG1@+Q9xEĕW3 0X_>7}VҔw7KTkm]gqˆ ǨA*iz # fG[;|r׭ZrÞ^<7(s5(.i/4zCDQc4}*75ZoC3lD j@Ũ^SRpHxXAjUe**Ne+IyA-RJ#0IK|@?Dtĕ=k/RLdZ4Tщdw^:4CevtOGO[x5:ő D`Tw)NpA=}W~LSlobiizZoI#[,.@"#pǡ00I7PVŻГI:%6 pXM-jܴv\$g>;c{*gj-pLA_\)*! PzW9 ު. RPtS i]RK( @h[RH88=cBޢ`d+Sj,j3˔t, ȎԜt?#{8'֞@EgĄ*cBXsbM,RIJ*>4D![hsC2g6^Ġ!$OF'yl{n鋝0];LNF{]|N'cY}xd<Qn ی78>  EΆ)ځ|N3Ҿt:,"5C0R5DF޵q#29n˼/p`{d8ξBFFNSI=tݞ_*4)]ۊ)ԃ@T\w2ǿd&4FЩAA>>nj' f,&[0`v lnȋo"X)j82Q$&Dqif@ N9Bx~Lu;Rm6VQtKw.-sJʵY>KU̙3΍wٙPT9p@]!BdNAs+Z4JWio-dW +>iJ/7saUWW]\JZ }[ 9>6zs$^Ԓs)Ks.jKRWH5x fb(=PEtL^24>>~b ۸) +8`B*sftf=D¼PHH8J.4.j9Kq Xq 1$(MfPDba9׌UlTNrJɞM̵Bzka FNwZw_NIT6ge cB&F}W $ITce{ʑ85hgXW9͸wvȸMym .zFs> Ќ>IngkzRп:jPDe 0'flCd9SP* !jA1jۊ&VA-Rއ39ӓ sN8>g"1!YIc-2iRM(syNGJYEjyaP T`5x-y*0IXѷ7;~ 6hwE͠K!` jǐcĶxλZ}l+X:^v voo@]rڋ@v%j=aq+!Oj)016b9%4tG]{ NJ FGY 8KNRYґBjv;8|YE 1Qȶ't6ժu?ж3j.88{6 Us~J_,ֳ[D&7;&sU?o򏟿\2:[|x|{Qޠ:TGW='s2QSx>X 큵] ) pMgFaK6@V*!` B,$SS+Ԙʗ,+.AF[zga2A%3+GpV1UA(轳$4R-PV2 BG%yTT4iHԚqf"cC,\37ŕu&/nŅ£|ˣzX>_fcq3?bE~}?";QDhA!B K}($`*8aG 04ᐔhN !"hp+#PyE,F Al җ n~~eS}Ve{2W0M4l '9'8uxKSD#J;FfFH\#L \ <~{rOBkss9FÓ]M\[1 [HcU(^#EyuCS{#x@lń{=%{Yi 1yq1=»ߛH UgRo!אX+ r{O+̞@.-03veaIa@s)NP,+m\+^,뼴Tf ք/#''Sr(:'S6OR'牋E2BjW:׶5gZgUW\~Yi`mc 2Tjzّv9upBl=rCPI3U8FPJx4MU Ovb Ehxe./95ECcH mhҫ ?!`&:[=Ȩ[} ~j1P@D/8kh%"9U'y3at4'>'S2j(Q"ѰtJ*m0RJhi4omh(`|hʃTO՜s1{ܝۡE$'q7% ;(nzCNuak#}~i?9?_&_'0b ,Qo~%-b Dp#t"`2˅5"_7ƫX.ebcfBZtQ=8)0yS6"YI)* =!NJb $tY^3.k Rƻ )^RC<$IK@z鱧{4\0! 
H˂ fMFdH(wY%zoDcP…ShQPTJ &y%lr/luѢt=`ǴucVsm&?NǤB9F` 9t=Rm ŵmÕjb7\-̰PLuBm* D: ANi<ฮQaT>V0ҥG:E8XՅ+"⊫\!DD9Θg"Bv090u  ?hd.@LxZr"S.A)M9 36x=:WƚDZW2Ab݂z0iju|$&2+H(8>`,K)Gǔ:xdME"Y 6NWal*%5MQq1d*t:VaFh _O R 2 ?a3'IgIo k# <>دf4Slr\ZL̙Yݛ vsV>!TJ,sV\xѸ^%]q=f9WIR6&T+aT6&zP ](t@ 孟7&@ŭrx+dOʅozniürS|YY^Zy*v$ʗJDŽn Tm:u,3W?/> 3[6V{ +]*b7U_TL$1$ѐ@nIS>٨{٢mJzxіTJZttWГdgEW;⼚^%޹tͳk_W3]6ɍOY"d(s'KԊU8hdtĀiR8ăOypơ&h`M\N?ߺ<>:O&6>@jV AL>(QQih)h`?F:ך"? 'SzHjFO*֙$+ <QJF%*+!sOўO?}[:Fl6%0ϤG}x{{{ʽҍ_ rg#(5tВ"5KYm<*c=yU"6&9<UI;8xNޜ[yˬ"+^,tUO-}$N 1 ÍsV }'2Rr[VsPa!\RIW[MhΡwe]'B='1@xojHጶ^qx!kB+jSwmMnXtekFj\Wa*vIś<%"֦ꕺdR ub$vM|8;X8-qa[jm;l|f;7߹;tSPld*U%=K^+ZJW]hb?Nd1UԽOMHmK1qMxYV9c )Mg NQ$>ÖOo1 4ڄp NۯW^(BFD_D% ʒge;cA- $@䲏g<:@fM]8O@aJ3J( b#G9M(*D236C\wKc2ޖ"RfeuAӑY#Egekl =N*7h{xQ>E ^ M5 RAOXUɪzU RNS ε{!3&H*_>|xi7.^Q  n6꥚NKVVD+Ǒ}psBR@8Ĉ_apDs_t[DZS( $8:*fg"XpY {T ]#!(TJT&R0^(S3M=I(L9v& Fd<&K_aQPJTjJI ((a qUZJF"#Œp&pB/joFڢAܞߞTLwumQPH&&PH# !XR( $Kc\ ![ĵ)rV{@p`a?QwE`ֶS.TI8W<\|}/$b1nts;\gov {;S) 8=oi1_F+@L(WNq>ʁ}C,/vQV(3t0P5&y~~S7UVmB^|6ڃWǻW/L {?>?Yqwo0,xK4缏k,]7rygA.l O&왍Dy"}&@jA R,[|2XOr8ͭϲ't0lٰ}Mɤ~׏hBi~]D̪\po}d̠SD+Y~,?0?QCn/7T#gN! 1շzJ0ƺ66oM9αFS+bնUWc\=TI1`t_D!J8ٹ 83k5Ɖ "{nW|=đs! M^ CoO;McӺ* ,J iHW6$)6ĊSbL $sZ,gKw2˧}a|p`&i}IOLQˆzR0&*n7NhE {ؑ))cL?}P2E 'S|= wghp LgqrV\@:{3Bŷ6o=Dgw'=OgpJ)Lݗ"lޖdw7AHPmS6Z߸$jxbq 3ͷ5LZjw=3-AtY)^ܮ%XS+t/7jծ\N8M\[5yh1t n巹?;8N%Rz_=tT%!_SCXwf1cnLoX3N̺E׺!_=v,y:/}|P+tNartݍ-p,r 8V hP&JhI'W|M盝y~fŷo^??y}~kײz޼ev,OӘ` ̚rG-fN'zwjֹ'=SD$&ďRӋY'L?pZk!U\:1RwcARFX10]R[K9Uy|qhs"+Rixfƨ!d\^@-#hYŒօQNV5w[5=F<$DB! 
Ux&@%!\Q#n"rF~+ǥg* R*WUjߺ3(2u]| 9T^?,u!jS]%A5:@aD%ԽֽJSa E*9` uKw u/F0lF%#h!CMdF_I~<+}|VpvFv0I,0*s LZEQ`sQLg:U:aIedKdUΥex?Y-Jg@]®JT"Jct:t1{ ڝK )ׅp]$E"v qamq3{G$Jݩ쭱=R?|ȖVG2='<9)۪y鮛/nw|Y/JVr=ƈP/X6n97qҬ5shK'%2G<Ȧǻ24+uL 梪!J[ <"z5i1 {:Npmd䌤:cBr*'c}OD֬pPvNzӬ0\hЬ\OgV ?;CWdeVx$;PȜPP,Q%#x2Ah(eNQGfMQGTuD)qrEgtJSkYO Rdn|DgGdQ'bTM'fX SF(";΋4fPRVPX_DR3kKA*NkV0c$eQT;1&ƙ2S&&G7ܱ_kkL*P5Id5F_O1f+Y2-dLv|.Tp6)~FC&NU\R䆧z({ԱS({PdL쥄B) ᥅EG3إ@w\Uڪ+%`5FKq1"lm؆C(rSa՟=Ol|T$!X2W#XSXor7ZUFW-) 3Wj⡰?FҮ#U SHFE]WB!f5_0N$+`L WQużfB#ʚ.^0k bDX*.Tբ.`8G&NםfWW;tc,]Գ凪=jVc]R:{jnW(NHTlV/NhN\,pDV!Ji@")%pb5r5ê⼮K=̈8(9G3e~PR&;9;:ocXȅP oq/^:fP%{'緎VM~L]!U\Pv5TW^T>UZڮ`LRDH tY*OgvfoHۧWda~'YþsIA O;D)e c%"YeW39*!JB1hs:MEBw8f\+fڌ J;<@  x"*bp@}gƠK? ?}x/`I2dw.k9#I'GcQΡ"HQS a jmcॢ^*Z0!94R JEHMlTw}۰X[ՊƭErX;ժW/ap/Sݼv7wڼ1RۿiH:>S4?xM-í;7?F+Kr3[Ql0*9MY{gI^e+ڳqr"En QÐ/m|)>Ξv*{{0Z%QO4C)5M޵nBºb3ubbκJ$nn)f>E)@ӺgfͱDD?{4+ %y0hC 8 Lу2dI Sk+|aI`+6)kN&LVs $Xנ(?Y~< :.4"Ќ:(쌾^J9Q5}!0rnL?].= i"PF0VV>J9gZHBK.Qppb7A觭,)$e!%G>"blli8UuwUu^z8~"<{U(KhM%~^Ry9ME9RcT [.ĖaJ =\R=lgο s2 W?rw]+wJ|%]J~rySq. )Q0gĜ@w!`PJZ)-WԊ Qy4?֑9m+Y&N}|V~6n}e[O_*2_I@?7++~tC1Q쾎P'jOҧY](yd^MC"+X)o?,2=mraւ2b3ڂ<./ӈ@[ r]  vpWsBJύqq~L-7nvSq!n G#A>Y;J~ܝ_e=! يS`r}}sTjhҼhsrs3ɬFة˟,0 aN~W},$)aYɣ:Za1>sPX qczG*BcKa!KEh(hD\ZQcDUpZ)Ԣ+iJ]zٶ#(\劆]wEJ[le{ih.ۃXe􂅸li\=\e. ]:iT a{,](pТ1&ocsxز.YRZ>0".~7]$Ӗ{{tgoC\ Ң@^|? -:aZG`ɵbCfhWRZrć1ćOaȕ\#o3 ;Dͫ szyvִ(ٛ_P>!U_Nrjt;f֓s%LuQ˳஽Զw8.0Ě~a6=25&?z{C j4'w۔3wՓI _.zlhoyߪݳ?׿= =irIT>1Y&,*8*7/!Nfsa>T-SiO^tf%>wp^ 4n!Dc8EUӻ `A(X5aK),+cjYsApGIDZITIw <3Ym7"f[ z+{{Yv/S8+ ÿZ&%Z+N ƹE]d:"˘V8FiJdr>3;B5rk9N»wy7\2wV𝫿w,yRdV~wખE|p[ΚO}o^=os+ t$o t0ﮮ8Ҫ3"̕J?s臻k|> H{iTӣ0W9w^rK5Z3f$i,%JSrXШb:%+"@<@e9r8+^خQ̂ +" c')DKKDd: "LxҎxa0RlHt_\睛Bi\N~}r*cύ`hnN.mpͱ.m[*#+|}EUR9Vnzr׷ZvCXP}e]U!*np_*Z&Р}*~OH:T6A#WL`FA@^nʠ|y>q%+@\"C"Awra$ޖoS(6 xdl5Ub3Ety-($8IIۘ8PBC?:,舗!/ /޷^16:B" 9?h8rj !% 1<>x NeƤD C9Q`P6Ox4MRoG&ЃhIc) ZniZ8yBe:) @TEB7yݩja2-5ѲqĆh?IİEۯ_\|~KhfVSƜ ,wc7ϓYS}8i! 
RMgH[='uLz|򾬮WUfo.뵼iƎP>"D_~gV2ukrsI ^^/3>u]s.}'y>{qد,n: _?Q6!.1c4 J+A:D'p(Sq z ix89{.mDg^Xæ&:ոHH(OI ۖԔ,y_ykjrhs@lr(*{nf[8osw淰rZSo rTۊD۹sRw,症8{tv4uݸz,oȭ6]ӿ:m~N0#=KUSPY+7m - ȍ۰$?#(LdNa<ܧ啼ae_zC=k,Qb_mDz'z).`$2;Β:mov`k{%}W䳹N/DLJJ,%J\%gZh©|?PgXРZ;0V*pL.{MX䭻-ؼځRnQT#N/H<1+8kl"hy Q{bh dɣ[OI%5'˞[Ts@$>ZUNLb`W~Tp}uMW8D-S1=z`xH.w^4C]-q5 lEw_;6A2K RQ,1U%k%_BR,!Mn~=UHMPgc@^.t`4b:X+fa5sZ:kC8TePr+'䋐(jO+PW-(-1ei慅Y\ Ҡ<|`ȧ\\mjjIƂ)Z8Δ y72EB:!$4OFAy2Zգ"mP {/SUDf,!Q2g[IIR"D*1F1PMDhex(yCLv-:#Tȥzۚl3ZbB1%:/ 2֢]s!*E&Pmh'|J$u0Mނ >dt@ PKFo0Ʈ !TLAobPI4) &16H/]9%wqGWF>(\#vnђ#6pg'RΛڿgUĈic$SR |8ߧԨG44Bh%+gHz!ĘrAх:{)rup[oӆ[M$bŌ) 4!dC pv-]>ŶMmbJ1>򤧤~Hj8?'$(ti7-v}OЩQʼFe_Ng/ W"z+W"X t] r%| ;#E:P,]=fC ~+3xnw2pDwD-pO,6}WÛ軖_eÈ-4Hض_MIO]F >x.]m?#%0.Eo;W݈ *W7h{qX:` A0cB4PhMh .66`/4psn /¡:Bȶ|[K TS]|{H7v HĶ%%B"۶Ԓ } $}1(ap]HFjhdt y&˦o|W7hrխ/%;ntfTmu_"Y a!D{T#q4ʏŠt~GQJ]nnCX+7ѭmJ_Sؓ,y6Н ,=¥=K K.zmnTp@libػM姓 wF>\JPϫ }HZٸ6+V%|e֨郿'1F-@vk/A֨DL/E9Ot8C` 0&n(=_:\G)u{5+AW#K_go44 ;^c=' S[O,I\c(/1G[fHo:}s.G?kĬstfZҒJc jws?fz \Pwv:|7|M'lPR4N^baZ|q`zBW JlF2J=(p?6Q۽/cSK` vWOtb0~ M92jթZ`9@B‘&vM[x6G=|c&EPԷ1E qa*gp';`w^ 2vOwhWvv:s8]YB B67V}t?HT_ˏ>ioҧ2;ClY#\Aj.5g]s<)$NyYO $H)gCTPK q{%4k-]]3K8o3cwHJ}gb*TF}_mF!5iU`h*}c٤ o*)`k cRk ngH fD<4-w"^ 2!!,ǶMkY{6>Sq7oI>n?KcpDhɻJEKL_A3Nv}A99kJȟ\w?e=<亁b:~5KvٹD~C8Ձ#'ţL?h^nrݡѰaaK gnmJN^;]88:8Dl`y&h G1+!$љ3c&,}:"8cǎצfIlQ('ljKy'I)&%3yʾ7@ijfAY+~u4t#ė"6:mr$>0ehN;%>}Ԍ(.ʌ*mtW[RJhEJXJMf#0Ӷv:-'Q_QsEJQJI)Ә8^n}sRyR*ۉ^L%TrzcK/RznRJ'F4RC}GM|9\JU]ZLE㟵̹ v5r LGoa9 I?A{˃ LH}ɝz]μ-,騹Y}`XX),b-QUl7o S;CcFL<]إWH,;@ .?o(hK4$n͢VKlzet>||!4T@tX (1UU-j4Ŏo:퉚z2m 4 gjza<4ޅ7%gn0&yʇ{NsȎ WrDoL)_/)jMJ(…D="Y>o|$Iyӕϧ7Ϸ [ Uf]5Ak6xmiqv{ZϏ-ފ?މ5z{cq)o+_WCn|@@C)`oz@謜;cn3GGJ}#tFeot)g;e;th.h XPE5ChFI)Q1򘽋r)y& h^Z2ĐۼU=v^)lt%5/eBqZ*AƘEOE =мE>Zn}FSQ|>#Gf%Џw;:r %APSNT&q0y,%t7Pw/ѳ K\ڛ,p8VDz@ N+trQ,5߳n!5q:8Vo.MDD}QͣW7jqO_ <ۨfYݗ{U:ۨ]M&T j'.q-$)0xR WAsްFyUvE>WX{]o *0~W ~mm񠢋Pkщ4XbԹÙ 'bvq<8=IqlQyQ?qCZ "R‚X/~+dO F[~];M]gV1}|Fjm9B7VB'6FoK:uߚW+cسYaHOOz ٠x}˜L˜v^3L  <&&&ˇnD5jS˧F2SJ7f CN ג'c?4*|2 !e=WCx&N;R/il}0>Xd2HԤthQ &GQF;Fn d6zcbL=bmĎƪ ze2 
,5u;alVhl!e-v Y0y$7`Dol>;3bGYg76naL;(Nfsn7ѩO}>}kzit탆H) 9`i Y608ǃh7eje #Vj $t>}b³օЏX""u R+TQR?U>Z]_CzGUgJy[,;7s,XcΕ`.1>²}4%%f>|zś@hpA5PሱPule竊t\N~kxw뇏ΜnolbZ~`9"9;qތ2HI%6X=A1Wgv Mq@uEFui= e$O֙ Hbr`]@] f% ҁ]d]wm%,53=(U*s_UbLS4JgAKb>}_{MR ^#7R{D;GtHڴ+?^wF:kXu _ :B9LiVLiOkQ=2 heQ-< ԧ\iAD㗩DE4gAHÉ9dE_1g2%B ɲX_|nW/!VX@) Tbr w\HBF hpBiQ}w b(,.A߹1o/$5|sqw zSKgi7#5 }R)e`ܻ]D/eZ(Qg>{SͲ9]-f kJB´T9[ +#=3eO&;e~۲,&/n>gڎLbYt{Qu.o TCEpbvw,~vr&ʥo{|-;(!vSi@ŝ0~yB5(Cm)T{X+JchG N+%X0(ˆl'"rA(-NwbAɧu^۔nN%l5UȇzeuޙZag_3|Ŗ>n:٫W-o_5[p+)hՇ9,9ՒA=ʶ7kL~ZN`G!ޏm˫k=\qwL+xݑGߗ7~&e0fܞf.|'cG37X#GBD=EBI*i 8~0VE6DY kJHk`Ԋ(>,t{ hoR鍴YHڐmC}hmre|kڷW'Chqtٶ|W׳S^rՖK;^O֡A.nW$*ljxԼpTt`[{%:倭?Ƅ& !5)h$䥥v?0_&|#v8b䰋:H5'R/NSp٘s%;Zd &40Gq)ګ~ץveyӃn~cNH;Pwda7JyPiu z K@'uKuT8R#\ #f"%0ErkXJDLOʀ"=`$D ^| g|A?)R`kq8̈ې"xɆO 4@_NT[5>묅o\>|5^!/;"7mGH<`o௪ܝ.ů [E3\% R1C>Jc^3|RE}>ZjWɇcgUt1iy#FLw5!Xx򃳗-}?d9Q07QvXߢ @qaDJ κg\@&Pg=w=2y:t4zz\=1+C~XE]n Z8l"KG!WDq*{XcЖ+pIȐ}DӬ%y10 @;#ӶikҤ`tפҔꞪ~D.ioYr;7'gEN䯟#^ZX n. /؅ſHJ&H=!EX+4yΤne&0>iF( 1XĢ@xK FDFJg$8Ňh@))͞i[!M^wJiG{qTdhG[?dݒ&ПYk[ >_^c|b_|yg<{\VA"eAIOSt9yk$sbb,z=p\ .8.Tɢ#QtD\E͕Db 1N 1Xp b(eMEYW2%8Cb,9D": !XdKƄ|JrY1n}ģ1cYcCcSbJu:N 1F9*WFy'32nBٺ8Wd$ˢ"dT$U"f-%鐑A )y' yaCeg)9KV"(hh4"a(} Ms~E pp)}pV #G*̉JΈ΢r@Ѭ5Qgv|7A !Dw-A,y:Q5ju(6PE)E> (б@NrP : <-:Q!h#AY/3Vdc}C }CQ[YfIœuE4ɏkbeD,QP#u[]%bU F_ymB.|A(6/ G[Bڤճp)~wRB߰7?{WƑv7C}Ї owcz_'L#1H8zHIcAR"D8UuUuw !N}я[o샣$DFު7/O?t6_~4/)N?=?&;%>ߏ| p m*0=}w{7NhZZ0 p'EA)\q_^8Ky!WQx<Duyqˠ |DI!0JBs܅JL /G`B8-NJn㭆]2UVG ߺH!"pœۻOSU"R@ic Y,M}Fs""7R* ƈ¢%V@ZI}x!K ^@m[ ^"W[bA9%z}H:+ ",{RRAdƧjyrC\RPtK.A.xT7BjRngp%Zf1#aK-Zö(0-:!2DȒAwfO(Β쑆s?Mb#`iB]Ij  ~U}@┋.!DhdgjXݻ]An(G5T<4VݭN8;wE|ٟXxÊf@ pw ^]o2x` 3`{w/_|NX"sf2o+y82_vlR.5#%jYjUl>yy-:>"{6 %-_)f6~Q؀Gkd:K~x=Z)[ UL.K?W?gӿ߃{' ,+|{0ԙ;sCxJnH by Q~<#ffgta?Hb-7TL ORcVDgJjϴv!ףIG9}[[sbN\gnS/&( %erg oŊG^}[ E$"p[O_ ,l,a$eISX &H%,hru;ٺ4Q i;SE!Q({,bh[ŔKz~0Zy@z3>g=UӦ 4Sz (' `1Ȃ8 s`2): i@+sF/cŶ-fOi3\%sUX2Wa\L1W,K%c*9ijRlR:#YiK xrF /s꾨VXݗ- Q^vP14Fhј GW#.tsثjc eb((C0u(3DփKtZp14R!12"T!.c*ÊҔHׂ⢋B%@FPTPW'Ba`r5@T3"ԗ\eP Ժ#KZFR)4j*%Li0L ndH!a0 J͹E0 
z!)K<4}Vm56B+=hϦ NT8_|Mv ۺ|ߏ&AU-}X~F&$S`^c,S9͜ϵ*U.gOa  3XmpE F+XiԞxҨ(( %eF'XҔ* %CZ)l85;-Uʑ3kO2ׁQsVNh6G(C ~5#w:TnDlRhNx|@j>" @Ndc<:GTs [Z+EoHoT_!ϱ R0RSin$8N XL{2)G(gQڻ {΄5ҟ%:a\؄yo|pmX dTb)53nd8l}NsEX`"r]…B ,ޤ& ﴵ)Rbn]uYʀ01`x2߸mp MPA<#5HJ^=]'_ȗA6f;a?;qjщ5ieԤt3EL3-f(;\SO+);일r$ z!%;;ŀJc+l}'ןTv%8~Fhvm_oHqPkBր+'M (̭ؑ|s_n_F KQ>nȅ&b&K6,Qoa*.M|~`\򎣆\#j(fC#uw95:XElm+9Uo\urqPW5j`;%$q<_%9\!HBK[.aE]_>t y8K؂U7U匶۷D;]?7ٙjd&Q1:2&Umn[OHqnιu4}ʑhe'ҭ=W<;n $"Ō:cWWBӋ =!+!Q<ߕFP[ tDLc",TZ3L9_^F2ByDLqcRu] fPX$v?Rߛj/,JT˔LfyRYF)1B %2X !) څpT )(޼!RC&_[:!MZ C䱺H^H5F?ObD]3#3pfm'#HϫI<_jcKz?-lg?Q Äބ_06PX0!ꑲ+ϲT|0E¿s`7L ɩohғ=Vr+8 I+Z2[ 0BǠb#:uQEkrܙv=Xքr]Ȏ\(-:'KPIiGpԁʳ=#F(\aJ xEyiBIȃAvށX862'Y2<6LI1,=0SARZLZJ"#a3E2'ThnPJ ˸63R%5ўIO(y-2j%Ig6eBrR$3CϏ>T,3>:S0K(LCThgs'4j,q?mU9mI%8c,oC!V!Bp6rB@Nd gaيă'7>ov>9'a. 4pM*2 p 10Tٷp%zfb|23 J'w~}B D!w_3I@}vl7+Q,oRkQD_ۅ픏k{`?f-=! +8;8+PڸCԦ[+P<~-O9ۂLcH.Qh%L6wф\#vB?n4ǖI4,TpGavI;WP,cxHe(0.nBR9$7ß([@!)ٴfɅB\qO: ]JT1hYPHF),P)uiO:E ZU Ig%˃5(]]s6+yٺE yξnek?>r 9gX$ JMQ$Hj&)TFi^یׂVkd>n96I6<>yO-qaG}@z -q~׍tJ|`.3VKElР%p6;ޑ[3<.OF 4gw IakI!z).G-vcRVǪv.{J/v]7u]7u;kRU-5MXUŵJH٪)Uި)(؏܅Qhi'}%V}infnӲ4F\4(y)o",5O&qs\ oMO;ch o-y*p/Ol%V }/>0c aZҧ&ޜ맓{qpJyI 4Yx;%b׀#RhsBx>C& F)r,``ԉo}gSfSm9vN-CS|^ֹ"V[!8y!lCWuy9TMbia3J]JV28hZc bxjZ4 4y{Jy /Pږ%RrF0Qp%4(DidFi1iPZvA(3u_Ee.j#̅saCd\|b jM5u:?t6H ]7B8'UOJƱyr,C^ySWM20bub2M "PmʻhLB08;0Npv~#dvnE2Pmʻ}׫ oM5>ǿ5dG䘬Vݯ:b:hMj@GomZHcMP9KG3 ]t"9 >eSyIثxID Q'v!t.G kYj%Ǐ/<]~p6qb'jOۑ)')e`'?£OҵֶڔUPۺ,QyQR nrK,Zk% MptѦ8`6V%A|͑%9νqRøp?2|wsSoݧP lCAH3Zmf>"B˃\ lbO*i!mahg% `!m)'dY^Ȳ:3JTn*(]zAu:.3`교nx.gfKu'o-7{Ɖ+QY]X緞3ʐz*jl=sŪ9Tq6@>rs9Y~ ˉfÿ_fC=8G'xLr^Z/pLTF|9: 5j倪w#~9:fGݾx/G}`8".zA~CO7 0|n(ش/zO|6X@]bEry*;cu{E 8iD/aB~9xs!{m.Ĕ&A pf%{{C#M z=Zs*RlH8I 4/mBܴL׶8ZU5n4`{!&揆:Zhgcby{c̊jXK??M_uL_uݞ6op]pF+@ͅUQB3r! 
s[TR _iWؿcpE,Jb?)cjʲZQLV_swo^=~ZM񓻱ۂTtaujv.4_7u3_7u{owaݯ* ݄RJ +nKdT]%p1^1..NTkr3p+.ͻs(4YJVX-r͘@[$U<Ǥe AsNg\lBNVn OW/Og0g5//r#@sM_r߿j{y 1R̆DSv8+:5y(8ۈ(UH`n|,ju"Ksk59 s$u#xt"׳*ْ S1f/% n5%ع^\%P&*o_=Z{̓WMpB}?oooOm.nxɸHP@g60H`"H57rmvnar9C1P݆W6D5/ J\Ud2(^ 5Jnz$'r%UܐSJhiXaUP)R½L\AdpSHTUAΓmΌ((pŘli86`?qH膎]f]nԹ42]- ŵA]kQ,LC 5u}[nh}ϟvV3 m3\0@{I^{7+ןnV2Wgnp*325ehL*)jzu'mwo 7W-ђ[$ʜ Ukk ,L(s(9ܔʼYz#_!Ja xƂ jݎzAv,#N[)IATOnWMy>aG6ZHwW@cڜn",A>V91A^d `w/twdg +dn_~?\Ct{OoX( þ=k3b (]sç'Ӈ鹭C _%#4/齴Nx^)c^guKpj뼥;o9}&*%0$ ՠp$ؾ h8Fh T,% 21qvJ|O,2 zK 3>;6p猯q*$/++,,δ\֡VU<+߽ {e4 hIպS݈Jes.Ђ}1RcQ꼔`X9*j)ƾn0MA # 2SB:q9OIgLL'g2ʵ͈\^#-S\N1秈]t1K[JV5KLG}gP>1Gȏ[%U;s~b<'RN|[6^v 6sPS:!9Ѯ=M;MdtrG7c4J|NcDatZ)[D+m#IEb)*ψL~ v0ȣf[djw{0}HJ*^b& )$̊/2"22vRN~;eAlݺNC` 7V#L%9$yGthWnw,A#%2c!1L.@"`$xw.blt.cSյ[9 t%Rv>=DcL0X D6!m&т@Ь˙D{odzV\}aS{ )h2hd"wEBDpstE~NaF :&4dPM6^ rSG4i9=+N#ph9)mm 5332h$fig0K:X E6P3*n(#ܸOy=%Sm52hN}W,{&8d`{jFS=^j#FXuʓ֒x`a:L}0;qC ;Ro~k5'1`m'3Xfb<z;ffj{O[{s -U1歹LjрAYuKKt~B5f??Ce~GԛzJrkGa2kC4 =?MfM0txW+DH?F@ŃEk(' 7q Xxj7g׭QAR.@`s6h:w?ĂG76<`> 7㺊 Yu3ߞVN-0oM4ZJ[jې |G]^^ş @6&д;b1 "ٿrPԂ(6?,9/ჷF;6x ,5IA@8*PV12ѠJ䌾Ƚ1&)1%9zYG`WEe:]. 
dm֕_բOh}jd|W!}}SN0b.Pz'E/g{W'$Ÿ7_}npU4&0G@xlΫٗٗql%W[0Vu.kCJʱDU DSJ:T\UYJb(XYj-B$ǽqa.W Wr7yW@+%#;2xq 8$/Y;qImg*x4Ҵߌjj 61G ebmXH+1Xikd G DFk-~ץP&J$!qfu_m!Z]uu~ӝ[?D=btgȶI_ } Í;BRRZQ~b,IÍ:O/GȗSc-dUmޅY,מ\\wWo<Un|0)ft02ݿG Zst7/='[OߐW>L\ewUge͞]8|yeYX*w%h7E'mAD0 Eހ- >n4KbVNn%2|/#_ivpDdm]s׉X#t̃K:ԇߺ}L_WCThoqu8~EJS(TiUU@!E|VWG&:sda .#IWJTΥ`243ˁQVrd@f;cnt!%(Y_E%>ӧxcJ2 wڰv[s:՚awSZA9P;%J,KWP?C .{bteOSH!Qd:DB o}fF%f.ԠUes_Hl-c̻B1 -HƔ/6?riJ)oؚ `sOeQYdPa,/!p˾m)/㔫LxUw?7QIx+9rpk:Izq;S9 eEo3.mD(I!eN2Ch()NNǽ)'\h xobWAT#[^lz{ Gb Yqd[-Ʒׁ4#4ő#rxH ;)IQkmoV91ry PzsRU奮:rc*eҺtZ͙N߾|:lz|(uEQ2e:EZOYE;D%UhJ 62łJ*4EJ !J&=+BI|r, JԸK߷uݹ*AFmJ떺agƃs:]TZĆ}չVKs sY2ֻT:\꤇OYbȆkqMnxzp0I LDG!zӟv2GTf8뼩ǠlP3gA̱>~|o,2m^K0^ WrCBe86WwnsG {L#8҇L^]YO* n#yJ0me#q&V6ASn:lxl0;ZV%ކ` +z>6j)̋bwrV@Y-: o.Y?>v*2^64D/-L$?MUu6!+OM#kAж50rz4)dDR/)l(B&+]H `ؖfXK&w5 +$!spʐ#{S=-2Ԯd*n.A燉aP#6/:E$dpH\T" h]0>hlz_N 0f{.JXQG>.š҆C3Q%dc/%LZ{}nXJ⻇*×d7*'dfș317N|߼iFA1꟏7a2.iqY\rWw=KWAjA qw!ʗ Gv"O.q]M-͊ՏMmTjmjn71{"wq][G/SvW'K5-NFK\%AkX, `,@IhD*Y0(,` + +UQ i ̢*`#zTVb:oϕoFy"jUQ "X!lI-B"[ $nʝv|W}9VR8w[k׊Jn6&<4:[,+o ^O cD-6IOr"K#(G;U5Ŭ)>z^J 0|+$[iJq@[O+mJV1TA[@mDwiCԉ `R^':D}FZ1[k4A'\|$@t .6sAFGYfVOT;Pn#% Ӗ:EQf@qӒqˣc!$J]sx"XS9uU=%9=sКzG#yLkLZ,Np8s19?&_IADUhuȮI;d#o2ZvmR?<4pVїrɹ[TG eR :q}7ҵZZ1P7>d;^Jh muϻ/a)2*+w"6V0$uRf'ၘ- =SPQS [PQPW`2ŶOH,/Hmk_6uybCVo*|n> PePe/z(+Yx;B׽]E ؾV]vsyo"'i,> J_BM+~baKP/뾲R22īIN,%a,24VOtg/ x0*+8Sgp2u2v'Lz@.q4V#&65:gTY<9 @.Ng?HҫŪŅqɃ @r)9n.%9͗x2uKݴ/CZ~T$ޞ Mgܴ A!^\TkUi~s}-B 6lQO鋴 ]*kK콕)F%K`* HW<2vF][!ͦm8WKnhyt͆D,d]2GON/gyN#VjU{݊H4BU i猃&j Z2ZǶJ@XR _Z8\>oӸh9)~^H7wmYXH&\@'T]A*?Q{;[5w?iF6&e mܾ h@A$OP+ٻ6ndWXzJ Uz[vbfV$'~HÛ!fMpFa))R0SHimf ^z]C SHbˎc3dp>5lzv#~Uv+!q~"[F8!<=<;8{UJQ/*QI c)h3p;W\ʁ6J eX2.Z|ۚ >\ U@|1Ȏmɣu֢p6jC#Q%P+DuĕKР_Z4^+wQJ} d6j]6u𗌵eӼQ־Vd^x t&e*zɑ=@H=['j1S-\>M%s0?`fe5Ȳ< Rd\uvq+cTM؋_.fx7Q{_}^\ W u4N?:z9MwRqX7qX7V-ÇB#a vJö\gƔ509i@GBib57|Mv<MQrb/w7x}ӤVزgo4ٛ~A {2{EY49*59U Ǘ6 .` XU$rV"LPFAXyn/VS3Z,Ysʒ'1#XW1ֹG+*|[UoUf3\.~\Dm تtHR!ɼ(FX) BpK ҖY9l4*@Ie uepJp+ EGhJ3Tyf (Ysf+?ߋpA˻ W(1AX/c2UXc(G(J@ŚJ# oJr H4'Dn[V( Mz7JO6Y#)>'6t/wdb0 
?=b㟑}0,)~Sȋ{ࢮOWYHd?%_U^O{֚Ǜ+FGOot''>< >O7?ܙIF#x>0]/|Ϲ }0냨]S_㖗0! pKbA k_B~USޥbP`JѰ*5E*`frH+!MPPҩ,C`j5 \'L)X <ʃv_EL{6 X3TDL05Ĩ,) QLRgjgf —e7 K`;K׻ 9R,;m zKm_.^5 z˅0]ym\׵%L^QB \(Q)&蝲|%Do㺔SZ[axlΥ$EIrLY{$H6L44yMk)'̹m<B"ÅM^=w6쫓Rfpx2ZU ZTJ0xMfMZ A9l) QqK a(f1BŸaZ Galp1kA7b oN]=G-4ҬԳ:ȘB+MҴfE~!z9~xxE,%Ub_~ux|DkXmW(^)ʇ/NAx=jIzDYrB ,8[`?WcO*9)ív :*0)U=>k'›xÄD{sY&q|g 'RʛIjC&7E&BI8SLM`=!ES4  Zеy$HzƿjlcZ:eX{J x2MP\% Y 7BA0)+`'x|w ʤXOx!˂sMxՔKq}Z FHN>H@h1w;լo({*MsRC4q55$[vԆ@4cI"0b, GE8T'3wYb:s60!$1`5V(z\;{0}!u?2,u M.\G(׮@)Q5[jzvxX<)ǨפVy36*k@THpceW}M[Kh&YT3# NX &ճl Oa1x7S̲Q伎"w%zkH}m\bMʷ= +)ˀvT0$=Y=^ Z"d_ 7|?jHz' yGr=jSG-ת툋+ {2oAROS?ѻ9Dsw{EFpT7۟|su_ bȽ}b+NpÓwA{+@:)Qa46%BSްcXM>\_%>ΑFEi K|A\ ?ɻ]'OL{|T3ޏSs38d%۶d /i;QKH.bR=x:<ݣ ])xo5syp9Pr}o7:0 Yc!tW@*:lLO(^I!ϞWS[iv8Ղ&oh`]svn?ȇ81%Oe݉,zy%1 1j3H""3Gڼ):ޘ][čI1xޒy ĝB3 "0La*6_Lta * )liq+L^uy7_j'qf1I䀒h^hT un>}_Q+_X^Qn aKA jK+vΞ}r_)þʯ\@H~ u"uY}R'RUHvYu^&Ts o0D>S/vuGJ|ˋ-?UIU"lyhP.5X]ɁEjg˖t0j_޲*V#lJoƈSׂK J6za#x%''V)׷I& ]߾4@h T PvŌ cJeB)ϰ`% h_RUcJDF[Lɓo(iP$V9Ӥ-eiYKRc/tۂcF$) ~C6( Ah9{I[MjqSۗ s`M^-LFf!)U%Izʻ)a4Bј•Kfy=bi}C:)5g⢬`6~kCܧ:KJHա]%&55&2:lpmZ~6r&pގuq@}9! *i1!ȕf'(ۙQ:u vwZpѮ mchW~V4) QH׉z jBRحKA+^mN,:ᓖjM:&wT?ݹNFCUpT`;{ӟӝv2({tkm p0EG\ކ Kњa2eM*E4mɞ](:*qHM9Jq2]ef$g;hP1iGד"^#v8+$ MЩTt>!޷§6$dv]W{SNUjz1by'k´𜫂G hP?){qMV*q)/WՈYИq%,3h),;,6~[b)8Kˎ1΍5\SWY*hFꎊ.j歩ei^¬twx mbN1Rmד^N,Tr H3L)|>" ʁkyJ$rJCxJDi ȆY%&bˬaٍZI&6)GAraN;,SR Oy(pf Dhr` Ux5ǧA 7<׷ͻ/Osӛ $<5_/zCf`$${ܾ[~~Uy`2{|~{ADU{=_z{ d|ٟBof߹3<,@i4#hwF&<w=d6"Wj/] v?l݆&VGQ)Kҡl+0&YhFSE}1(qdPrVՑO$;HcΪܟ",慻(!J.99i"<Ԝ9 (gJ;gb8h,{wm=nXyiO/.AAȔ+d}HU%7Ң$bt%t ϥ8 7ztm00 *u,1\DL@eL?$q)˜N JM"OyC]I)psHN)_"H(5~ю QL {rO[w<%< "_2ثavP9N']1g.n)`#/.>>T؉>AJݢt)kަ F8 4#%4UzN|Д_x):A ά߃@ТDJ #0Y+(/2g7vfqxȱX357!&3O^zPR[=K"/W6 6^9GqEqf7.Q *jTJ (D SQ2*U35x٢ /Z]]}> q0{Z(-KUI5΢(WO(LAa°3δ|>]]Dv1? . f(0'6{Pj".+0! 
[Binary data removed: gzip-compressed contents of `zuul-output/logs/kubelet.log.gz` — not recoverable as text.]
F"\c-זIbxwAP=9*L\KjD1rJx#^|puťU3E )zbЈr!bo$r\t!bnAF9:P-8-aqBOLk  ^ h2 1Q`dm,*)\ s"mȠAkJ:*w[N~HSr:d^PVOJ);eLM{҃ >ŗ)o51o]6-8_2Y߅Nj ځJae $Iraqt[\j&|GIҀ+fFO&B%`A8X8C:&*5u2f9G(NǚLw3OG=|f(u2rubɚy7oG?thv{à :`?ϣř58lFqI#V|~p@^-k$P+G50haiZ[>zHО뾎v ;zDO .ɀߐ?z@M8 *5h>z<5!|k&pJ8a$fOȔ+%~,gYϲU%p6[Rf\$qQt3)$i IC,i,8fw|Po-ƌh%z,(JutQZ͗+!*&zQ+| 9 \1>M aJD:h~K#L^H`Q8oKrDZ(QR7s1{uK %I^W3#jB'N(UtiXVD"ʡ%s,j&˵Y| =N)TPfɴi{QϙT,6RcAgz^9C\c-ouwUJ]JʶMK WV?QVwiQSz'Sv1#Ujch>w T*ۋjgJ{ltPXWCbň )M]Li$Gu^2v1䰣nR|"J( tA0XYD'DC@+Wa9||YݧXo5h/@>xc)@<+)tsAU"X#"xf Aj| ,~*g˯5J0@»FCp7w:BOR:H7#PY{0sŰU]҅>)yNd\X_o|a<؟ !s_ǛE1khv2WPx<K*b\bV_qyȞ%>C`̞?C|P|[G]+z ؓgey>=)x#ƈj'v>OH{b7+*lՑW#UyJwe%[* F֚t$eo$% `>(ӕrLj4_ 4_H5o'/S|[^4 N%?{o'0u)RtJG[hPM'3XBoH_gt@`50z9<< IL7!UQQcNk[_&H|1 3ڙ.A­/c*әP֯/kd1jK݋Xvi,6\CTusĸc=@l9jΕ -ߣ$JNHi„VBF'߷sZYW-I tIDFai'9Zǎc^ 4MPuw:pK_WnÃ7aGaZM48FRyW]W;{'#US* K* ;IפTVdb4jsJcXtnFeƹTܩ"#"3P:RXY\qeg_9%o$ct7g{Z{$SB=BtqrN+Qay;M g(EBRRZ9, b{ ,0`'}3IehfAwVk-њQR(T4e:ZJW<粈GIS h6LAo5uP; `pɻ؈+y 4x8aS0Y2ެTRIs?+frTj0s/>ugUqhZ}]]v(Av:&y 6bQNJ^u1Vs>L(zbajBDnBS  TWN`؟[T]z $-0=ˏñK~S{o :tTӞ(N݀_@ܒkg.P~C1bTKTQ Y.t ?k"W;s1P1Y VF)T%Fws99l0Bp 4@P+;Oi1a7i0Mc0Q?' 6IQ({$ҕr$pש=n ko k #l+YlaV +3>2`-NaxM'&0X~sˍNAMnu*u͹ b*+uo񆢫$v5JO>Z;kFlK5-C%[4 JMI2OcDiqakiq!o8̸F GL"utd8O?]nqB \>lvF7yfƿֿQݕWv*@bJ ڶ>,Ѓ(|:bgw:;OTYq;:lyǛvno7Hgkaeg NRd~noDbL=7M*z_\Zv~> J @% Ұ.|F^u1@^ 3C̓dPH:tHE>@/~A@5cL:N:9Dp'v8k{o~ߺX"f߿/p 'g~fлwȁ}7nwA3TuA󋨗_"VܔOFlM_oA2׻WIaIgB5ē•qz ]?1qˆd~n72}̂$M f"S(=ޚs$N*/}ʫh%* i)(=\#gAnz¹{rnwA#<(O[Lr;Ыx7k`\~|rN6QC\SN:޾xw%`XsQ_ғI8mx]k/1jS,g}v`PPDn:} Suzq21p_hXLo5}@2 Ԍb'ZW 䥼9W]+ ܕ|*JU-7śR 7z/|Gֆ{.gw{8P\J: ~~27 x| M oߐ/'Rg&sZ2fs4ܟeggV,QõTDa8w-;wWphlj}üö꣫׉+o_ŊZ!wHX/lA sM޺ u'rQ>z3 316V0RYRGMhτ4@Lz_Mr;(b5Ͷr< LHCZ,m%~헯$Off2C|& :9eP\bO3JxFp:Vgh[=#Rcpc4_J̤00p7RLňb"˔}|-F;`"XI0Ʀ(NGK-IOj~WP  JC,?Isצ39nlIN~n'<2GR SeykHqRvHR֍0 eMa#eO&ͰK&iD%O*x0JS^4PE+Gmx)\LB90c_=@H9 H280qZḑx⍟7~*Tb59Lq;A8QC9%O.ĕS;PT2fU\3 6)!{!)fZ+|~;+yɨy@j[Ɗ$㯲rT$=CZ8/D)`Li-)Knr[uт .@3Qo.JC*EX]  "-eގ`mۓ- -jo)K)f$C72#DB-UYZf4tmK&9*4tH2|έ 0YE$d? 
X7 >6(yCd0j'ET\bP_{'S*C5YK AQ[B2Ct%q>v}oE̓u^wG(bE;EFD᧧PmԠ_o;4|S'26@4P[Gӽa2^SgUo@E׈vT¸Tl۾)1jgI$g{X:DVW57z`fFMuiQ]/ wWwX/Wc`[kܝ7 WX7-"P\r,{ Joj& rN0Xv)m <=BF/m/v6$ ې(ЈvCF6Qd1AaL.3;m,TQ879Cr(D37jfތy['4"4d":.9^O8g_p\d:uzd_%Dq0K茶$XЖ CLb[j5Qp -CA'%Zc|j>cN\}Ƽ59#rKQ7 3vCm&ϘG(I t}rd#ZF}ƶcFؑѩdZTֳ! DF=9MQX J|vkkH-}u5W梥筥+;7S0tK--%2Je w72ʌ%eqeeVT nj#(TОjM\ Uy3ͼNhAA)- u*Ӆ.NPQ@YZ,B†QXר~Gfb"@?4( 7T/9k) Z iTSmxEKRKU6\2FOԔz኏rpeِRT{ġf$Ɇp[i@bweqG c{Xwf1b2EF$$po @4ЃH4YYyT}5c|ItTpRXc,ůM<i3jR6*jxưy)ZZ:l[fI8O/klqFKk'/(Bq@9^MV g7(RUb_gM$g7cq`,\ܚwV3 zƕ y|v $ bjt 4FL^@ F#(0&h g*?Sޛ(Uqf4 dp27A$Ut>VkoQxcq]Y >=5[s0+tSޛ -u|}V:`0@H$2ec̔4sx 8CQC9 Տ1A 9k5؟6q;=n7pO-d!yt3}BZ7LCj AUQ- #WnfIouFK|د?ƚF2IF@tNdQ*bz%-Z-3L̘6iQ:r\ʜ˵,Sʊjn+P3(.D% -:~:UVzZIٸJ2PQLQBBF);hru`IA q ZvKnְn%eIiJa̺hB"ӏv]9y]&@էT&MU$PǦήI?Ǎd?F٬h\"$.(570xC< &aW3.W1.mϯbKHD?_(Ъ"_z__~1奘7;_u Tn&K;`kdvD9'T:Lծc! {$ 4yʢ;0ޔo\Ӎ4%Fceu<#7Y+}=O^/>^/^{=cn[PI ),N'B:yJSW^_uARm]"ƲHa5[G,q,qH)x@4QVr%v>d-!Z$SdQ"#O1R"E'FY@ՁD6&zYRǠ\%>FGgA@J:;J?ǜU;G~Td6b VkQH'+I(`+޳FTTh=4S> $b"! (y "Xjq!1$V٥]ڇ8[2ߩ.R߂{ ո`>x^joS(,5h w.d0r4u-RE*Abm8`|*A".3=U"~}bBJƏ Z58Qʳ *he%_X _`\䞋&}ZUD͘R(2I}䁒crgk&=XV뽺D S0G&&ǽ1i=,̓(<#mj7 œv>ҩws!In׭ YpN$A TХAFbibȞ"[6h*hfLr`WIR4"(,RA0Te2<ʤf۪<$!OשՆ?p Q^]o7э]x to'OZC]1W' sJd.g\e%sokdVr*[.DPZ=9&Hd\z ᙎ9 i"L#!J=0 !z6r`*<ޙ_ J@KRn+ˍG+Q^BRA\`:ZQNÍ iBZDyYh633Ifh+M'Dz,ٻmPEo)_8JMR@Wr'B QQtMb֊[A]ps1 8ZQI۝ $)00 v\5Gȼl ^-U+3Xfp>s?ru-vYsqjo|-zskǍgpL+\ۭǫ]|8R0B݁ElSl>scdP7Oxlz5?e89\|S6ʪQ7;$ﱸKAޟӿn=<2ޥEKC4+WA:pzӺAQ!@TotnHROgk # &9cnCyP:Cź2\ɬJ6|*[b nt5#hA+K4iד!jv \`co!Oש ~F{hگ 仮@xŪn)hY0P#Z!D<;튳bY]!*('~ݷ w0MaH,\$R]"hmaO(S )G#; ȃ eIPPqmɐ";,}<"#3e$,`oSɉ`ץ(t"'>[ ;֬eFI,Y(!({rM\/Y׌r{ʛ_^.s zCF[Jj&,xtk[vW;ubl5ͿkR Is2ŀg1'2 %)pܢs[K$5Pu5}Q/).lo?VfzcŖhi ֘'xUbjbf7nWLUSZ_GKp|UY|lK玠L[TJk5]\7՗֥7qU/OTMZ?1]j]]z{r)n'c2N\#'u] ں2 y2}L(=2 =yuBf 1w hKnJpn(g}XU;*? $;YjS1:%ƧEʤϷے-ےmg7#e- \k&|BCN&Չ .gKg$f@c'(v)WW'+DFA~'W'7(JƮA-ާٳ8orB3D0+`vwʗWl^wݕFǵmlOwܫ{cuσ5!  
4FL^@ F r˜Ys 0E.*Ts{e3bP:D\OKG  $DGɌ2|1cfABS<w<(ʱF0&S>>9Bp`X<& TSI)t FZEE?@ކnP[GY,zʃRt@D2m}= *А\E}uZYO?oT#2;d :xwyӏ޴Л۷=Lmk!l+2ƂKԊ{>ǂMX:bt%s̯1K8ub"pyTY vz-[.Zn*ޓ`2y|k { ^(aRe9P!mE`6Q'N`jFؑ=5aOרS$^SH7j? -}um'OӦ]T=US0˰pQh~z?W뎮 }r2)&p y\pI7 ٨ \!G\3G,M)9gGatj%<'mTeb3RQyO-e!X.DO^*eA ) csmeLH)y<ʀ*GiĜ}9)q5D'kTʄBm[Fc2(TKFӲcrU.[i1xۇ>]Jxҧi)$oh Q}N5ZT:--DǡRTR-fRhma̪T#Z+L+D]%`,E{6( ׮hU}ԪiwI (M9pbxy)leC;C6EN])>"qWGjFaP9jÌ!&j=xvzŁ߆"&a9ߍp`LADXQ~Gb$NGTFkC=1 Yd%z2QԲGNp\ӃMV)k%ouOgu1!r|ńr6.%]NlIJrSKmU*䤍gV,/.wqcNTQ VRVō2JS{6, /fޫzu ЧY,vv$ȗ ,)e'dV5)IbAREUzD$cdQZh<(H+ ZU0rDD)0<-fc;-Mo H"y:D,c[OMy?fVr]p8$۳F2„1*Ef:Jk+)u 0+l2#6{6)0:h#Rڨ]! "7б)5InzgtupD* )#z\EI*V<;NYixI&]xpj ʨOQ^r1=k%_C}jt"pIS:E! oݫ<mr"$1.1gf/ŸJ}jS *ߙ{:,yRϲY ?<ӯBSΚ|> 3{>Eױ'MV]֕ !ne>wT}]{ݟAW^C=/ gN6/dᵠl7Ga2euŮV,L*\BkLIo6fu$1 XIIw=(w? 9;~6[WeuVvQK}̈| <@W4bYQgIuVٽ)-ؙFf.u}҈ .@`GT,LA>EuUL$En0f!,jSzQipɏ#]u"|6<{Q2j;\0I!QhbI%#riY"`2* Zms6A^U!?5˧'vk,?T0s#|s|pQJuʐC8~ҟ들QoK|{~>xw5hzɇHm4_!HR's{3vX2g 7$s?^\2bK CI;L>:#E5y0cM,2^k k$!(Г9h oK$NjNOn/H6F v]arl!QwHv (_ibEw7tF@'Wqt}^Xaۋy9\|n_޷f?gi8ooobSd5 )^tYo:w5W_0F!{U/WA0nwpE7Ǒ_?va߉3HA# 5|ԍjWթ`xkps;v)jaĶݹtWJcxm B~2187rgH<4>k7:\t]M ȹ>{Mg]οm:x<,ҧ^AN=X&o- 6 }C˗Ӽj lCiv6:os;8Ό7|7pg!= Q&]X\IBp S>{;H ;݆ҠDtv;xpmx:vvCBp$Sq>nQ A }Gv(Br݆j1$ 5*,Hab*cO,:u~rd 50W`HyK˓|lL"&I+g0`]_'+nIۚ׉kPBSX`Uȯe#qtfQRѝAQ2ޘjV QsV`/5Mӷ *I 彥Y mmrF{id`)˔7uM!.q?kVެ^d{Fֆ!< ]Lƒ±R zB2&F"Ha`:l"FQk{uo%wzP+PQ;L$DRJ. S0JRzA-$Ld-A3x cƚ N$):LNW}luʊ oN (v+cU7K΃M/ǧ@DFwU}|N2^#~uŲ pPB a> L22  R;Ħ`-bSvЕjJbԵΛ$ɒYf 1.bPÝ 8Ig 5o8j {A! E1z%m!}7 UBIh-CmHIɑI00Za zM[ KL4W&yg=A-TB%mVT^m˧~@-VZ_D1:ҴI*EJX#enAww? 
Yn^eZ_M tFr>ҴS|rW}sa>7ȇwP2UR`kwYu:GkvDg*.LQ2ܨW#FӞ緼oHB,,DJs㏳pX_Jγ%Y|t䋄X7DZ/&Jѕk9F/x5٠d$gIX#zkkHdDfWL(s}H$al Ȳ2aJY=>2/ӻKg;mG78 p}Мp Dl1E xE2_h+PKa'[Sm%]Raij}T vQ?E9!,bіw|<=_V UXA{~0^U7lv\F/z1q_~"U/uDA:/āO!p#]Ek]4Mft?{F?l+I܇z\ejg[R|&?גT~dK[/dJc ~  x&' ''F!c$Vwᗼ@-Np_c- Ju;&:BX D&|9:@wFm弖*V&"Ǥ {*J(G5i 7oArAt?C.$OPKBKZoSq+{/ 7iI Y`&-(!RjiQP.|ھ\e?t Ξ>ӶMߪzGG8JISY/0B>_Or~ ^ݾ;Ъ?'4Gw :Zk.zջ$2fӓK3c=3aQx8&T(͌--V\N$鸛6:zx4oɆ\ZOZ[QX7 Iy{Zd(s ~;-u7Ŗڱאڷg%3^-wZTy;]ӾaTss"Ώp풯Z2D3vz9hgė2v ŒTcz-J0_"jP-R-Q=qqwiǽ?*a*C~l(l݇EL!q$vڭ+JD;hX2wݺ'j>$O.]dnkvÆG֕%mOi Swݺ9(n}Hȟ\DdőLuϙeњ2wKFX +XsȍIEڵyVR8̈́!A9`,t[ HKp,WLۙhL$S ҙ jlRQ#"ɣDe4r탥\TUB@^hU"P1pC=zG&4 wa{(MqVNƱ*t"R9A-s TcЄ􁍆D5(.Pe$^+lDoUOOiѹ|$܅qҊ{ӬV?urm*;mߞx5iFC,!Ndr͛VIL̳dnOxy.q6ij\:0Y'"eAӸ+0]HT$VFWy_F]9T!ib",s#H;£LV S(} <b1i8 |{odUW\m\>QrQ=@OIpoNKz(u4:Tt 2ݼg4"pDDX.}΀*t$ɄMg 5OG)D0}KtCɝ]|j554\)eN pz_ f!Xy7pMcq64s}wS#9`Z6*=_ަ.>No>Jz}\͟{'K*9)Z?7$^kKml|%4A߾\juM3Y>Sx:2==?_v$'t3Z&Fc߰<ܦ+( jto$@wSXI#UHZuCˎ(j Gl{isZ'< IR׆另DfEx, 'q">^WM[&u|cZ$M up&%7>!@XMF.~&|lIw$~z`:~[պ%u+tQIעGh"Z:&٨aR VAx_]GmQ:D32p#6[a5E!IPf=g3ia< 17YI |~)秦|60*+3hq=!cGHPڧOVQKC.E0v}-6@"Z6m"R@PwQJv֒]\h/K%^\čࠡ0!2O4 HE'Pr0CJXfͺ.Y<(QZԕrL rYtSakDсZJ3,YA28+hXpiY)}l( {.1GIBzY;NK+&6 ӯ:MqSQ81bIY{B^<^/9HZiU3D{uBeXh9XiTRWibŔ3xє/]A (͍t>9N[ZQJ_€ 'vR^USp)-tr! 
P0<9 bX^d's'ezʈR`b$\j@5RuXQ°JMVx%-xKJgĭJ׈[=jW b8pON;dnBЁ/Wez+pMJ4Mlڤ.C/y#!3PA[ [bpQPve=Eb~s mhL'|u{q>靣^AZ":wf͵ۂv5 3Qb.8dIΥ:=qi<\ .Wo` O> :){*CA͖ZRVK›࿘rWcw5^Tq}C5j~q}%N8r}gN4Y'ߍ \ w.jT]\rHcҾD|izB9Ȭ0Lb`3$ D$PJm.j;ڱ[q6_wb/KC,$!‚3$% ^iJLuj%l }KH ۫#tCVZ]߾}-Qh]8k!S]'OLI2Fn҉b݋>Q:i֘k7.kr79qBgnS6o |A2µܣ{V/֭VVrZ%@ +LHUYm&n6ܒ׭sVAHG(7wSl$e!ZKJ7($Fit KdWZF-u> LA(.ADO?hԨJA@L C/ [|4{E|lD[(TOlaWڍRU:;i1D)uJV*$@<`C mB[1%yDCMSXV$a#QqHR͘]a02[ItEy!dLẀ*%iibmJLWwiBdRSXS# dBYbOb(#K(GD D,KWbw#b>T#mWEE⣀]5NKtQBcixNޗ[-o&Kð8b$Y}4g @1{G:ZO%AbծtӉ~χJz@=n#GBe3(*ȫ~jk=y1,n]&,RRĢX)~pwŬ2BxNnУ}Nyzɺ%ef~^Qq\y05 7I6@5Óu𗱈|z(W,^i3y7-*$?Iu -nQ\ *Q>*}yrvӧ?7??[urtq5C|gPrBG}7ю>vyϣ}^W+m^NN>L9A=pcUE?}e_|L~~sH&r~܏N;j9.FnGަVk%g)z1>2\6(}uv\'N˧W32ӎՌ̤S'/e"1rJ'rTO >of]+pq$N]' ?`u "be"Iؿ9Yה)6/"AV^߾L."7 :ԚWFsco27fL%0 /byKU乌S(D~.˿m* u˂(D9SC;0S8OxrOftӸSOSV8]i!jȷv+*(:rAS% F#T\%JMg,P͉nwrk/тݡVF 1-4+*0)my  WV PZSnjOPk Ɏ¨ Jee{VK[ýsހ qq &$X.Y@<]$Biϙ4-BC'' pl51[o;9N%E1Zjc,C   w. NT2!P*^IrP_ˏV$@%$P>$^(F{VZLVF!6Dfvbaŗ6f =k{c/dic]ݣC8Z2/|w7:\/œOC@n]\@Jg)!~B]w(d T&%N,4%%pvszS>jxjy7.`gS_+\7:-t%;J)!)L7}؊[/"z \/2Tm6s\ 1QMuj{d1j!}_@FXw.)ȔĨƢ"Hu߰}Xۧ-oЇ6Wpf4-A՞gysY=P^^/QW vNzbY왦hLmo5xP@%C4?MSC9l*Y#8} R{=̑5*4~;=y(Q4ϫY0 W$>WIpe5 SNbY3bCɂe1[E&?7N'@yP~2/$|,͒i!Tcgu 8tN'K)*rB|WԔ4k5V1uB9!yE\L\D{<B AƮ' 2& 5<cJt5Ox% $oR˳{ߍajKHbl 6SqMb cT#ٜJuL")r8'>7 ?&ڍo؍I`;؍M!j$!c7`8JhhEH6}B;ŵΘל 8 v!w^BMa*RjUe(>} I,Oi- Ik^ # BXOOA8%X22*xz3 P3Ѧt,iQpրԄO$ _ /jvŕL.m &Jh}ǒ1ЉѽCC}Vpy!n7[$LIz\Pf}G2[ˍ40L!3{Rv:C)aNg4X2g:jgu 1go;QIFa$MH؇(oZJAI)WRO)&%FPy7,Xey̹/vhj7NYmQݱ3(d `XV[pjɔ]{g˶~[">' mP9D k\&b$l`=zeLO_Sf&zrbT4~roo|IYgrP`?1C@Wo-!sǯ@& Xo_pAݻ6 ׆-ಐ >3:Lq C~v6W}:h'3\ țx#4~vHU7/V(bc+2 x%4 Jg*(i}(P4.K7;Sbԇɟ@O}(7[}W&ID*z20sA[i%0!(xlf Hq%0ϝ0kotHHG>%J#.7~JCX , rko* xf~-˹/o% m݃$1#("H^cl4ô30W/;eO*-ӎUeéK0zmQ#&[YTAܔJ=h-VDT`.1X"!DZ:`2U_J8eJ/qMm3*BEC" DEZW9 0KR&\HS{Erwɳ)4y J;)5j'KEsä1ARW GHp7(Q㉶@Env;44zR|nopuE_~R{j=L&QNd|M$nl/kĢ̣Ji T_xhvզp7-<$ 7Inp@t$.5PiE@i6*pS?1:מ_=6{Plv9t3:w(1(ߦh̠đ݉3{?f!kSy@rA@ZknPĘ!/bĠ\A&ұC/ۗznmx.lqWOoԚzeoys5?8s8v^|hm`u- 88cș=@XVR{Ai39z8!32{^jCV7`p Z5ǵ +{Ql,lP`Iw 
>߸M,gRp4ݼ'uτGj2΍|q+ǔ/{jefNKL34!t<> $1kC%Ż`lJ`~M&1P*3w[rpS2lgr]=GMQQ9MG5,DUz?,;bhe}?l9 1CQ_!ؓzV 1S.hs0&%')2tˁw1oHy᭥Hf(T!]P ]pc$1BDxQ~bibB oR﬿;<S`\rv}KTQ{YM@ W6( -) ,!Sp|)6yGnZT' W] Rt4VGSj}(0pkArȠZ̃:G!A3!(yngDbvP#ŎmkLIT)z !4L=c7쓸5՜QLNxO el=6V9OIN亲mrZ a$̮fEȿW![U1gw7m>z}ۺNI|vG ̄iF']#1=׫s- hJϮABYVr2[;1&\T+g2sx)"'PYm᧮)W-e􋽜  V:0mˇ]jkAW uz? xd6TGʠblLTys<c]{ֽWESq>IpF; .8ĠHЩzͷKE}a<{΃R Q :Ho*Qh ϛ%@C:q=6DZt&]+;" ^P S,Yjq+I˸v̉%I58V8][o:+F^/"i O3b/<,R=vnC۶fMJ8;:RXE_ ;呲3N a+[I9n帘sDl@87Sz4lX'(r88ph?8׻C4nGz p}%(T!-9_OKR$j$t{xmxX:Z~|_==Q=I 4ѐ;>h0%AGu2p3k=8 7VA[` M#PWP\{,Rl&EJyvHg-Q34 W( mۉ?$D0z?2͋lqnD fH ||a<+Ի<.G8A؍xs` 1:Qzz(sDiñEI?(TӃ.`p֔q\҄8h*0#"1T("˥^T}띣Er) Fd]?%5ϷĩiBAx jy މ*EfXy|1Og i@1/Q59|jK=T8S $E94 9>k . ށ<<큜EV0:S~JTT6xdReY.dN))8j@EZDž&V4p"QK"Ghl B>Ft/ 9"I?wjNYkV6F҂6(HjY1٦- uّbdrL Zxɤ؆Ě:ƭ HMۚ5b[ ivJ7*?#KLGhٌhI"PxV"ę(ykKiʵD6R" ÎL u`)V;98%R  Ka|ƙF5GɎ|t6qt*2*'EmDʔ@7:<׌2[|B@r\jf7UNXGĻ丹e¤[;q`WF`K؟wwoB_Jw^!EyMSD+O\/?}@f2u_   xCQ~ PrLE!tFt伸YzT** _oM3}@q3WGkiUA}[.w =j~mvP^whVQJ1xFpp\K~qMIHoO'r$1bfeTН `뎍@؈zPӧx.mI\xoP==Q&Ew2n=7 E* D>؛^$Yy1A %DጲaIBd6aHA& 3[M[-¾ 6Ln+ՐFhe@ ĐZHϣɠZԍT?,J$@?R:&BId/~[(NDMh[C(TAs"8\,:8##zvT?mm}at)U:phozwou|+~?uY5_>j-]Wa-5_~ >2W~7tOE$ >'[/BkxMz]geJa/-{\e Y/nY6ӏӻ+rec:Hn'{!JGn%sޭ&eS &nmNrec:Hn(PSMh3ouCrHVR$QZmʫ֗LMLĆS[q*8311faךend˼E?ltso)P; km*@*,@MhLyud߆3ۥzjSikF_x:1q`&_hqO/7rf m֔vm RTyN7 ews(ۣxiŔ5[rT('(CE +J)QBJ;;pNxy^\o29)6gh6RTOƃ>'q]nit]U[tyo*!`~C{1H)\jc"%ZIZ5Eg^cmMhm[z_:\^^M]]^V3z0[e/?V&["m[ͨP\]amoF{Ed%Rlsɩl $BLmKQ6m(cFH 6JSS]$REg"e4rhױB7,-r?u1`>&|?rN$a\ibcPd+s}o/6 "nVtC.`Etˬ7Q?w=t׻7؃WdϩWd9i Uzsk몱iMm"o^zy3j6u!"u!:]^ >Eީ0z6 ^](]fT0J ٠M0odb-*}<[6wG5H.2ҵ`OyY˪6z!VQmP!;+i\k `P)?f /m[DYeɘpZ;">(2+8 }h*VV1qwY>%i/c>}}9 L)9VUYGHQb|Y7G{N}F&pϊK(74zY;"`%` N0@ I]KVY 䢲8LjǼa/п,yfotLTȖ-9xIOWvY1y9E)Q޿ G,3+Mj2*'1F!~Tw>A EΤ"Uӓ}cB.:S8h}zzt'MTFR+ #?JOӨfJ)ii$=)A)Gݳ' g&$32wWJGɇA%0ꬌc"ߑ# 3Mx-Y%06M]ޝ֤obZ_j2ϟy~M,|ڍ#AS9:Ҝ\鰷GcpU&VxyeE)|% OpB}$ vbԉr$iNF,*rM:R.2g-pDe uG[hn G=T01܋4Zt=oB~ Ro* e$/Nf@^dnj"w@Eڅm!v$9pŨO B&?-FmZRqb\>cj$p}BTsVыP`7|ORDQ1Z!1x Cz?з %fSo1DA94  F!$NSD!TW'\ 
͑(D#O~0rު8R$X+O A&(or#c3E!AawHlrU&vp=#mE=8@^EVCو e133s*3s tO933ffNB oV}FcewQ\!iX]JXh8ve~Qotꮳ"Шbx;]5 W)oubkTIղjk\lƪN֕Ҕ?U8{Zŋ5ކ=]H" gqa+vیG ~fhz{Ʊد$ HLSCޓt]Q^ޓ4(Jz|)RzN]ƢaK=JX@rl9u䫼KZF94 L-NAa*AJM SOTSJh%nayKyiXg7P˨ww2Ȋʒ9Ӹa9 DIzߒfISSe|٬y 0(#p$GzN$GHjdǨsZaGW)@v@{|ڔPT/ bUۃ@U) Ϊtͪ|zpvQU)p/T+̪4}*A$)Wm4jeYRp''=JI8IQV8֠{^q|Z޴:*/]N =*.gp.~PL*U!N~۬$6(Rbo( #t&UψGd3yF˞kQTQ=Dű{=I"`pL*a huJmͥ}k6pDOYu>)ywv Wc}Ul7NDo6% W\_Xn;R13R3,XǧB9K0$ Bq:JšsȰ2xzpJkAȨeTudvs Z hαA X6ɞ|ʸ}7k|fЂεk'y`V^wmK4ciE 5ˆqZYb}\$N59¢@X̂Vhe~|>"9@| &z $ƇEQP75X8K)Sڀ *&uxoF5m$RsJ̴]2Ce2)M+ .l1N`L 3xz\_eZoϹ0MFq@e, Ok(Qtu0%F1bprŲP(xd:]fe(rwTP44/,m0JS}`wUJi4iam+'+E47ڂaZV K몡oX'cM5řNtoGq=۩*߮3`(ȻGsSN@t%<0OD4/s*$[1\j12cԹ0ǧM!` эVeQܣ ף{]>=8z(ң-cjtt[ɎTc©p/pK(EnEJa~x{^Cő{Us^jyqk[l_M$$*!z]<ЮD%]/wZO][2KKv&/[:cFv΀t<wC!S~42"GjM+qT%kkMz4%+"k-j3ªI|3aZFsucr (0mjGdђ = _D'𯏼: WӲo I$6<~良hi2f_RtZU5[gya6#_$Ham u:wy ~鞀Ps9^(V3AP4Ⴉ&MAuFrS?_?fTb8:XbzoԸ"_]6mV?;\L..)XynPTT xgJS%)t4o$pX 1jD6T D6QCQKJl-:OĹ*mEU{(Юn OhӀV4j&) Z.0 "mO1H|W@fG)BH[3̴,maq]K<j#lJR g-$>I/DF$QS)UU~ q1zuA%RT~u!|_>ϴplvukr@ʆ/: 㝭n}ZUB BVAUqpW~O~pkG=67~w>]y3>JV."aPR#q@IbRjPI彑&BWzkN&Wbbu}S=>-vr1 PlfqA??,~78:=XT\%n?K1]gSS9MT }>QהaF7 PU.p2iwxY7`q:TYjdF詘%EX$gh7*O,AzKL41oVt(B`8m@El j#PłQ䔡@H2$vX,eezwTZ#7

:1R4eFs(ucxiBK\m M̖X]]%d{%cJ&1p_x-y(IB+7?qԕ:qB:=Gn:1ox1n S4[~ hw+a!rͲ)Gqn{79Fw;.R[~Ow+a!r=ئlG`Nz~ptz"MK/vzpu0nGVG =őN_:rOu 0)U Y殙wqq?q7/?^ÆXNw) <\d^/:!Tҽe@Pr l4L8d7_f9s;vALׯC-E+x0Lgj36a#C :^N)ޒC+JvW&@I~Y/+zKPT%B>fCA$0c)"랼uv]owv]>@/$xjCRٚW6/%mV7 AMhҾe~|> ڠ1P/Ҡ׽B u!TCaʥ`}כfm6 WYuSlew'蕃MCǨMڅp Ue>*1W<>Bpnt;U#djpl;R#H߈z ᇃ~k,W#cfڋվo # 1!Q{8)N4M,5oi븹. 0hޫr$VqDH+IJGB| ZU˺q!j}j#\H繯F$F{:p%N'GRy STT xg*$4nߝLWMC)F7uPp :O݌|MM/uݡ?/sm9)nF? T)Q܊ܮw2O>]fsgX,Su,Pt]aʮxdhtt.,^152UAVJ}]Lݗy5e ]ؗǏ,a}WnNp1YsUR*l f)(mH $mqRi +7r^MĐ]U"tXyEJ60^ 8Qgr6 ˱Q?ş+E'$t8j'Y{)„RT^m1/pK~k*@-#u$}H$pY"2?Թ(^|FWFsyeN„sJ؛GJϧD*I %eI> amCXN^2ึ. ҡ뽈p%F/^HJ$-"_Q<Q,a"R9bJQBYT4%vOƷ$ E~˒L+_{-Y4W)YhڗMcxT:bG)Rsǜ6sqx`Y.7@B iԵ*6ܢijƊpg9 Բ[Pp~do?H(jҖ /eXR|\Q\$OP~>|Kh%Ga޾i'rB`?o./|] y])"v8 H_r9e_n2C8:r,y%?%-Y-f7,񴻋b)u3ZM.f7:Z`ފGT"Fo/(w;Qs4/=qG)]A'_.Yp XȞyi53V.45^,DcŰKcü.oNjlNlZʉفO-P[^J6mZ;6%}jh$Eė!sc0'wvq".RYz&p=e íd:W8'x\'B7/hvz88!0R-NGo=][ cZj=S iH/D%*M8Ɋyfp-- ldF.߽vzTƙ CRxC[>S9Q9rJao' zyt1Ƙƛ 0o/nf搴H+i5B+p\v$m0{`j=S"RCj=2JA(B{Od}H.^8g/sw=>,z/)"Y% NRg̊s)PzM- m2 z0m?;da\+-3I?o'oWef~R?)$VR6(1Ps|5V6[r5`i _Ƭd#)B].fCX>yyZ\SY<^zkֺqf<=wzxZ]_\]fzjPi)ӻՍ--l*m?Q?ԵOK[ZdQ%٢t%ۦZثzióknퟬźxV5Ae)I^ƑT *O,3{c<{e7~_. d.f G9Մ3KY9yi-SN3{ߦ~i*d&I>J^97WΎ{MAMEA5p̓`L;fQIتk;?%J::gcc_EeX+{C|#l p={ctB;7&Mh 6)'3lWqPɘ^7nW?ynh%~"#Jv=dKMzL ,VKC(H5˒2aUy)eC^_ޟbv t@19|H2^7?y sPxK0W*sf8I%8aQfN;W `6UC@ŲӞ^HJf h 2|3fQN{~V0rd6`h) \J;qR!pvmRjA981bd/ea/w.5OĪHӌP*YR eZѬYeHL3`o{RGeJAUf* R(mucI0RH!D P G $AkݕV  \{}-E07] !x}}V_kާ$ A=ӊ %jZTRiok $T!~)Vޗr {ǖs69sM\r5a+R*́dYQrskUV橖pfTdSڕڱB)}u"l{[$TI#ެvY߻wtʸMפA4'Dc4HΣ)<;]ʏź{,ʵ5`.Fm*m^blfk]'[/Y'K%=C=>;JbT*部eLK^AQM1!zBp[c9vَ_Mv8fTRAOHA#u}2-$xwGAk?;cuTLBOwwF폫CwoMcd-4h3ܾ;>йn2aIzl!N[iBKXFOfG*(Dt*(}d֖s`܎G6`5˕)1ό,.^ǿ-!L'0/^ ؐq5 > `Ɉ޲~)WuD2? `$!wU`h}@T&hŶ~V!ە9[``@u0x}[@R(障pNS=c4BsDʘbAx9/z)+n*eJ.0ơ)la,zɀNSG3) :xT Pb fl|V)ɱ>#1cO dČ5twN)<t ܴSk {epapM`kv\N!D"qf ɑJL#i7`5hHCYvazLYGr |/"`JFB 72DN`y?雭 V\,KbL%Hb.;eS{\i^^VΓexY嬖&syZ^Wc9[<V)Wf߬VW@՜ ˛2[jVܦwU0~տ7K?TOK˯_OdQ%٢5kw]=<;ɲu]_<+94,8,cHNʄ#UY4ϊ7,8{߼rtg s14'c! 
2cMb]yTSH ,t5"F+A&I)K%]6ꄹTH7G3D{L$w:t]}-i|UkӝWWn3E@A>OwFĉ2‘|TWVCڗ"\ҁbԾ1Α J+Cnb&pڌ^$J3N^GB4 OMQX09 nM()c.j|X!$܊'?1B pjSOJx覙Á{Rz# <=ܓr~ϩϖEhzp z4n bశ 2<\CLz4Eù}:KAQ PoEFCMVQC+~ep|28 %ƸbGWq/[%lFZܪbж 28Z}{kQ{8 `Dw0KOGz_=jv\OﮝP;A-ͱ كvʆ ->o%(3$n(m-ŕVk7ŒAT_,!"zG-D %:+'`zBPgTo.5'[Rz}6o]Kf|\Mv]ƥi)U[U;?=%ݭ$6a?qO?<}ާוZ`4hoe ӣ%xJ<kG@h#Ғ_^4l2wCi&ry;hALER bG 0Ϛ;}$6+l; 9}ndĿ?_f̺Kɫ KWϿjzo"ۣ]z_ZgdYz0jI\4eRUtJ$-"ل;k|oU͎:JMĜE-$X%S!u"R)^eU*U%4y ӜY6]‘"Gr{ +Fe]Ý`.>S8*n~} j-8g*d$mn' ۊMtm 3iKo1 z>{A{M@QCSɈ%9%Z{E> }x;RSa(o_5UUvR7heȬ^v,m^1媬 I̟';]?NOyCxH$ r<&+$%hB*%s,SY*LkN"Zʌv>>.jzbtCW3NWOLXߜnG]nߙqV9*T>z;/ŎPge.+u s!%%9xiUUJH+ΕYә̊T"Y ś/eF(L (C{w?{ A žow P]+?OGtI -ԍ=?lJ !FƓJJy87ZAiEnud^*0;y6m;q݁7MҺuԟɷ?Y#lo&_,8HMpe:q3닻463~=z!]Lm[ǧٟ}@h{0#0?Mk"ZE7kfa&?-9% cjLhye1dY)jI%*g$#13_"f׍ fjgw[O`twg#}.цmb":a۞vhq6˙r bAɱwد%!X|Ƃ;x4>^7، }iflÃӅ=P:jE̲/~.oOrl-/d<hzD3ܟNsbwq'8=1D W8n1csja!ʡnĶ('\ ǨMu}JPYt3Q~z29=Q'jDB89':P1m8B2 *Q ӆC_4$0%5TCy!ǍǎzArpj+x0 h!fgsiw&%ia:N}>9m8v8v]{Yǖ,&Z[]4J8`?^<>RlM 4XP@;R@BTHQfEA ID+ԩ "S.K:NO2RFxzCDf_7r&Kپ|7:2բ2h큔3#l}봏v7r\9/AMX 2f nvsv tseTN(l8aBsNZS$eJꉠI>Үء_'P$BHyZc߯폔ѢTFʔiS2RHSȦEO%0%Ҳ( LYir.$XPpfYLd.qiALQ!"D[FR6JD:jW n7TJDVuDTzDSkOJDGLrD)XXI:΄/%Df؈Y!Q".RﶗJ k><}6Б`R~$gn%|:rIˡ»VOuرJQY) 7z&JI}&5@􌭔} ^duLQ,H7)ɧ{BW3I)K|˿JWtOMF0$L&_}}s"4e{ \0Wd";Y|2 V ۽&\h/9$f̈́d\dK ]P "JCSI44)["!mB>[-v8Dh}8=拖6Ga㢪{#…䋍H$ּV0f0#(6[%97Pd"+MH`2+K1nD3S ~H/D5 BQ9%HJN8J{ɭA Zd $D-Cʅۉ7`δտ K-|cld f(ȁ"P}x8?#f ~TΪt$GN]wpNk\J֝7xnd!lH^3q5> t% z])3lͷǫ\pԴ;M=zAA^tňBQ*:Xڜ҄yrPR'飙kt3xt! 
.b9®*no a#<@`Nݑ}+WYom ,=# %_܉[*LMۓ[gg7$;ńk*rF HKgK~yM'77N7>oY|{ͼg0VI;j_u58z ?+e.C-zaLYi%YVzV=T;HIk.<1ŷt>i2}OuID WsΧִe,-eKShIIf19ddi0oܤ󸚩 ^km,}wF=X@uSgnnf %͊0,SwYJ3nLpk% 4/::RR /@a2YJq$\(H]nFNjOE Ŝ!:TT89^<.km-}Y0"2uJc5TlTRjA = ,,g FwD#NE>nJhv+-f4x׏~9[_ҥm*Ĕ}ԸR5Cm.8gUnw}}9=Զ)@]Onm4;w\T:^֩8g`eN3`t-Bzڻjtm#RU#DDN~VC>XI'lkPߎo|?"vhX/Q!=T{c/ ^gȴ- {0m r:Xk,n8+zo8it` NJHُ g,4M7Qzԭ4q83N)!2êSzܙ6=W ph6A+H Z&c*Q-DϜVV#jT/_5}v}mtR,au".P*ja_r(\窠(FFu&4LShga%sQ\2E?gJ"TtU3ZQDVCT2fny5IJFaQ(f‘Nt0zW3+<W )ZJ.Y5.ջ 4Dګ])۫v]jMj.~hzU#XY3V0~*z~~3:b4Er\ջtBܳ'#{5Jɲgm/dv|r?#5ksw?MH*;YO>+Gs5ȏhKjL -g @l4ڗ5vfI55wJU"ʟ[_U(iX[c|b\L-+\>l=&M=sZh%𤮭[^3胥KɒP '^d#L23]9a|v$ik.rV ٷ7>Y|ŏn^<| ^yyC2K}!u0Eo~VJѹdu/tԷRS tҳR&IWFPJI}.}/TyZ)ë:z΃Kw}4UGuE9 qk?u\wL7ѠZ]|љ4g? Ы Xkz3^/5,@GS:9RUn]9`a.9\/)+=_p2x]6zQ1w[Tx=[6#JeX3Ds[B]sf6*0.D}47V:G] +O;]>9OwjvS잘9_af}t碃's>uo XK"5D~󝣼K-jsxfcݧs؃|_ nF?c[|JG_ڼ )sd)œ뉕wݟ Y ! S<=@{_XwYy*p%VM7s֪+>j?UqgMI S/Č_Cx-[8I쒤2[3~3w <=_l+ڗ 6#e LGE2)ˌP!}_yKm- [[-Jg(15n E]X 5kpjB!Cd^z"0Fmw;َo\"|ZS*"2c a>U0-2LXY?b5ZL0h,ok9NRm-U)!`ƹb>bf>o6\YXבa\Xl$3"C8P LK?L=#(Uq1RѫDQ5 yJ\3"eJc[̏A<~%%dES0H)1jFU*yL /xJEfm7P$PUs2gȔsV/*CU:p&Bie%g!E ksye?gQ~ʪD9eWJ3_~e]:Qez>ϘV41?ё>>g?&%m?կt}UO&?<<܏wum_Q%KG7}U]U<3[H4$E /2jL׍F?}CZ&/>/6a&Vпv?y_3M[bL3zsq77]žϼ5 8u__s/%2:i6r3o1Cؠ%aF&,g=KU&һj>Xmh^xsNux#tJQEƃiF}F%~U*XQDiz$Rq, ZY\FeC`4Ei1No87AC C/YwM;K[(k˧W=ET㪧hé-d R?B&,9MٲiuTSVl-} FX<(KjԮrW8L\V;$dqa}/{E?OAMn(ZQ g<6(Y?< N72GZO&L-Z;КJcEX): )U*MJh7oK)J)$RPz,)eu4E|JB4te<$uTc"ոХ Q-L<TK-Nxo;aRySQH~/7f-'iKiR"(t՗Tk!=KIK4)R: )GZ(RzR '>,N G!x7TG9giK)r"/|QH>/7n*ηSREg% BIu6OՐ723n^;]TD=t Pq>`M.V($9_zEwl|hIhiWz,Qj$*pJTA(v|ҁŔu)譂'*9)ȥ7gtQ!2qr&^WC:^>,LF1/z^E{]?j@ϋC|yudJEBLL'DcgO@zfjIjo A;\;LǗwkOuWVKhdcf$5R!ƺ&"KM^uNG$,kmtY{իhh%%uͮ L@0('9sS DEAԭD&PK^D6`#?B9kӪ'B;B˧WBiqU!pjВA*8,Ӌ"+&,9BYP<[-gp% DH_z-ǩR`"Q|u E{Bl Zsv !9PԆ^ Z.-GiG/W !N&ZrQ{!~/-;ǎO1vFv{]kWw~TЯiSe2hE 9 afP)r|M+9w .c$ݗ@:^ t7zițV@T:#l&{DR3BN.+g|phu {)A%K[pޔΰסfRBYU*ޗO ᢛI[m&ARl7nI AZ+vT傓5hZ;_s]qEZT\&}mSepD6lDU0c@|aw ]F -eUeRQT ”Qgݳs ǭ8M)J*g@_^2*+ڪQ{UDZ]I "E$ƟbѩBQ\D "LA5kTwPBJ2Aq$6^ -A( ±! 
K_ZڙVfl& ύ?`QjVסS. +c4;b&uh, ](4gmY`ѥ$eE3mOϤeP`!Y»R)%Lş76Nň; ƎwI?iR⎺lo­z3č(vBv}q(|!o,!ѾmfJ9aKX.^[1Q'_~|/ 1\V^kNc5 ɽ|3)sٸjZ"]Jl= wۢy)ﮊwo6Ӗb?Π$Ӳ|LB"&ƙ0 2ٹ" k}Ʌ_v0N_v-eBص{oN5BmPeZ%?<Z`F@cYJ9YmGytص]Uz̧'V5i۪3]5OWw.sM/*i|r%ɹ<UfqԶg&*I8vJs&(cw;V?rC6< Lnr {Ljm0MrjnvDIZxTo Y|!e]U}*k<.vE*ܗe뀫V lľ/ݦ~x,]7k^^݆ŏWN4m5M>ss,0"g\Lo$ۥv^nV{˭ 1}*| BBy/Q&^m~_̿y8R8iܼt'$ M(a *leumh`G[yo<8!_BLܳȱU4gu(bd'7xRH{vL\kDxdLH$xg"Is3P[_[kHvXV5/ٺY9dAUe`Q/7 fJp,*sN)[Uj\P)B9F UE[G) XIE9珜Y^b|\5S}@ ]Z`R[;S7 ,C(O[0tz|G-1*6`-nW2ʨhVbFސ!%78$J!dAI{r$ȅOSm̤Za"8C`,F&3Yiq!rj WNK$cYMYrEDpL67D")|!O.n?'~<齺>q%i`k_kUCG)`Kff kå3Y8fMJT5f qQ6qM5^ԦDXYR0N'ļH'3i5Xnꐊ3nwvqKLt7lkȽQuQς!yЌ, (9Tf;:* PUL'.(!GsL!M7x"q5iGʡDjBӹkdɹ<s8j-݂ynLpPb>I~ k82 s4t|@yct53:1Hʙ wL3?[şٱyvx+0ɮqV) ԁ9.*= ,( ;Sm%I'hIvDKdbHL#Sm䱐da#늺mmsֽ%a־J#j,GeT^y[Ǎ_ *pl9 c9[/Ww'*,'rz/Ƚ<VF33 i>۲&!he%~Q96E*\l)r=bBS;·M8f̕a|BU?>0$rkWuw߆cG%lC}(% ݼn$w|x!1\{bGʍfܘItc6_\/qǘVU0f]{dr&?0J#.ϓ1T11Q6e8p꿸E<|`<-۱EkOOLήOO'_n=%vwrmp!Ss4cdk0YGE`R2{Le/8OraĘy$%Ǿ~)b8!L6%1Lw?@*|7ݚt(A,Qciǘ2gP23Bbg98j,E3 + +۸ژFI8x^WA-<Y ~xM/j7TXŒT$+^$%MvKH($[#ϒikBTo|X{ĵQW: 8ﺠcr[`%fْLtLJx`1*rszM D^p=6/ie~C#z}<Or=VQ3rHhEt\d,VF*m[Yfr;_Gzy[jF)V@p"~㔨VGC\cIfOӭ&LJxfX[ әf`|X2KPLK[ZyfΌGqTn4^tprDuHʍkM";F}6w7#dz׮{&'ınJp[%RpkPWn͹+PGsk;y<p6+ 8o9hvm3GSS5 `[y0u uݥv;"4:2 С7>|lfrz&BLN6\g&ﶖ`p>=4Fw@GH[/h 2m}UDKUGu)4V\_ hv Z_4h ,;4Jh崃 VÏnt{{1Ю04Ѐ4&jǚpݺ (ڞ@@Y&K hu'& } ]0B3#$ :?L*\m `y R;q+Papjo9H$r7AC Sh'B"< 1N8mc薶+ܞ=s\ۂ&(# Q ]r6#F9Uov&&$RdH9 ,r: ,\ kA W(erMY'KCDW1 !p`ĐF p SBkDd2]0j{CHى^OvA nв}y͍'ٵڳv}c>g\m4Lη^`7G~x2()ϯ =`r>n b1}sy"?F %/o/n~x3#=,tr47vO?xZAÌRIg"'U!+^+,e+%{r.s-,GlUdAgYMUf 3$)Lf\r6aDFO^ߏ=+d.y{ |䂯+dYi/ D 1o֦(O?vwX#@gOyՖ{ HY}OϻbȔ * |KdFc\_зӫ oyHVuBN_^ lX cM[ Wʼn3Xݍ7>4 >| Ibԯ m{wۤRHyԐ{$._n)OC)!Ò!) 
o ?xJ>tqŨ9ٷ1F=b *7Gx58aNd.}Ԏw\ ۺdkH h8^UU]lK}0p੫z{'{JlpK0O1HMn&kDuf'2uv ͦ&m:SO^V&Efk]I#Mh7>l<*j.[e/M(LERnٷϞ; -5`L ݼ^&xp+ۣGˬBF|@o$!Gԋ=-Tk& f`Z|nO>Qݿ JK.0o]"-rzx-*P.A%Z!%hM)vvq[S,﨣ݎL+mk"['_T%wK&*cj2Z: QK1 CwڭqX`ɗƚ‘k SDqA$w %'vQӱHp{n`7?{㳔ktj cJ#Hmz^U\w6ӿN`LZyގF~:̦7aw8f_Wzv9嗫W_Dջ_\&v1ȑSS(!N:N͙r9xx7*rrhWKlW/CiX(;i,7#DŃ\yx$W";e7-d[1\;F !!bsg7!# IIsV EYR@QEkFʑRv٣^N :=F{T]z2&"ګ=?]S1'L'_VB7Yo8t>G}ygׂ'#xq`5xtU۝#TUuBۈv¸[k-3%k;jj 7u* ѝҚ$$ɹfag j VdS`t?ϩF1wMcQQGApىƩus+dziiz ֌SzG9ZILYs䭨/d@KK6ԗtHK{W_c8Czc//sOC'eBr!i!電@(%LWyw#wrtpeeh.JgVma~C2¼ PYZWe`T; Pth@23j׃I`:jE()JGvCt! j3w ӭz*)8\DS@uZY)J-X|ܲNё]Pt(Y)JtVt>cPfK'F@i*dY.Йk80s'I.s;ER/uJ%ShjUnet*`͍uu{5w1_|yg͎U䅑6/9/Fqix浃yKXk6 .'ѕdӻ>;F5k2#υo-U`M'x``t'W(%|n)j?>9糛, %ggO3Ϸ]뢕Vf3s7nD2W?PmO22MU=Z8njo}~٦B]МlvWA ֢7̢gof-Pu?Ɵi8G)U&ZM 1,ˤmZF#V)fe.=Nݓ(||LB[qpQl3{f!jOS:t&/󞹩zH#_Iy5y)Ihnkcgj WY\epU~j$O`=V0:b);)t)k_wůS@1dgu0MLg)T녅D^YxZb" S 6Ja*,?.<_5Y : NkX/eZf)^>(o' F1 mk)@eRuZb'Z˺S|m;秝H>H>1P!L1Bi<7E*wMZh815QcCT ֪;21{Ɖs ڤ$h3$K9H>+O4PwbsM"!&HB**fy SIJ#h+x(,ݧ~~ZmPRBwfSzzݗ+Sp6?~.٘iR]}+C`Qcrq;|0T?3FdXtbwAJO_\Is#ƚ 9Mbdq̘POz>9F.F?¼vOjx6h|9}Kw]KRCRG8dLǻ@'4i)qaљܢ燬 {~V|r[6Bvxv1p}-h@-:V뿠:ZRYe%2}T9~_L/+Y|l#U~ XTSN2mQ@H?FN DxFZueɘ]09!F 3sʑ Lf@*fSrMɷvY#E`~1?&yy6]sc%g, çdl-s '7ҜV& Ws}2㺫l#(T$aR2L;:?-anu|a%-R|$nQ^ /ӫo1rgGuO ArRSK{z`NAB{/8MU@H@g,<%Rk3D <kbTFtpwiMTAVo Euc6Z(d߰摇ṱ `dR2W[TsJ3')}R#T)7rUzjϮr=Ey12W[^j5sLhaTש퐍;8Pqnl> Ŷ[2`8pbY9 J2q2SEqTv*<aH"*T69UBck_P @2^ 3k FqkLj'\.cwZf@_bumqӔ>Y=gb6ŧjEAeȜ v:z ,evVG*w_[\Y$x1 ?&O~ *d|pBZǥt)0MSK%g+  !,`…J+aXH;_ t5+޵p.{Sk˶#qrĀbXT[WPn:DcJ;YrIҌ3@e2¼RЇy` s\)׸kV" M0C*BATw'cϞrڶjN(mSP5VgNn)3-Q*faAQyRƉkRP2HN8~ckDM VwK%E2@dBC5>5rWJ]3H5D2P7r 1DJ L<(6̐SJ̛Nq5p= _~|4-G?Ysy ;y2'^UV"- +Z1 b53Jdʋ:^Ǖiv$KR뮋WVU**v0$d4-5av q @N #lfNEg0x|9@hATzwԅPvzzbm*nZuzx__0kjrKEvюV>;wJ0(F{8Vl N/3(0Bo{}|lv )io,Ľ 43cp6:pփ@r2Ј4ڶk(IwۉQ5|?js ZT7N+8j,ukh-q8PgL5Pujj+6-[o6'H'ja΅z.J-Rtnv?!YɗoFMxj|^@`)ŒZ֮?u)<-9ݭך; G% h#l u~f*[S bDu. 
7"Y]/?Oݑj ۊΓw,:ѭ&$T.y5;Tɽ A?Vɴ 㩻{]Aahy Sc ާeuTc(.QvP?; a45!r`w$3f+d1@F6'ߖ=5gOOO)}ňu΀l?u(1kPڛ6oc0J[ZZkJD3 ջ 1mU>GLzx9Ai't,1s~|pGos(8iBఽz'iq%W:.| DCQ\T&à NF6 llxCOE)m8>l^cr꬜ Kk[ͧ]^5K ?>%w} ~ޫH6^D{{ӗVH|#RXeѺ38?Z9GL,@ SߐF5.Dz52mPRAaW:KSQ3B4vg\{i~bmxBizZWő#$_Jr['o%Az] v oKpXf%y $8?'Sﭣa⋞ =xON 0:?(i);?e "4HXʔP,]{dP+Y4#(@a.nJIR}WʚVɥboˌ0}3^[)Xտ$P A^ՐYgSM'&7I-ycbֺV{Z8t?U###[A5F*GKJK JF1|ߡ@g q{ߖyVbwW1 <`ۂa|ktIB{t~y}+P$EYfLq^~@ܛwRYJ0Ϝcx# 0f$g5 fAW3O!k%+kYFF]Tu"V^;E q"!(X[-vr:[nEwAIdUI2-fT 18[h|G{abWsZ-K Hg=9U(*̫özw52PU20,%%: ?$+<AEijRy gIiVjlDX"=Q^0֭c44S 5"F6ijZRDYvV9!oFҖz[8xbu,B[{IRXf)n}KJ[ RlKVJ/JӪIQX!o<^"MR +v*إRv̖wSn)WEoim :D]bHhiEz$5! FZyLO1[\Yl 7NH6OS7ƨW!qf@(-fh()+ v\\qoN*I\$ r&pIN$GoE*d"PØڱ{/G88G%zR˜'eh2qZ1\3=R T0*o\ppW"Qͅ^al+\}ɣt^W{BAW*G3H麱ݫv-gUrogj%j@vU?&,K{-6йw]:hzRȵx|~|-&(i}f~Ê[<.hJm)H?^j -~/& ]ьaKRg MV^;&`̀5 >^A sݯB{I8&#, 7MysJy1I&k9$`}Ӗwu lIzkQTer2e ٫ʬ8wQCk /;ξ3ѫfnGx5?sh_`Q2\U7sst^FzS9aXA>'x3]]l|Tׯv;tZn3yqUhvQZw,ۿԯ/)T|ff#8vP<֞i,Xxl5OLgmqPDGj!]ܺcjn^xpj2 ڲJ(Z*VXqO[C˔wnlCRz !Ç]04ZcE}RA5Ow"pL{q`Q髨ЂUglr^h&O 2A )CXbS[ ( IV7 kܢl5CJVYʥ .Zb r < rAImA%Tge ۹W䑌5:RYkPSjtt<s(@@roy#>55E[gQzW.((5J}]Uܚ_m7w?/yW-1Ebi{[(2SH4%荷cL F8"ç^3'{v" 3*)97O_-nE k߆%LX2zCqx.(? &"|Lp]ny2J鞊[l)> sjeU*"ԅ%gVc4tigEq}~vD;2@fӒ0[!%oop`Kw z|Ig55J$HrA˅!Fyh"BZִbrA[hKJʹBR]P\hZ`uޙ =$zjz t$"xeF5[ O(,z`T%_f9tL~Gv^Zkl#Sn g }@Z!pt3RV>KC2 *9{|gZpwZ3aS2Ȳ?`5mvg-ceSH1S-[,2LFHm,LquO,F P4Dɹ )%֊ 8 4Z;j(g <#"jqJao#)J=BЂ2jPX^sAR[<@Q/Cӷ_o{B`R%|+V-+O!tGJe~}83._ LSK x3o$:+Zy_7RJ g.9mZ GmcY (cR֮`=88L鹳ld%qjUܳ'SLғȠ={ѳҒ Z iy(oZRX<(-}Ƹ?4ݜ]mN7.~^Daf^ >,+}~rW^!냛}CAb9[?'}:՟=S ::)gř:eno"2-v s+ mTa~eY! 
m78*+X#fqm]ǟM ҨߜW,8lTѤr|&{`T7'ȓ:vc(/Wk;m {Mbd%C>M[7AL@&N@ )cO}vkr>R%ުɕv`VD.WVLFHxe4X!)} :3C4ȑ쿌!IS>^+OUzLzq`lw42g)<r uEنIJaI(?x픱 mNJu.?=2 {OoʦEl&=܃&8j(pq=1r-YȈbl 0_r@SBD$ڋxTh0 ,VSMMSYRQq@Mul!T DQgT{DVςe^.e= W/,|7m_q_<_e+ Ք$ѐZ>-zsΨDQW&_&V V;{o\D˔J e-χR2oot=QkT_}6".|LV|j3ƌk|Olrp Q6epSwxod>}`]xSyrvWw5N[9EцњT11b NJJ - -.\ɫ R ePʫw"#ֹ^^~2ĒcDKdJ9|rxmxr0N.$(1XWsr4TUS72jkZ'8 QLbVTRBao\]j9^&}X^' vusBZݜQr(2rg7N`r7͟ڝ}at}V}*1ﳛ8 _ udT9ROzZ ǫڐ]`tH; {15r!(Ì* _Y1J<+cjkNKZ^E.A9d kU*Z1^[4t<(ZJbʻCWT*3Ğ$l:"@hJ.}o1& L='?V+:xJLe{0&&ͭ$=|(r`ZPGL#?OdFQ@w2lZG!p99\../fxnt;ce\ 袷̬j7uimLb'iђOKw7So-e)]ɬ@I赘Q k{ZͭsdHG>I'Cک&N%줇:щҧpAx#E>td\OF_#-8ʄ0c1;;ύ]"}=t+@$'GÂ2+>f3בJѣ9U&=%80ƝSSc *5h$FKLPjzI>-1EOx|䕟tu= { e`. 3Oޗ]v1sl9MVG3!ja6c 5!/QpZХQ^6ă0 BC=P @Ҋ@ȠZHCRVX>_-fSl<=3d㍏ն~w7,ң ׌nO+E1S>3AAX,iG~ɬX,Wm] 3B߹,l/RsA> G[IАOo P\Vp2\%t㞗JP4ClcU*K1xmB7d 1WTOڱX:@z\bTև#7/O'W2O~[njP$n ci(rr"y Eo\DdJ]IQ%a֋&ov풎GJ:b̉Kʗo\DdJ;ȏnNm6=0v˞hvCBq͒[v=TK C(uע)-h^ R%HV)W w$9'(UJ-3دQ|h kvuWE<]-7~}+)*QEM1ry[w|t _ <|8 Ĺ./фo,Ƿ .dCO1dCOГ {\+*$jZFgWFv81I U_"w;1;1oN "^GbE,ӈꡃkOr6nM[G4zm_,7fAuө~-^28fӍ3#O)46bno,P} t^YҲIKcG#Ӱdr1&{bFtmhGfh6q_*#Zӂ3 8^횜R=ciOst(o6_ރh? GFTsvFQu抎y6N-qgzv__^ۘQxr#" |sO^(@e3.1036dH]?Ti T Tz| `{*_6@Jj80ƝVL 2jE=/\Tm ZH?E{zq}_:ig2TL9(*KѺ yUbi`V)o%~aUR^}r 5 P=+AK?ɨʙ'b!Ðg9DM1m&>cә~B4ѲNxXq8OtE[6UN|hOl /Lth''`4J>`ދU;r0ȿe/d(h_]ómB^t7l#F(ӳLQ'!_ >9WjQ>QI|~-J6d%fr3X)1 ޕ;$%RKW8Rr2A[)d;Ղph7@|(˂)C (Zdykg@KAc\ E*_QKfo7(#gnBh=K2[4< gC.pr)'drnj(d es1H;FG]SCF+;oN!!߸v)Bυ5>?w' edE6Fn`~e+{u۱Qɬ1f{y{yY +@b`3( 9Vau4x/K%dޤ (F|ASact>X,AĎ>٬ǰ/|XK)Le3.uZ6"{r(K; ku+5YU59\zTqK,K`]|K:@sɔ>>)ģ^ܷVfvXyp/Zrrqvya6C9'ٞmz|Z&>&=pOM> }L^k!z|ZY§&>&oCYRF0}y -y -7CKjKx`&P-]U0Vr'( g,nW9Q>~@y>~%f d-R` wem$Iz1fh@/k{>f^ֵ$壁Y)utWmb)+ȈP#DjB# L,;%!ĭlnhRbΚ,~l/9.0rd2RT,aQ/s)z0Jځׯ)d]q[djag}͉{<(hFDq")E;M5a]WjM{|b`P5X&CVk[Y@oyCRbvE%P^(9-(#(SI׎}QIWRN%;,TC'ڭ-In;Ѽ[ȸcC/ qj0))&*gij{ęӄh`#J0Ƒ,Z%!jj^0Yq$1 cJh:SHGxRO4@4qrl5MElR x9&#`C `ADnA `Ebi#; U)Q^5KcGӒ&TW4UV@hhW_^pMp_}N1i>>:C#[71rvgAk!i_qԦhyT$B Q480|<& Od# " JmJMd`0WZdPAi̹P?E&!VVMala ˍ"%ߺoGEJ#JlُzVx? 
gDxsR8O+L!ނ>u|E7 "@^Wӓ8Lg ?>\_o/OS\O]k3Ϣ;gp{ '7Vb&>9\dj"|>aPZ1 "/J{>.R".R69lXd ( zd7x F68tצ/j`F>+}[T t*y|Q%K޴ꩢIk'7 @hiOpBiUswHӂW?5ݑktJ @ բ5/Q.giUELw=HTߥqL ^|n< `hgUK׃֢lT[;P}1+w9 |(1G2 h̵g"ЉRﯜFBa7h)?-`O>\t.POuAYG_j<]HJk}WY&`0vkP+kJd{vqfRnxdEbEbEbEPDyE(-8ARkWi"HSыנ*՚I_& sHQʠYSP͂gv24-ǟ7]73-|ZMo'fzcy@_w14/GeOo\n?):r%^Xyj~r7 U ABP$[>|iYn)9/]7͔R:}w!wCfL *eGqQQL#Ou@=" E$ U(F"B(&eHMDQĄ!-62z`Y{7}QM_/?ʇDg3 ;hͻLw`ы9 jFґԒ2lxB #VJ+1^a-S cj +8*`E4&B#&& [:b4_눹3>SSWJJ=1L[enOyx 0D )4(#c y=U&4g~FC#nc3'#`RY& TǨew=Gӌ>pmB6SNEh,Mu^{m זPqP&]GU\jzn5ADt|'< 7 ~0)@RU|vk=j JgY> !`gjō uh%gsݬpf&&j[ M?.?iFbN.4F~S^;|m {[h e,,FPb;^[h׭/Uv:h'Iov '9 pwSCjҢ= ̱q鐒{g%G'=r j N+$Ul(k6?đ0/MR])48#8*,܁BGS:j*` J.Paw2\GnUɓ!lD) (W1r2Ad#g/xԂL͵`l0*bI_c1 no$KoCHF:B!<5ɻ/m@T>6@0-cQMK3,Ĉ,ALgQ\MH3ۄY7bV55zDo,c|8Lġf=,3qR;yBΊj8}Ǣ̊hi!0(7 , 9=Ǐ\GU& +e0 \Z"az%\~^`)a V#C8st.ހ;pO!5-b;zo@8;Ti)Gk2Ƴ1EP;KL/@V2 K(x ڞGSޥv,1U8*}wu?vZJ$"U}j v257ת}0FAx5a͇W 1=Wd._AY}+a׹!K' mTLHO OP vCnG8eJ+KṼB. Rf++0g#9c' Pэ* &aw,|?`H.Gs#4#ya qd=EIӒ % y!FX(^ 2zM08DANe<#,#i~bx%GQU0E,Tj*ֹF-x Fn"[02FE`-IbSo qZ&R"A|JYQh̻,d-echf`pS%J p[D1p]EuYT[x5@Ä]!Oи9qb$;Eb!"PtQ{*nq菳MЬ ǔ!(q,*/Y)RDQ#T 2qLI5+JRoVB`y O\e I`  "P 31oxj Y]'u)w&b>ְOt?^sx;;tE&D1:+D"̉}c~dh3#R!0 󝫼 L^^94(8 MGa ~hqugSt%F_@t@&^Bxq&5FΪ"%lـ¼:f{A JXMe3@J91H·@=[JB7R:ok6SbT\- }[.ߞ.oL%[:=_7֭L\^؊*<|3w/wq]@pnjQpVt:e8Keh,n}\Ew3&fzOՕkh=vScβn9Xtȱn#qOr<=.ѺC.کOغZ֭+)];rۄ0C[%֭wѽ>@`!X.2T u,5HK P 0V` +^ʅ |Khd7W׿¸ \XT^_ o]$zծ8.KD-⌣ߺRm}n~9^mz[ywIv)Ԓ.KEs~)ރԷ &E8_xF Ɠ(Xr33.j]4'펿 Sƽsa3:3P  cɃ#phHI3RSgEXp @Ec&Gr@LQ;7& !s~>(8(>(>lHf̔Lx"J|A">ZVE%PLڒ!@D(b!–$<^rqS]42(Twy897 {Wݹuqm>5DŐxeeY֢L}3Mdꂓ6G]k#夺|uhpAv􇻾F6ڇ6 _jKӺ%j$jbvz!J |L=!(ƂRs( c^:۶DԖbE4Vuљrsfb%%׹s%' L-qh"Ƀ@^6hv<$$y PD% T:WiAwKOXrIƤ4*$L;N8TڒK̨+a\kì Ue Dh+8C{"bQ$[[`) dw1Ÿ_~oׅY-eh )m(.%ҾߗVA㚛ۺ0G:ڝ)1|3E| i{[]1Rړo p@%WdUiHWڵSߌ{@0sZߟI}0k߄G*T9Z%GYvPJdMR)4 K]3a:<l=Y T [ BppJQϑP^SF6c™7a mX7Oŷ;.cPM[^_f&mq9&+]k-PۧNLE-|hG"_ս_ LrE! 
Feb 26 15:41:11 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 26 15:41:11 crc restorecon[4754]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]:
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 
15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:41:11 crc restorecon[4754]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:11 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 
crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc 
restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc 
restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc 
restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc 
restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc 
restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:41:12 crc restorecon[4754]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:12 crc restorecon[4754]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:41:12 crc restorecon[4754]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Feb 26 15:41:13 crc kubenswrapper[5033]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 26 15:41:13 crc kubenswrapper[5033]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Feb 26 15:41:13 crc kubenswrapper[5033]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 26 15:41:13 crc kubenswrapper[5033]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 26 15:41:13 crc kubenswrapper[5033]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 26 15:41:13 crc kubenswrapper[5033]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.367364 5033 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380419 5033 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380450 5033 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380454 5033 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380458 5033 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380462 5033 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380466 5033 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380471 5033 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380476 5033 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380482 5033 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380488 5033 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380493 5033 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380499 5033 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380504 5033 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380507 5033 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380511 5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380515 5033 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380518 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380524 5033 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380530 5033 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380535 5033 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380540 5033 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380544 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380548 5033 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380553 5033 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380557 5033 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380561 5033 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380565 5033 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380569 5033 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380572 5033 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380576 5033 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380579 5033 feature_gate.go:330] unrecognized feature gate: Example
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380583 5033 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380586 5033 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380590 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380594 5033 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380598 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380601 5033 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380606 5033 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380611 5033 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380616 5033 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380620 5033 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380623 5033 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380627 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380631 5033 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380635 5033 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380639 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380642 5033 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380646 5033 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380649 5033 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380653 5033 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380657 5033 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380662 5033 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380665 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380669 5033 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380673 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380678 5033 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380681 5033 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380685 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380689 5033 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380693 5033 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380697 5033 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380701 5033 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380705 5033 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380709 5033 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380713 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380717 5033 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380721 5033 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380727 5033 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380731 5033 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380735 5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.380740 5033 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382765 5033 flags.go:64] FLAG: --address="0.0.0.0"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382799 5033 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382814 5033 flags.go:64] FLAG: --anonymous-auth="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382828 5033 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382840 5033 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382850 5033 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382862 5033 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382874 5033 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382886 5033 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382896 5033 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382906 5033 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382916 5033 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382926 5033 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382936 5033 flags.go:64] FLAG: --cgroup-root=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382945 5033 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382955 5033 flags.go:64] FLAG: --client-ca-file=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382964 5033 flags.go:64] FLAG: --cloud-config=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382973 5033 flags.go:64] FLAG: --cloud-provider=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382982 5033 flags.go:64] FLAG: --cluster-dns="[]"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.382994 5033 flags.go:64] FLAG: --cluster-domain=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383003 5033 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383015 5033 flags.go:64] FLAG: --config-dir=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383025 5033 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383037 5033 flags.go:64] FLAG: --container-log-max-files="5"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383050 5033 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383060 5033 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383069 5033 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383079 5033 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383088 5033 flags.go:64] FLAG: --contention-profiling="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383098 5033 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383107 5033 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383116 5033 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383125 5033 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383136 5033 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383146 5033 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383155 5033 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383165 5033 flags.go:64] FLAG: --enable-load-reader="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383174 5033 flags.go:64] FLAG: --enable-server="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383183 5033 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383194 5033 flags.go:64] FLAG: --event-burst="100"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383204 5033 flags.go:64] FLAG: --event-qps="50"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383213 5033 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383223 5033 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383232 5033 flags.go:64] FLAG: --eviction-hard=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383243 5033 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383252 5033 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383291 5033 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383301 5033 flags.go:64] FLAG: --eviction-soft=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383310 5033 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383320 5033 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383329 5033 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383339 5033 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383348 5033 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383357 5033 flags.go:64] FLAG: --fail-swap-on="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383366 5033 flags.go:64] FLAG: --feature-gates=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383378 5033 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383388 5033 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383398 5033 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383408 5033 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383418 5033 flags.go:64] FLAG: --healthz-port="10248"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383429 5033 flags.go:64] FLAG: --help="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383439 5033 flags.go:64] FLAG: --hostname-override=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383448 5033 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383488 5033 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383499 5033 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383509 5033 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383518 5033 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383527 5033 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383536 5033 flags.go:64] FLAG: --image-service-endpoint=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383545 5033 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383554 5033 flags.go:64] FLAG: --kube-api-burst="100"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383564 5033 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383573 5033 flags.go:64] FLAG: --kube-api-qps="50"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383583 5033 flags.go:64] FLAG: --kube-reserved=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383594 5033 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383603 5033 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383613 5033 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383622 5033 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383631 5033 flags.go:64] FLAG: --lock-file=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383640 5033 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383649 5033 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383658 5033 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383672 5033 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383683 5033 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383693 5033 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383702 5033 flags.go:64] FLAG: --logging-format="text"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383711 5033 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383721 5033 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383731 5033 flags.go:64] FLAG: --manifest-url=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383741 5033 flags.go:64] FLAG: --manifest-url-header=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383752 5033 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383762 5033 flags.go:64] FLAG: --max-open-files="1000000"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383782 5033 flags.go:64] FLAG: --max-pods="110"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383791 5033 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383800 5033 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383810 5033 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383819 5033 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383828 5033 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383837 5033 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383846 5033 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383866 5033 flags.go:64] FLAG: --node-status-max-images="50"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383875 5033 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383884 5033 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383894 5033 flags.go:64] FLAG: --pod-cidr=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383903 5033 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383916 5033 flags.go:64] FLAG: --pod-manifest-path=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383925 5033 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383935 5033 flags.go:64] FLAG: --pods-per-core="0"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383944 5033 flags.go:64] FLAG: --port="10250"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383953 5033 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383962 5033 flags.go:64] FLAG: --provider-id=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383971 5033 flags.go:64] FLAG: --qos-reserved=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383980 5033 flags.go:64] FLAG: --read-only-port="10255"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383989 5033 flags.go:64] FLAG: --register-node="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.383998 5033 flags.go:64] FLAG: --register-schedulable="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384007 5033 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384024 5033 flags.go:64] FLAG: --registry-burst="10"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384033 5033 flags.go:64] FLAG: --registry-qps="5"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384042 5033 flags.go:64] FLAG: --reserved-cpus=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384052 5033 flags.go:64] FLAG: --reserved-memory=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384065 5033 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384076 5033 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384087 5033 flags.go:64] FLAG: --rotate-certificates="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384096 5033 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384106 5033 flags.go:64] FLAG: --runonce="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384115 5033 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384124 5033 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384134 5033 flags.go:64] FLAG: --seccomp-default="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384144 5033 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384153 5033 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384162 5033 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384172 5033 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384181 5033 flags.go:64] FLAG: --storage-driver-password="root"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384190 5033 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384199 5033 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384208 5033 flags.go:64] FLAG: --storage-driver-user="root"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384216 5033 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384226 5033 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384236 5033 flags.go:64] FLAG: --system-cgroups=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384245 5033 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384286 5033 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384296 5033 flags.go:64] FLAG: --tls-cert-file=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384304 5033 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384317 5033 flags.go:64] FLAG: --tls-min-version=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384326 5033 flags.go:64] FLAG: --tls-private-key-file=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384335 5033 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384344 5033 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384353 5033 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384362 5033 flags.go:64] FLAG: --v="2"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384374 5033 flags.go:64] FLAG: --version="false"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384385 5033 flags.go:64] FLAG: --vmodule=""
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384397 5033 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.384408 5033 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384608 5033 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384619 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384629 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384638 5033 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384647 5033 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384655 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384664 5033 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384671 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384679 5033 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384687 5033 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384695 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384702 5033 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384713 5033 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384723 5033 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384732 5033 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384741 5033 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384749 5033 feature_gate.go:330] unrecognized feature gate: Example
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384757 5033 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384765 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384773 5033 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384781 5033 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384790 5033 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384798 5033 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384807 5033 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384815 5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384823 5033 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384832 5033 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384841 5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384849 5033 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384856 5033 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384864 5033 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384872 5033 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384880 5033 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384888 5033 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384895 5033 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384903 5033 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384911 5033 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384919 5033 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384930 5033 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384939 5033 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384948 5033 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384956 5033 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384964 5033 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384972 5033 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384980 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384988 5033 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.384995 5033 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385003 5033 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385011 5033 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385018 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385026 5033 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385036 5033 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385046 5033 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385056 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385066 5033 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385074 5033 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385082 5033 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385090 5033 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385097 5033 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385105 5033 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385113 5033 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385121 5033 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385128 5033 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385136 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385148 5033 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385157 5033 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385165 5033 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385174 5033 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385182 5033 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385192 5033 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.385201 5033 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.386075 5033 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.399789 5033 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.399845 5033 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400018 5033 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400046 5033 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400061 5033 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400073 5033 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400082 5033 feature_gate.go:330] unrecognized feature gate: Example
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400091 5033 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400099 5033 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400109 5033 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400119 5033 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400128 5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400136 5033 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400144 5033 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400152 5033 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400160 5033 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400167 5033 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400175 5033 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400183 5033 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400191 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400200 5033 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400208 5033 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400215 5033 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400223 5033 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400230 5033 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400238 5033 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400246 5033 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400613 5033 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400630 5033 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400644 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400657 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400665 5033 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400673 5033 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400684 5033 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400695 5033 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400705 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400731 5033 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400741 5033 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400750 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400757 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400766 5033 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400774 5033 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400782 5033 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400790 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400798 5033 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400806 5033 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400814 5033 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400822 5033 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400829 5033 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400837 5033 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400845 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400853 5033 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400861 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400868 5033 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400876 5033 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400885 5033 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400895 5033 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400904 5033 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400913 5033 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400921 5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400929 5033 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400940 5033 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400950 5033 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400958 5033 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400965 5033 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400976 5033 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400987 5033 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.400996 5033 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401005 5033 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401014 5033 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401022 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401031 5033 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401061 5033 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.401075 5033 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401439 5033 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401462 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401473 5033 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401482 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401491 5033 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401499 5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401507 5033 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401515 5033 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401523 5033 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401531 5033 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401539 5033 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401547 5033 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401555 5033 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401563 5033 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401571 5033 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401582 5033 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401593 5033 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401604 5033 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401614 5033 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401623 5033 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401631 5033 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401639 5033 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401647 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401657 5033 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401667 5033 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401675 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401684 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401692 5033 feature_gate.go:330] unrecognized feature gate: Example
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401700 5033 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401708 5033 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401716 5033 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401723 5033 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401732 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401740 5033 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401761 5033 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401769 5033 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401777 5033 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401785 5033 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401793 5033 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401887 5033 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401898 5033 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401906 5033 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401913 5033 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401921 5033 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401929 5033 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401937 5033 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401944 5033 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401952 5033 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401959 5033 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401967 5033 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401975 5033 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401983 5033 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401990 5033 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.401998 5033 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402006 5033 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402013 5033 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402021 5033 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402028 5033 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402036 5033 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402043 5033 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402051 5033 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402058 5033 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402066 5033 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402074 5033 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402082 5033 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402090 5033 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402097 5033 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402105 5033 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402112 5033 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402120 5033 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.402149 5033 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.402162 5033 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.402491 5033 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.407068 5033 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.411903 5033 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.412076 5033 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.414034 5033 server.go:997] "Starting client certificate rotation"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.414088 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.415110 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.441988 5033 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.446011 5033 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.446912 5033 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.463067 5033 log.go:25] "Validated CRI v1 runtime API"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.507334 5033 log.go:25] "Validated CRI v1 image API"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.509508 5033 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.517711 5033 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-26-15-36-23-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.517754 5033 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.541573 5033 manager.go:217] Machine: {Timestamp:2026-02-26 15:41:13.538495763 +0000 UTC m=+0.751094453 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8ea4d39b-2c74-4703-9738-58e0d85cfdf9 BootID:4e74ecb8-0e4e-4659-99ac-3abfb809185b Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2b:5c:98 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2b:5c:98 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:84:dd:11 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:7c:ff:4a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:49:29:e4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:4f:0f:63 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:2a:79:6c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3e:24:36:d7:70:af Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0a:06:8a:b5:70:ba Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.541872 5033 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.542068 5033 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.542405 5033 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.542629 5033 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.542672 5033 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.542947 5033 topology_manager.go:138] "Creating topology manager with none policy" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.542966 5033 container_manager_linux.go:303] "Creating device plugin manager" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.543522 5033 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.543562 5033 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.544345 5033 state_mem.go:36] "Initialized new in-memory state store" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.544460 5033 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.549606 5033 kubelet.go:418] "Attempting to sync node with API server" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.549648 5033 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.549686 5033 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.549701 5033 kubelet.go:324] "Adding apiserver pod source" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.549717 5033 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 
15:41:13.553797 5033 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.555135 5033 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.556336 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.556410 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.556387 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.556512 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.556549 5033 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 26 15:41:13 crc 
kubenswrapper[5033]: I0226 15:41:13.558209 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.558247 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.558282 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.558295 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.558314 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.558326 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.558338 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.558386 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.558400 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.558412 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.558445 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.558457 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.559530 5033 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.560184 5033 server.go:1280] "Started kubelet" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 
15:41:13.561842 5033 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.561881 5033 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.562177 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:41:13 crc systemd[1]: Started Kubernetes Kubelet. Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.562724 5033 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.562811 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.562856 5033 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.563110 5033 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.563138 5033 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.563224 5033 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.563100 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.567718 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="200ms" 
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.574737 5033 server.go:460] "Adding debug handlers to kubelet server" Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.575916 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.577470 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.574996 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1897d62e7dbfc7e0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.560139744 +0000 UTC m=+0.772738404,LastTimestamp:2026-02-26 15:41:13.560139744 +0000 UTC m=+0.772738404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.577722 5033 factory.go:55] Registering systemd factory Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.577757 5033 factory.go:221] Registration of the systemd container factory successfully Feb 26 15:41:13 crc kubenswrapper[5033]: 
I0226 15:41:13.578401 5033 factory.go:153] Registering CRI-O factory Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.578445 5033 factory.go:221] Registration of the crio container factory successfully Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.578558 5033 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.578591 5033 factory.go:103] Registering Raw factory Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.578621 5033 manager.go:1196] Started watching for new ooms in manager Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.580106 5033 manager.go:319] Starting recovery of all containers Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.580241 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582487 5033 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582525 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 26 15:41:13 crc 
kubenswrapper[5033]: I0226 15:41:13.582542 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582556 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582571 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582586 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582601 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582617 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582635 5033 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582648 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582661 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582674 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582687 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582701 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582718 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582730 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582742 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582754 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582770 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582787 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582800 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582811 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582891 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582908 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582955 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582966 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.582984 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583000 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583012 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583024 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583038 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583052 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583067 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583082 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583099 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583114 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583144 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583163 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583177 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583193 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583209 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583227 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583239 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583277 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583291 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" 
seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583307 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583323 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583340 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583378 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583402 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583417 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 
15:41:13.583470 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583503 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583517 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583528 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583539 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583553 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583566 5033 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583578 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583590 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583612 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583625 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583641 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583653 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583664 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583673 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583684 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583695 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583706 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583715 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583729 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583742 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583757 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583769 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583784 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583794 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583808 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583822 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583842 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.583907 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584025 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584057 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584089 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584113 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584140 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584170 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584194 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584216 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584238 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584299 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584330 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584360 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584388 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584414 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584447 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584471 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584494 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584513 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584535 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584562 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584591 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584616 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584637 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584661 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584699 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584755 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584805 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584832 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584858 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584881 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584906 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584929 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584952 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584974 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.584997 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585018 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585038 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585059 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" 
seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585081 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585100 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585119 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585142 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585161 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585181 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585199 5033 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585218 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585238 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585292 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585313 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585334 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585353 5033 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585373 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585391 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585410 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585432 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585451 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585471 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585493 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585511 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585531 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585550 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585570 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585591 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585610 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585632 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585651 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585672 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585692 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585715 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585740 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585766 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585795 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585818 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585844 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585866 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" 
seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585887 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585906 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585926 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585948 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585969 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.585988 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc 
kubenswrapper[5033]: I0226 15:41:13.586010 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586031 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586050 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586069 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586089 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586107 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586126 5033 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586144 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586164 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586183 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586200 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586218 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586238 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586309 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586331 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586350 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586370 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586390 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586410 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586429 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586448 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586468 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586489 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586508 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586528 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586547 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586590 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586613 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586630 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586650 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586667 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" 
seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586687 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586705 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586724 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586742 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586759 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586778 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 
15:41:13.586796 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586815 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586832 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586867 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586887 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586905 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586925 5033 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586943 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586962 5033 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586982 5033 reconstruct.go:97] "Volume reconstruction finished" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.586997 5033 reconciler.go:26] "Reconciler: start to sync state" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.608936 5033 manager.go:324] Recovery completed Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.620223 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.621918 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.621967 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.621980 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.622913 5033 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.623747 5033 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.623776 5033 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.623800 5033 state_mem.go:36] "Initialized new in-memory state store" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.626738 5033 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.626785 5033 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.626814 5033 kubelet.go:2335] "Starting kubelet main sync loop" Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.626864 5033 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 26 15:41:13 crc kubenswrapper[5033]: W0226 15:41:13.627758 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.627844 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.644806 5033 policy_none.go:49] "None policy: Start" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.645791 5033 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 26 
15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.645836 5033 state_mem.go:35] "Initializing new in-memory state store" Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.666936 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.711343 5033 manager.go:334] "Starting Device Plugin manager" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.711442 5033 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.711519 5033 server.go:79] "Starting device plugin registration server" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.712849 5033 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.712887 5033 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.713075 5033 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.713244 5033 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.713312 5033 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.723530 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.727744 5033 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.727841 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.729114 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.729158 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.729173 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.729365 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.729701 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.729776 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.730480 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.730516 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.730528 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.730743 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.731024 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.731100 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.731515 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.731537 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.731549 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.731579 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.731614 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.731631 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.731677 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.731885 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.731948 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.732600 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.732643 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.732642 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.732661 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.732685 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.732768 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.732908 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.733053 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.733086 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.733212 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.733243 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.733320 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.734281 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.734305 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.734315 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.734456 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.734478 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.734450 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.734567 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.734633 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.738789 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.739577 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.740924 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.768923 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="400ms"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.789950 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.789991 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790009 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790024 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790041 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790059 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790074 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790130 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790181 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790202 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790227 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790244 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790276 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790291 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.790322 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.813352 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.815512 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.815551 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.815564 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.815587 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: E0226 15:41:13.816062 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.891993 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892060 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892098 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892132 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892161 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892193 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892204 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892313 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892341 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892345 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892378 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892203 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892221 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892423 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892457 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892477 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892477 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892495 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892519 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892528 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892545 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892551 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892563 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892579 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892580 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892589 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892612 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892635 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892643 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:41:13 crc kubenswrapper[5033]: I0226 15:41:13.892775 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.017048 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.018472 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.018550 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.018568 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.018632 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 15:41:14 crc kubenswrapper[5033]: E0226 15:41:14.019248 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.072163 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.084093 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.107400 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.120124 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.127550 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:41:14 crc kubenswrapper[5033]: W0226 15:41:14.128490 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-dc8265b4a9ff0c570d5c24ea5ab05d306f3b330245e48056989b682b90ab3d35 WatchSource:0}: Error finding container dc8265b4a9ff0c570d5c24ea5ab05d306f3b330245e48056989b682b90ab3d35: Status 404 returned error can't find the container with id dc8265b4a9ff0c570d5c24ea5ab05d306f3b330245e48056989b682b90ab3d35
Feb 26 15:41:14 crc kubenswrapper[5033]: W0226 15:41:14.128919 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-7e2fa01fb5734a37a59a364eeb34e4976b8a16ca6acef8001e4f372becaac7f6 WatchSource:0}: Error finding container 7e2fa01fb5734a37a59a364eeb34e4976b8a16ca6acef8001e4f372becaac7f6: Status 404 returned error can't find the container with id 7e2fa01fb5734a37a59a364eeb34e4976b8a16ca6acef8001e4f372becaac7f6
Feb 26 15:41:14 crc kubenswrapper[5033]: W0226 15:41:14.134425 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-1bb9f317ffdd541de512565357dcdc243e337028401d9036e2451b13839198ce WatchSource:0}: Error finding container 1bb9f317ffdd541de512565357dcdc243e337028401d9036e2451b13839198ce: Status 404 returned error can't find the container with id 1bb9f317ffdd541de512565357dcdc243e337028401d9036e2451b13839198ce
Feb 26 15:41:14 crc kubenswrapper[5033]: W0226 15:41:14.143660 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-10e776233640c7ea4b9ae9ef99436cb4e6418a580eb31aff26e51e3688022fb7 WatchSource:0}: Error finding container 10e776233640c7ea4b9ae9ef99436cb4e6418a580eb31aff26e51e3688022fb7: Status 404 returned error can't find the container with id 10e776233640c7ea4b9ae9ef99436cb4e6418a580eb31aff26e51e3688022fb7
Feb 26 15:41:14 crc kubenswrapper[5033]: W0226 15:41:14.145654 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5e0c069e5d7b5bed13a3f2408699257bb16969c4195102a544c6cbf2752e9179 WatchSource:0}: Error finding container 5e0c069e5d7b5bed13a3f2408699257bb16969c4195102a544c6cbf2752e9179: Status 404 returned error can't find the container with id 5e0c069e5d7b5bed13a3f2408699257bb16969c4195102a544c6cbf2752e9179
Feb 26 15:41:14 crc kubenswrapper[5033]: E0226 15:41:14.170638 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="800ms"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.419382 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.420760 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.420829 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.420854 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.420897 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 15:41:14 crc kubenswrapper[5033]: E0226 15:41:14.421908 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc"
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.563206 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.634489 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dc8265b4a9ff0c570d5c24ea5ab05d306f3b330245e48056989b682b90ab3d35"}
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.635667 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7e2fa01fb5734a37a59a364eeb34e4976b8a16ca6acef8001e4f372becaac7f6"}
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.636553 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5e0c069e5d7b5bed13a3f2408699257bb16969c4195102a544c6cbf2752e9179"}
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.637445 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"10e776233640c7ea4b9ae9ef99436cb4e6418a580eb31aff26e51e3688022fb7"}
Feb 26 15:41:14 crc kubenswrapper[5033]: I0226 15:41:14.638323 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1bb9f317ffdd541de512565357dcdc243e337028401d9036e2451b13839198ce"}
Feb 26 15:41:14 crc kubenswrapper[5033]: W0226 15:41:14.797046 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 26 15:41:14 crc kubenswrapper[5033]: E0226 15:41:14.797510 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:41:14 crc kubenswrapper[5033]: W0226 15:41:14.872293 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 26 15:41:14 crc kubenswrapper[5033]: E0226 15:41:14.872399 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:41:14 crc kubenswrapper[5033]: W0226 15:41:14.880907 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 26 15:41:14 crc kubenswrapper[5033]: E0226 15:41:14.880983 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:41:14 crc kubenswrapper[5033]: E0226 15:41:14.971758 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="1.6s"
Feb 26 15:41:15 crc kubenswrapper[5033]: W0226 15:41:15.052497 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 26 15:41:15 crc kubenswrapper[5033]: E0226 15:41:15.052603 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.222456 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.224591 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.224649 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.224666 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.224706 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 15:41:15 crc kubenswrapper[5033]: E0226 15:41:15.225338 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.563996 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.571292 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 26 15:41:15 crc kubenswrapper[5033]: E0226 15:41:15.572468 5033 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.643417 5033 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33" exitCode=0
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.643525 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.643524 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33"}
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.644507 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.644540 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.644555 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.648160 5033 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="6f482405da075424da5a44e6641fd1d66ccb726e99e255d2c371957ffdae42ff" exitCode=0
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.648318 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"6f482405da075424da5a44e6641fd1d66ccb726e99e255d2c371957ffdae42ff"}
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.648662 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.653176 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.653230 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.653244 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.653365 5033 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="da8d70588c4a34b6d41719ebd49ca005f75f26a02271614803217707afab30a3" exitCode=0
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.653469 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"da8d70588c4a34b6d41719ebd49ca005f75f26a02271614803217707afab30a3"}
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.653501 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.654529 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.654551 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.654562 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.658217 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8f643abd831b7eac7606ee02d190871e8d527789b6f4522415416394dead3550"}
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.658275 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3322eb4c05203e17f4f14462df514182fe065badafcf134917bd1a0f067b4443"}
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.658291 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c86862bdc1157284ee2b33caed922e81d6dcf1ebfcc7fd0ba4c2120083ff5cf5"}
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.659985 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6" exitCode=0
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.660015 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6"}
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.660045 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.660734 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.660764 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.660773 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.662224 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.662923 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:15 crc kubenswrapper[5033]: I0226 15:41:15.662948 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:15 crc
kubenswrapper[5033]: I0226 15:41:15.662958 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.562871 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:41:16 crc kubenswrapper[5033]: E0226 15:41:16.572685 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="3.2s" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.666221 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413"} Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.666318 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203"} Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.666340 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b"} Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.668444 5033 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68" exitCode=0 Feb 26 15:41:16 
crc kubenswrapper[5033]: I0226 15:41:16.668488 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68"} Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.668598 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.670186 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.670312 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.670364 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.670612 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dfdc914a6d94e093ae9b588490ee4ffdf54dd2cfe233fc3a9457bc8cf50cdff9"} Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.670700 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.672043 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.672094 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.672116 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:16 crc 
kubenswrapper[5033]: I0226 15:41:16.674677 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0ae5884aa7b151e17c1f8417de42354d6c8509318b69eec38ca42267ca6fcd1e"} Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.674740 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cf00384e5355fc9166dce0c74068322041dad8e2635c6bfddc3f81e9c0efcaae"} Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.674768 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1cf7ebda738243811952eff76cf6376de46880a9527e96e3b93055a121538823"} Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.674821 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.676214 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.676314 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.676348 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.678689 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"94e141121ba022606f86253d983d940c32383d92dfabe417df0dbb42445b8299"} Feb 26 15:41:16 crc 
kubenswrapper[5033]: I0226 15:41:16.678795 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.680053 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.680085 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.680094 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:16 crc kubenswrapper[5033]: W0226 15:41:16.760220 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:41:16 crc kubenswrapper[5033]: E0226 15:41:16.760378 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.826140 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.828525 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.828557 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.828567 5033 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:16 crc kubenswrapper[5033]: I0226 15:41:16.828587 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:41:16 crc kubenswrapper[5033]: E0226 15:41:16.828966 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc" Feb 26 15:41:17 crc kubenswrapper[5033]: W0226 15:41:17.020905 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:41:17 crc kubenswrapper[5033]: E0226 15:41:17.021076 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:41:17 crc kubenswrapper[5033]: W0226 15:41:17.134573 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:41:17 crc kubenswrapper[5033]: E0226 15:41:17.134690 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:41:17 crc kubenswrapper[5033]: 
W0226 15:41:17.242106 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:41:17 crc kubenswrapper[5033]: E0226 15:41:17.242219 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.562878 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.684882 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"18ee712eef4dc61d68f4bef4b2d871e2b89992290f999fbcfbdc1aba5fd1a7f7"} Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.684922 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e"} Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.685012 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.685935 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 
15:41:17.685956 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.685966 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.686882 5033 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642" exitCode=0 Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.686961 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.686988 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.687014 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.687022 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.687059 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642"} Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.687023 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.687766 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.687789 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 
15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.687800 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.687874 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.687894 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.687904 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.688974 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.688998 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.689007 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.689023 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.689039 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:17 crc kubenswrapper[5033]: I0226 15:41:17.689049 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.027292 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.112681 5033 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.693102 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"236bd9859aa14e2adf418323c049fdd9ba63540adc8d44feab7db2deb72af262"} Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.693548 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0737283a2d944dc412534ebe53bc180bf3005fb66e164e72abb7113a86452c6e"} Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.693581 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f55baa5845d1d93e807193c728c2adc29ac1396a864f97c8445094ba944820e1"} Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.693599 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c1c5f32b45ecafcde39a72bf4595128654203e513cb5660bdf8c5536dd0dd6fe"} Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.693232 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.693177 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.693657 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.694867 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.694903 
5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.694904 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.694938 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.694953 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.694915 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.841968 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.848942 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:41:18 crc kubenswrapper[5033]: I0226 15:41:18.918730 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.533988 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.699766 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6b2be63f3aeb64760fc286645929a8863c2794930849e4d662a65a0ec04aee78"} Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.699862 5033 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.699929 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.700023 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.701307 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.701364 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.701379 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.701511 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.701608 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.701689 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.701658 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.701816 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.701837 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.823232 5033 
certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 15:41:19 crc kubenswrapper[5033]: I0226 15:41:19.881199 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.029599 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.030860 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.030995 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.031011 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.031032 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.703185 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.703308 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.704352 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.705116 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.705160 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.705175 5033 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.705270 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.705351 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.705366 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.706290 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.706320 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:20 crc kubenswrapper[5033]: I0226 15:41:20.706332 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.005528 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.211734 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.466289 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.466546 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.468217 5033 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.468290 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.468308 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.705133 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.705290 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.706488 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.706521 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.706528 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.706534 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.706547 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:21 crc kubenswrapper[5033]: I0226 15:41:21.706563 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:22 crc kubenswrapper[5033]: I0226 15:41:22.708222 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:22 crc kubenswrapper[5033]: I0226 15:41:22.709485 5033 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:22 crc kubenswrapper[5033]: I0226 15:41:22.709549 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:22 crc kubenswrapper[5033]: I0226 15:41:22.709565 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:23 crc kubenswrapper[5033]: I0226 15:41:23.569494 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:41:23 crc kubenswrapper[5033]: I0226 15:41:23.569771 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:23 crc kubenswrapper[5033]: I0226 15:41:23.571279 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:23 crc kubenswrapper[5033]: I0226 15:41:23.571319 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:23 crc kubenswrapper[5033]: I0226 15:41:23.571328 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:23 crc kubenswrapper[5033]: E0226 15:41:23.723680 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:41:24 crc kubenswrapper[5033]: I0226 15:41:24.212516 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 15:41:24 crc kubenswrapper[5033]: I0226 
15:41:24.212667 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 15:41:27 crc kubenswrapper[5033]: E0226 15:41:27.580419 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.1897d62e7dbfc7e0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.560139744 +0000 UTC m=+0.772738404,LastTimestamp:2026-02-26 15:41:13.560139744 +0000 UTC m=+0.772738404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:27 crc kubenswrapper[5033]: I0226 15:41:27.600396 5033 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41594->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 26 15:41:27 crc kubenswrapper[5033]: I0226 15:41:27.600475 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 
192.168.126.11:41594->192.168.126.11:17697: read: connection reset by peer" Feb 26 15:41:27 crc kubenswrapper[5033]: I0226 15:41:27.723903 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 26 15:41:27 crc kubenswrapper[5033]: I0226 15:41:27.726480 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="18ee712eef4dc61d68f4bef4b2d871e2b89992290f999fbcfbdc1aba5fd1a7f7" exitCode=255 Feb 26 15:41:27 crc kubenswrapper[5033]: I0226 15:41:27.726537 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"18ee712eef4dc61d68f4bef4b2d871e2b89992290f999fbcfbdc1aba5fd1a7f7"} Feb 26 15:41:27 crc kubenswrapper[5033]: I0226 15:41:27.726812 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:27 crc kubenswrapper[5033]: I0226 15:41:27.728022 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:27 crc kubenswrapper[5033]: I0226 15:41:27.728056 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:27 crc kubenswrapper[5033]: I0226 15:41:27.728065 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:27 crc kubenswrapper[5033]: I0226 15:41:27.728570 5033 scope.go:117] "RemoveContainer" containerID="18ee712eef4dc61d68f4bef4b2d871e2b89992290f999fbcfbdc1aba5fd1a7f7" Feb 26 15:41:28 crc kubenswrapper[5033]: W0226 15:41:28.017304 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z Feb 26 15:41:28 crc kubenswrapper[5033]: E0226 15:41:28.017420 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:28 crc kubenswrapper[5033]: W0226 15:41:28.018524 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z Feb 26 15:41:28 crc kubenswrapper[5033]: E0226 15:41:28.018624 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:28 crc kubenswrapper[5033]: W0226 15:41:28.018768 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z Feb 26 15:41:28 crc kubenswrapper[5033]: E0226 15:41:28.018835 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:28 crc kubenswrapper[5033]: E0226 15:41:28.020495 5033 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:28 crc kubenswrapper[5033]: E0226 15:41:28.023779 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 15:41:28 crc kubenswrapper[5033]: E0226 15:41:28.026552 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z" interval="6.4s" Feb 26 15:41:28 crc kubenswrapper[5033]: W0226 15:41:28.028937 
5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z Feb 26 15:41:28 crc kubenswrapper[5033]: E0226 15:41:28.029034 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.029153 5033 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.029202 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.029755 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:28Z is 
after 2026-02-23T05:33:13Z Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.037357 5033 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.037438 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.117149 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.117288 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.118310 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.118347 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.118356 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.566237 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:41:28Z is after 2026-02-23T05:33:13Z Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.731526 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.734229 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e8583b8c62973b7798acf408b2ea689ec1bbb2a6a25aa368e75e72143e2d3eb2"} Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.734559 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.735797 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.735858 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.735875 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:28 crc kubenswrapper[5033]: I0226 15:41:28.918984 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.540610 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.566209 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-26T15:41:29Z is after 2026-02-23T05:33:13Z Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.739620 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.740224 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.742759 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e8583b8c62973b7798acf408b2ea689ec1bbb2a6a25aa368e75e72143e2d3eb2" exitCode=255 Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.742818 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e8583b8c62973b7798acf408b2ea689ec1bbb2a6a25aa368e75e72143e2d3eb2"} Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.742895 5033 scope.go:117] "RemoveContainer" containerID="18ee712eef4dc61d68f4bef4b2d871e2b89992290f999fbcfbdc1aba5fd1a7f7" Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.742921 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.744156 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.744202 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.744218 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.745334 5033 scope.go:117] "RemoveContainer" containerID="e8583b8c62973b7798acf408b2ea689ec1bbb2a6a25aa368e75e72143e2d3eb2" Feb 26 15:41:29 crc kubenswrapper[5033]: E0226 15:41:29.745719 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:41:29 crc kubenswrapper[5033]: I0226 15:41:29.748516 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:41:30 crc kubenswrapper[5033]: I0226 15:41:30.567127 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:30Z is after 2026-02-23T05:33:13Z Feb 26 15:41:30 crc kubenswrapper[5033]: I0226 15:41:30.748775 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 15:41:30 crc kubenswrapper[5033]: I0226 15:41:30.752025 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:30 crc kubenswrapper[5033]: I0226 15:41:30.752919 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:30 crc kubenswrapper[5033]: I0226 15:41:30.752959 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 26 15:41:30 crc kubenswrapper[5033]: I0226 15:41:30.752971 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:30 crc kubenswrapper[5033]: I0226 15:41:30.753502 5033 scope.go:117] "RemoveContainer" containerID="e8583b8c62973b7798acf408b2ea689ec1bbb2a6a25aa368e75e72143e2d3eb2" Feb 26 15:41:30 crc kubenswrapper[5033]: E0226 15:41:30.753688 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.068009 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.068309 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.069932 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.069997 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.070017 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.080460 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.568589 5033 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:31Z is after 2026-02-23T05:33:13Z Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.754736 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.755568 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.755973 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.756104 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.756139 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.756958 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.757115 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.757364 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:31 crc kubenswrapper[5033]: I0226 15:41:31.758095 5033 scope.go:117] "RemoveContainer" containerID="e8583b8c62973b7798acf408b2ea689ec1bbb2a6a25aa368e75e72143e2d3eb2" Feb 26 15:41:31 crc kubenswrapper[5033]: E0226 15:41:31.758462 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" 
with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:41:32 crc kubenswrapper[5033]: I0226 15:41:32.570243 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:32Z is after 2026-02-23T05:33:13Z Feb 26 15:41:33 crc kubenswrapper[5033]: I0226 15:41:33.566423 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:33Z is after 2026-02-23T05:33:13Z Feb 26 15:41:33 crc kubenswrapper[5033]: E0226 15:41:33.723841 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:41:34 crc kubenswrapper[5033]: I0226 15:41:34.212393 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 15:41:34 crc kubenswrapper[5033]: I0226 15:41:34.212508 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 15:41:34 crc kubenswrapper[5033]: I0226 15:41:34.424303 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:34 crc kubenswrapper[5033]: I0226 15:41:34.426145 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:34 crc kubenswrapper[5033]: I0226 15:41:34.426191 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:34 crc kubenswrapper[5033]: I0226 15:41:34.426210 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:34 crc kubenswrapper[5033]: I0226 15:41:34.426243 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:41:34 crc kubenswrapper[5033]: E0226 15:41:34.429248 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:34Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 15:41:34 crc kubenswrapper[5033]: E0226 15:41:34.431077 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:34Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 15:41:34 crc kubenswrapper[5033]: I0226 15:41:34.566440 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:34Z is after 2026-02-23T05:33:13Z Feb 26 15:41:35 crc kubenswrapper[5033]: I0226 15:41:35.565542 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:35Z is after 2026-02-23T05:33:13Z Feb 26 15:41:36 crc kubenswrapper[5033]: I0226 15:41:36.429920 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 15:41:36 crc kubenswrapper[5033]: E0226 15:41:36.433929 5033 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:36 crc kubenswrapper[5033]: I0226 15:41:36.526730 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:41:36 crc kubenswrapper[5033]: I0226 15:41:36.527044 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:36 crc kubenswrapper[5033]: I0226 15:41:36.529146 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:36 crc kubenswrapper[5033]: I0226 15:41:36.529208 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:36 crc kubenswrapper[5033]: 
I0226 15:41:36.529222 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:36 crc kubenswrapper[5033]: I0226 15:41:36.529983 5033 scope.go:117] "RemoveContainer" containerID="e8583b8c62973b7798acf408b2ea689ec1bbb2a6a25aa368e75e72143e2d3eb2" Feb 26 15:41:36 crc kubenswrapper[5033]: E0226 15:41:36.530203 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:41:36 crc kubenswrapper[5033]: I0226 15:41:36.568746 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:36Z is after 2026-02-23T05:33:13Z Feb 26 15:41:36 crc kubenswrapper[5033]: W0226 15:41:36.985181 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:36Z is after 2026-02-23T05:33:13Z Feb 26 15:41:36 crc kubenswrapper[5033]: E0226 15:41:36.985362 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-26T15:41:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:37 crc kubenswrapper[5033]: W0226 15:41:37.518369 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:37Z is after 2026-02-23T05:33:13Z Feb 26 15:41:37 crc kubenswrapper[5033]: E0226 15:41:37.518512 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:37 crc kubenswrapper[5033]: I0226 15:41:37.907175 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:37Z is after 2026-02-23T05:33:13Z Feb 26 15:41:37 crc kubenswrapper[5033]: E0226 15:41:37.911767 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:37Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897d62e7dbfc7e0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.560139744 +0000 UTC m=+0.772738404,LastTimestamp:2026-02-26 15:41:13.560139744 +0000 UTC m=+0.772738404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:38 crc kubenswrapper[5033]: W0226 15:41:38.235588 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:38Z is after 2026-02-23T05:33:13Z Feb 26 15:41:38 crc kubenswrapper[5033]: E0226 15:41:38.235733 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:38 crc kubenswrapper[5033]: I0226 15:41:38.567080 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:38Z is after 2026-02-23T05:33:13Z Feb 26 15:41:38 crc kubenswrapper[5033]: W0226 15:41:38.906606 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:38Z is after 2026-02-23T05:33:13Z Feb 26 15:41:38 crc kubenswrapper[5033]: E0226 15:41:38.906762 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:39 crc kubenswrapper[5033]: I0226 15:41:39.743443 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:39Z is after 2026-02-23T05:33:13Z Feb 26 15:41:40 crc kubenswrapper[5033]: I0226 15:41:40.565528 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:40Z is after 2026-02-23T05:33:13Z Feb 26 15:41:41 crc kubenswrapper[5033]: I0226 15:41:41.429682 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:41 crc kubenswrapper[5033]: I0226 15:41:41.431325 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:41 crc kubenswrapper[5033]: I0226 15:41:41.431383 5033 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:41 crc kubenswrapper[5033]: I0226 15:41:41.431406 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:41 crc kubenswrapper[5033]: I0226 15:41:41.431447 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:41:41 crc kubenswrapper[5033]: E0226 15:41:41.436564 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:41Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 15:41:41 crc kubenswrapper[5033]: E0226 15:41:41.439922 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:41Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 15:41:41 crc kubenswrapper[5033]: I0226 15:41:41.566455 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:41Z is after 2026-02-23T05:33:13Z Feb 26 15:41:42 crc kubenswrapper[5033]: I0226 15:41:42.566058 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:42Z is after 2026-02-23T05:33:13Z Feb 26 15:41:43 crc kubenswrapper[5033]: I0226 15:41:43.568345 5033 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:43Z is after 2026-02-23T05:33:13Z Feb 26 15:41:43 crc kubenswrapper[5033]: E0226 15:41:43.723972 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.212954 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.213493 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.214068 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.214753 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.219295 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 
15:41:44.219369 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.219390 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.220112 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"3322eb4c05203e17f4f14462df514182fe065badafcf134917bd1a0f067b4443"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.220404 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://3322eb4c05203e17f4f14462df514182fe065badafcf134917bd1a0f067b4443" gracePeriod=30 Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.566137 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:44Z is after 2026-02-23T05:33:13Z Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.930944 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.931789 5033 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="3322eb4c05203e17f4f14462df514182fe065badafcf134917bd1a0f067b4443" exitCode=255 Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.931830 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3322eb4c05203e17f4f14462df514182fe065badafcf134917bd1a0f067b4443"} Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.932043 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"96375a2f9531b9366685030ca8542a82a2e70af614761b56d12e6705a619e833"} Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.932249 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.933371 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.933424 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:44 crc kubenswrapper[5033]: I0226 15:41:44.933445 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:45 crc kubenswrapper[5033]: I0226 15:41:45.566644 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:45Z is after 2026-02-23T05:33:13Z Feb 26 15:41:45 crc kubenswrapper[5033]: I0226 15:41:45.935468 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Feb 26 15:41:45 crc kubenswrapper[5033]: I0226 15:41:45.936687 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:45 crc kubenswrapper[5033]: I0226 15:41:45.936909 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:45 crc kubenswrapper[5033]: I0226 15:41:45.937048 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:46 crc kubenswrapper[5033]: I0226 15:41:46.567776 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:46Z is after 2026-02-23T05:33:13Z Feb 26 15:41:47 crc kubenswrapper[5033]: I0226 15:41:47.566887 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:47Z is after 2026-02-23T05:33:13Z Feb 26 15:41:47 crc kubenswrapper[5033]: E0226 15:41:47.917953 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:47Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897d62e7dbfc7e0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.560139744 +0000 UTC m=+0.772738404,LastTimestamp:2026-02-26 15:41:13.560139744 +0000 UTC m=+0.772738404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:48 crc kubenswrapper[5033]: I0226 15:41:48.437696 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:48 crc kubenswrapper[5033]: I0226 15:41:48.439664 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:48 crc kubenswrapper[5033]: I0226 15:41:48.439744 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:48 crc kubenswrapper[5033]: I0226 15:41:48.439763 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:48 crc kubenswrapper[5033]: I0226 15:41:48.439803 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:41:48 crc kubenswrapper[5033]: E0226 15:41:48.442889 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:48Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 15:41:48 crc kubenswrapper[5033]: E0226 15:41:48.445027 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:48Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 15:41:48 crc kubenswrapper[5033]: I0226 
15:41:48.565634 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:48Z is after 2026-02-23T05:33:13Z Feb 26 15:41:49 crc kubenswrapper[5033]: I0226 15:41:49.567719 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:49Z is after 2026-02-23T05:33:13Z Feb 26 15:41:50 crc kubenswrapper[5033]: I0226 15:41:50.568589 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:50Z is after 2026-02-23T05:33:13Z Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.212329 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.212811 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.214551 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.214845 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.215017 5033 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.566331 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:51Z is after 2026-02-23T05:33:13Z Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.627248 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.628905 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.628974 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.628998 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.632852 5033 scope.go:117] "RemoveContainer" containerID="e8583b8c62973b7798acf408b2ea689ec1bbb2a6a25aa368e75e72143e2d3eb2" Feb 26 15:41:51 crc kubenswrapper[5033]: W0226 15:41:51.769623 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:51Z is after 2026-02-23T05:33:13Z Feb 26 15:41:51 crc kubenswrapper[5033]: E0226 15:41:51.769696 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:51 crc kubenswrapper[5033]: W0226 15:41:51.936935 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:51Z is after 2026-02-23T05:33:13Z Feb 26 15:41:51 crc kubenswrapper[5033]: E0226 15:41:51.937043 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.952827 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.957093 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"17705066b77b164631d24a0da155579caf31cca306b32335e97812c02508c416"} Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.957289 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.958396 5033 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.958446 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:51 crc kubenswrapper[5033]: I0226 15:41:51.958458 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:52 crc kubenswrapper[5033]: I0226 15:41:52.566480 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:52Z is after 2026-02-23T05:33:13Z Feb 26 15:41:52 crc kubenswrapper[5033]: I0226 15:41:52.794756 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 15:41:52 crc kubenswrapper[5033]: E0226 15:41:52.800454 5033 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:52 crc kubenswrapper[5033]: E0226 15:41:52.801679 5033 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Feb 26 15:41:52 crc kubenswrapper[5033]: W0226 15:41:52.933397 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:52Z is after 2026-02-23T05:33:13Z Feb 26 15:41:52 crc kubenswrapper[5033]: E0226 15:41:52.933475 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:41:52 crc kubenswrapper[5033]: I0226 15:41:52.962810 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 15:41:52 crc kubenswrapper[5033]: I0226 15:41:52.963474 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 15:41:52 crc kubenswrapper[5033]: I0226 15:41:52.965699 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="17705066b77b164631d24a0da155579caf31cca306b32335e97812c02508c416" exitCode=255 Feb 26 15:41:52 crc kubenswrapper[5033]: I0226 15:41:52.965742 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"17705066b77b164631d24a0da155579caf31cca306b32335e97812c02508c416"} Feb 26 15:41:52 crc kubenswrapper[5033]: I0226 15:41:52.965780 5033 scope.go:117] "RemoveContainer" containerID="e8583b8c62973b7798acf408b2ea689ec1bbb2a6a25aa368e75e72143e2d3eb2" Feb 26 15:41:52 crc 
kubenswrapper[5033]: I0226 15:41:52.965973 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:52 crc kubenswrapper[5033]: I0226 15:41:52.967241 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:52 crc kubenswrapper[5033]: I0226 15:41:52.967324 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:52 crc kubenswrapper[5033]: I0226 15:41:52.967347 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:52 crc kubenswrapper[5033]: I0226 15:41:52.968381 5033 scope.go:117] "RemoveContainer" containerID="17705066b77b164631d24a0da155579caf31cca306b32335e97812c02508c416" Feb 26 15:41:52 crc kubenswrapper[5033]: E0226 15:41:52.968718 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:41:53 crc kubenswrapper[5033]: I0226 15:41:53.568135 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:53Z is after 2026-02-23T05:33:13Z Feb 26 15:41:53 crc kubenswrapper[5033]: I0226 15:41:53.569872 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:41:53 crc kubenswrapper[5033]: I0226 15:41:53.570093 5033 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:53 crc kubenswrapper[5033]: I0226 15:41:53.571560 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:53 crc kubenswrapper[5033]: I0226 15:41:53.571669 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:53 crc kubenswrapper[5033]: I0226 15:41:53.571728 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:53 crc kubenswrapper[5033]: E0226 15:41:53.724127 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:41:53 crc kubenswrapper[5033]: I0226 15:41:53.970082 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 15:41:54 crc kubenswrapper[5033]: I0226 15:41:54.212731 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 15:41:54 crc kubenswrapper[5033]: I0226 15:41:54.212804 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 15:41:54 crc kubenswrapper[5033]: I0226 15:41:54.566850 5033 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:54Z is after 2026-02-23T05:33:13Z Feb 26 15:41:55 crc kubenswrapper[5033]: I0226 15:41:55.443452 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:55 crc kubenswrapper[5033]: I0226 15:41:55.445158 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:55 crc kubenswrapper[5033]: I0226 15:41:55.445225 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:55 crc kubenswrapper[5033]: I0226 15:41:55.445251 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:55 crc kubenswrapper[5033]: I0226 15:41:55.445354 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:41:55 crc kubenswrapper[5033]: E0226 15:41:55.450534 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:55Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 15:41:55 crc kubenswrapper[5033]: E0226 15:41:55.450954 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:41:55Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 15:41:55 crc kubenswrapper[5033]: I0226 15:41:55.568761 5033 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:41:56 crc kubenswrapper[5033]: I0226 15:41:56.526207 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:41:56 crc kubenswrapper[5033]: I0226 15:41:56.526473 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:41:56 crc kubenswrapper[5033]: I0226 15:41:56.527996 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:41:56 crc kubenswrapper[5033]: I0226 15:41:56.528046 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:41:56 crc kubenswrapper[5033]: I0226 15:41:56.528064 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:41:56 crc kubenswrapper[5033]: I0226 15:41:56.528886 5033 scope.go:117] "RemoveContainer" containerID="17705066b77b164631d24a0da155579caf31cca306b32335e97812c02508c416" Feb 26 15:41:56 crc kubenswrapper[5033]: E0226 15:41:56.529197 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:41:56 crc kubenswrapper[5033]: I0226 15:41:56.568392 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Feb 26 15:41:57 crc kubenswrapper[5033]: I0226 15:41:57.571175 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:41:57 crc kubenswrapper[5033]: E0226 15:41:57.924097 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e7dbfc7e0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.560139744 +0000 UTC m=+0.772738404,LastTimestamp:2026-02-26 15:41:13.560139744 +0000 UTC m=+0.772738404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:57 crc kubenswrapper[5033]: E0226 15:41:57.929781 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f05e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621956069 +0000 UTC m=+0.834554729,LastTimestamp:2026-02-26 15:41:13.621956069 +0000 UTC m=+0.834554729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:57 crc kubenswrapper[5033]: E0226 15:41:57.937638 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f5351 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621975889 +0000 UTC m=+0.834574549,LastTimestamp:2026-02-26 15:41:13.621975889 +0000 UTC m=+0.834574549,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:57 crc kubenswrapper[5033]: E0226 15:41:57.948898 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f7bc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621986249 +0000 UTC m=+0.834584899,LastTimestamp:2026-02-26 15:41:13.621986249 +0000 UTC m=+0.834584899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:57 crc kubenswrapper[5033]: E0226 15:41:57.956206 5033 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e870cb604 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.716176388 +0000 UTC m=+0.928775058,LastTimestamp:2026-02-26 15:41:13.716176388 +0000 UTC m=+0.928775058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:57 crc kubenswrapper[5033]: E0226 15:41:57.966440 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f05e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f05e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621956069 +0000 UTC m=+0.834554729,LastTimestamp:2026-02-26 15:41:13.729140615 +0000 UTC m=+0.941739265,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:57 crc kubenswrapper[5033]: E0226 15:41:57.973597 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f5351\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.1897d62e816f5351 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621975889 +0000 UTC m=+0.834574549,LastTimestamp:2026-02-26 15:41:13.729168074 +0000 UTC m=+0.941766724,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:57 crc kubenswrapper[5033]: E0226 15:41:57.981914 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f7bc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f7bc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621986249 +0000 UTC m=+0.834584899,LastTimestamp:2026-02-26 15:41:13.729180914 +0000 UTC m=+0.941779564,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:57 crc kubenswrapper[5033]: E0226 15:41:57.989246 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f05e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f05e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621956069 +0000 UTC m=+0.834554729,LastTimestamp:2026-02-26 15:41:13.730503697 +0000 UTC m=+0.943102347,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:57 crc kubenswrapper[5033]: W0226 15:41:57.995237 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 26 15:41:57 crc kubenswrapper[5033]: E0226 15:41:57.995320 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 26 15:41:57 crc kubenswrapper[5033]: E0226 15:41:57.996241 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f5351\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f5351 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621975889 +0000 UTC m=+0.834574549,LastTimestamp:2026-02-26 15:41:13.730523277 +0000 
UTC m=+0.943121927,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.003560 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f7bc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f7bc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621986249 +0000 UTC m=+0.834584899,LastTimestamp:2026-02-26 15:41:13.730534937 +0000 UTC m=+0.943133587,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.008931 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f05e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f05e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621956069 +0000 UTC m=+0.834554729,LastTimestamp:2026-02-26 15:41:13.731529847 +0000 UTC m=+0.944128497,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" 
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.015819 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f5351\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f5351 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621975889 +0000 UTC m=+0.834574549,LastTimestamp:2026-02-26 15:41:13.731543317 +0000 UTC m=+0.944141967,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.021187 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f7bc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f7bc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621986249 +0000 UTC m=+0.834584899,LastTimestamp:2026-02-26 15:41:13.731555316 +0000 UTC m=+0.944153966,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.029201 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f05e5\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f05e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621956069 +0000 UTC m=+0.834554729,LastTimestamp:2026-02-26 15:41:13.731603825 +0000 UTC m=+0.944202515,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.037227 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f5351\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f5351 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621975889 +0000 UTC m=+0.834574549,LastTimestamp:2026-02-26 15:41:13.731625015 +0000 UTC m=+0.944223705,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.045976 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f7bc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.1897d62e816f7bc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621986249 +0000 UTC m=+0.834584899,LastTimestamp:2026-02-26 15:41:13.731640425 +0000 UTC m=+0.944239115,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.051876 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f05e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f05e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621956069 +0000 UTC m=+0.834554729,LastTimestamp:2026-02-26 15:41:13.732630844 +0000 UTC m=+0.945229524,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.059049 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f5351\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f5351 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621975889 +0000 UTC m=+0.834574549,LastTimestamp:2026-02-26 15:41:13.732655514 +0000 UTC m=+0.945254204,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.065141 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f05e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f05e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621956069 +0000 UTC m=+0.834554729,LastTimestamp:2026-02-26 15:41:13.732673644 +0000 UTC m=+0.945272324,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.073474 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f7bc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f7bc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621986249 +0000 UTC m=+0.834584899,LastTimestamp:2026-02-26 15:41:13.732712403 +0000 UTC m=+0.945311073,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.079252 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f5351\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f5351 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621975889 +0000 UTC m=+0.834574549,LastTimestamp:2026-02-26 15:41:13.732752762 +0000 UTC m=+0.945351412,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.086884 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f7bc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f7bc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621986249 +0000 UTC 
m=+0.834584899,LastTimestamp:2026-02-26 15:41:13.732776842 +0000 UTC m=+0.945375492,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.094094 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f05e5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f05e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621956069 +0000 UTC m=+0.834554729,LastTimestamp:2026-02-26 15:41:13.733229842 +0000 UTC m=+0.945828532,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.100941 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d62e816f5351\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d62e816f5351 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:13.621975889 +0000 UTC m=+0.834574549,LastTimestamp:2026-02-26 15:41:13.733285761 +0000 UTC m=+0.945884451,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.104478 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62ea0148fd3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.136121299 +0000 UTC m=+1.348719949,LastTimestamp:2026-02-26 15:41:14.136121299 +0000 UTC m=+1.348719949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.107975 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d62ea01bfb82 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.136607618 +0000 UTC m=+1.349206268,LastTimestamp:2026-02-26 15:41:14.136607618 +0000 UTC m=+1.349206268,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.112051 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62ea03140de openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.13800163 +0000 UTC m=+1.350600280,LastTimestamp:2026-02-26 15:41:14.13800163 +0000 UTC m=+1.350600280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.117920 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62ea135f9af openshift-kube-scheduler 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.155088303 +0000 UTC m=+1.367686953,LastTimestamp:2026-02-26 15:41:14.155088303 +0000 UTC m=+1.367686953,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.124291 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62ea1387722 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.15525149 +0000 UTC m=+1.367850180,LastTimestamp:2026-02-26 15:41:14.15525149 +0000 UTC m=+1.367850180,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" 
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.131525 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d62ec76782c7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.795868871 +0000 UTC m=+2.008467571,LastTimestamp:2026-02-26 15:41:14.795868871 +0000 UTC m=+2.008467571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.139302 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62ec7828841 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.797639745 +0000 UTC m=+2.010238395,LastTimestamp:2026-02-26 15:41:14.797639745 +0000 UTC m=+2.010238395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 
15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.147597 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62ec786daa6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.797922982 +0000 UTC m=+2.010521672,LastTimestamp:2026-02-26 15:41:14.797922982 +0000 UTC m=+2.010521672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.154867 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62ec787eac4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.797992644 +0000 UTC m=+2.010591324,LastTimestamp:2026-02-26 15:41:14.797992644 +0000 UTC 
m=+2.010591324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.161750 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62ec78a040c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.798130188 +0000 UTC m=+2.010728838,LastTimestamp:2026-02-26 15:41:14.798130188 +0000 UTC m=+2.010728838,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.168890 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62ec8a6a65b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.816783963 
+0000 UTC m=+2.029382643,LastTimestamp:2026-02-26 15:41:14.816783963 +0000 UTC m=+2.029382643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.175973 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62ec8f7c6a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.822100646 +0000 UTC m=+2.034699326,LastTimestamp:2026-02-26 15:41:14.822100646 +0000 UTC m=+2.034699326,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.182781 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62ec92197eb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.824841195 +0000 UTC m=+2.037439875,LastTimestamp:2026-02-26 15:41:14.824841195 +0000 UTC m=+2.037439875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.189930 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d62ec938d323 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.826363683 +0000 UTC m=+2.038962373,LastTimestamp:2026-02-26 15:41:14.826363683 +0000 UTC m=+2.038962373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.197917 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62ec93a87b5 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.826475445 +0000 UTC m=+2.039074125,LastTimestamp:2026-02-26 15:41:14.826475445 +0000 UTC m=+2.039074125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.208422 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62ec94eb15c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.827796828 +0000 UTC m=+2.040395488,LastTimestamp:2026-02-26 15:41:14.827796828 +0000 UTC m=+2.040395488,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.214811 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62edffabc6e 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.208170606 +0000 UTC m=+2.420769286,LastTimestamp:2026-02-26 15:41:15.208170606 +0000 UTC m=+2.420769286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.220003 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62ee107a446 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.225793606 +0000 UTC m=+2.438392296,LastTimestamp:2026-02-26 15:41:15.225793606 +0000 UTC m=+2.438392296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.226635 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62ee11c02a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.22712848 +0000 UTC m=+2.439727160,LastTimestamp:2026-02-26 15:41:15.22712848 +0000 UTC m=+2.439727160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.233503 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62ef0615108 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.483328776 +0000 UTC m=+2.695927426,LastTimestamp:2026-02-26 15:41:15.483328776 +0000 UTC 
m=+2.695927426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.239980 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62ef1118e5f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.494878815 +0000 UTC m=+2.707477475,LastTimestamp:2026-02-26 15:41:15.494878815 +0000 UTC m=+2.707477475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.246881 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62ef1284a84 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.496368772 +0000 UTC m=+2.708967462,LastTimestamp:2026-02-26 15:41:15.496368772 +0000 UTC m=+2.708967462,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.252572 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62efa2769f7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.647306231 +0000 UTC m=+2.859904891,LastTimestamp:2026-02-26 15:41:15.647306231 +0000 UTC m=+2.859904891,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.259055 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d62efa99d9f7 openshift-machine-config-operator 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.654806007 +0000 UTC m=+2.867404647,LastTimestamp:2026-02-26 15:41:15.654806007 +0000 UTC m=+2.867404647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.264785 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62efab86634 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.656807988 +0000 UTC m=+2.869406638,LastTimestamp:2026-02-26 15:41:15.656807988 +0000 UTC m=+2.869406638,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc 
kubenswrapper[5033]: E0226 15:41:58.271126 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62efb09159a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.66209577 +0000 UTC m=+2.874694420,LastTimestamp:2026-02-26 15:41:15.66209577 +0000 UTC m=+2.874694420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.278460 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62efb461755 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 
15:41:15.666093909 +0000 UTC m=+2.878692559,LastTimestamp:2026-02-26 15:41:15.666093909 +0000 UTC m=+2.878692559,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.285237 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62efc3ec190 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.682390416 +0000 UTC m=+2.894989066,LastTimestamp:2026-02-26 15:41:15.682390416 +0000 UTC m=+2.894989066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.292433 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62f0611e77a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.847223162 +0000 UTC m=+3.059821852,LastTimestamp:2026-02-26 15:41:15.847223162 +0000 UTC m=+3.059821852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.298927 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d62f06651ce8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.852676328 +0000 UTC m=+3.065274978,LastTimestamp:2026-02-26 15:41:15.852676328 +0000 UTC m=+3.065274978,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.304517 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f06685d50 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.852889424 +0000 UTC m=+3.065488074,LastTimestamp:2026-02-26 15:41:15.852889424 +0000 UTC m=+3.065488074,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.308672 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62f06e6394c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.86113774 +0000 UTC m=+3.073736390,LastTimestamp:2026-02-26 15:41:15.86113774 +0000 UTC m=+3.073736390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.313538 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62f06f317b6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.86198111 +0000 UTC m=+3.074579760,LastTimestamp:2026-02-26 15:41:15.86198111 +0000 UTC m=+3.074579760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.320785 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f0730d061 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.866026081 +0000 UTC m=+3.078624731,LastTimestamp:2026-02-26 15:41:15.866026081 +0000 UTC m=+3.078624731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.327526 5033 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d62f07a7d3d2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.873825746 +0000 UTC m=+3.086424396,LastTimestamp:2026-02-26 15:41:15.873825746 +0000 UTC m=+3.086424396,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.334425 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f08a7abfd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.890592765 +0000 UTC m=+3.103191415,LastTimestamp:2026-02-26 15:41:15.890592765 +0000 UTC m=+3.103191415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc 
kubenswrapper[5033]: E0226 15:41:58.339875 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f098cef87 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.905617799 +0000 UTC m=+3.118216449,LastTimestamp:2026-02-26 15:41:15.905617799 +0000 UTC m=+3.118216449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.345967 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f099f0217 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.906802199 +0000 UTC m=+3.119400849,LastTimestamp:2026-02-26 15:41:15.906802199 
+0000 UTC m=+3.119400849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.352464 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62f1488be77 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.089892471 +0000 UTC m=+3.302491111,LastTimestamp:2026-02-26 15:41:16.089892471 +0000 UTC m=+3.302491111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.357112 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f14b78c86 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.092959878 +0000 UTC m=+3.305558528,LastTimestamp:2026-02-26 15:41:16.092959878 +0000 UTC m=+3.305558528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.362541 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62f158e03b1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.107015089 +0000 UTC m=+3.319613739,LastTimestamp:2026-02-26 15:41:16.107015089 +0000 UTC m=+3.319613739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.369514 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62f15987b86 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.107701126 +0000 UTC m=+3.320299776,LastTimestamp:2026-02-26 15:41:16.107701126 +0000 UTC m=+3.320299776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.375589 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f15eaaaaf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.113087151 +0000 UTC m=+3.325685801,LastTimestamp:2026-02-26 15:41:16.113087151 +0000 UTC m=+3.325685801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.382792 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f1602cbac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.11466846 +0000 UTC m=+3.327267110,LastTimestamp:2026-02-26 15:41:16.11466846 +0000 UTC m=+3.327267110,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.389973 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62f21f9cc82 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.315405442 +0000 UTC m=+3.528004092,LastTimestamp:2026-02-26 15:41:16.315405442 +0000 UTC m=+3.528004092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.396228 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d62f22bd8a78 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.328233592 +0000 UTC m=+3.540832242,LastTimestamp:2026-02-26 15:41:16.328233592 +0000 UTC m=+3.540832242,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.401023 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f2411d999 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.350536089 
+0000 UTC m=+3.563134739,LastTimestamp:2026-02-26 15:41:16.350536089 +0000 UTC m=+3.563134739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.406878 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f253aa6b7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.369987255 +0000 UTC m=+3.582585905,LastTimestamp:2026-02-26 15:41:16.369987255 +0000 UTC m=+3.582585905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.414036 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f254d30c5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.371202245 +0000 UTC m=+3.583800895,LastTimestamp:2026-02-26 15:41:16.371202245 +0000 UTC m=+3.583800895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.418901 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f37421257 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.672463447 +0000 UTC m=+3.885062097,LastTimestamp:2026-02-26 15:41:16.672463447 +0000 UTC m=+3.885062097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.424688 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f39b5c9a5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.713601445 +0000 UTC m=+3.926200135,LastTimestamp:2026-02-26 15:41:16.713601445 +0000 UTC m=+3.926200135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.430039 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f3acbe27a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.73182681 +0000 UTC m=+3.944425460,LastTimestamp:2026-02-26 15:41:16.73182681 +0000 UTC m=+3.944425460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.437780 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f3b0e063a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.736161338 +0000 UTC m=+3.948759988,LastTimestamp:2026-02-26 15:41:16.736161338 +0000 UTC m=+3.948759988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.444539 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f4246596a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.857293162 +0000 UTC m=+4.069891832,LastTimestamp:2026-02-26 15:41:16.857293162 +0000 UTC m=+4.069891832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.449635 5033 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f43baf49a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.881712282 +0000 UTC m=+4.094310932,LastTimestamp:2026-02-26 15:41:16.881712282 +0000 UTC m=+4.094310932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.457901 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f46332dd6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.923145686 +0000 UTC m=+4.135744336,LastTimestamp:2026-02-26 15:41:16.923145686 +0000 UTC m=+4.135744336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.463751 5033 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f46f5c574 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.935898484 +0000 UTC m=+4.148497124,LastTimestamp:2026-02-26 15:41:16.935898484 +0000 UTC m=+4.148497124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.471138 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f73f75227 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:17.690974759 +0000 UTC m=+4.903573409,LastTimestamp:2026-02-26 15:41:17.690974759 +0000 UTC m=+4.903573409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.476577 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f807ee0c1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:17.901185217 +0000 UTC m=+5.113783877,LastTimestamp:2026-02-26 15:41:17.901185217 +0000 UTC m=+5.113783877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.482690 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f814e8abe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:17.914794686 +0000 UTC m=+5.127393336,LastTimestamp:2026-02-26 15:41:17.914794686 +0000 UTC m=+5.127393336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 
15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.489667 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f81628d55 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:17.916106069 +0000 UTC m=+5.128704749,LastTimestamp:2026-02-26 15:41:17.916106069 +0000 UTC m=+5.128704749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.496547 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f8d77347a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:18.11878617 +0000 UTC m=+5.331384820,LastTimestamp:2026-02-26 15:41:18.11878617 +0000 UTC m=+5.331384820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.502978 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f8e70e90b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:18.135150859 +0000 UTC m=+5.347749509,LastTimestamp:2026-02-26 15:41:18.135150859 +0000 UTC m=+5.347749509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.510009 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f8e818dc6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:18.136241606 +0000 UTC m=+5.348840256,LastTimestamp:2026-02-26 15:41:18.136241606 +0000 UTC 
m=+5.348840256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.515933 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f99cc1b2e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:18.325676846 +0000 UTC m=+5.538275496,LastTimestamp:2026-02-26 15:41:18.325676846 +0000 UTC m=+5.538275496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.521239 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f9aa9d4f2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:18.340207858 +0000 UTC m=+5.552806508,LastTimestamp:2026-02-26 15:41:18.340207858 +0000 UTC m=+5.552806508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.526403 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62f9abae74a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:18.341326666 +0000 UTC m=+5.553925316,LastTimestamp:2026-02-26 15:41:18.341326666 +0000 UTC m=+5.553925316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.532058 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62fa7445980 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:18.551660928 +0000 UTC m=+5.764259608,LastTimestamp:2026-02-26 15:41:18.551660928 +0000 UTC 
m=+5.764259608,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.537567 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62fa845e0c5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:18.568538309 +0000 UTC m=+5.781136999,LastTimestamp:2026-02-26 15:41:18.568538309 +0000 UTC m=+5.781136999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.543724 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62fa85f9d06 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:18.570224902 +0000 UTC 
m=+5.782823592,LastTimestamp:2026-02-26 15:41:18.570224902 +0000 UTC m=+5.782823592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.550981 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62fb37424dd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:18.756119773 +0000 UTC m=+5.968718423,LastTimestamp:2026-02-26 15:41:18.756119773 +0000 UTC m=+5.968718423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.557071 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d62fb48cfaf7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:18.774524663 +0000 UTC m=+5.987123303,LastTimestamp:2026-02-26 15:41:18.774524663 +0000 UTC m=+5.987123303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: I0226 15:41:58.565486 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.566107 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 26 15:41:58 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-controller-manager-crc.1897d630f8aff2d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 26 15:41:58 crc kubenswrapper[5033]: body:
Feb 26 15:41:58 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:24.212634326 +0000 UTC m=+11.425233016,LastTimestamp:2026-02-26 15:41:24.212634326 +0000 UTC m=+11.425233016,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 15:41:58 crc kubenswrapper[5033]: >
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.568022 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d630f8b138bf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:24.212717759 +0000 UTC m=+11.425316449,LastTimestamp:2026-02-26 15:41:24.212717759 +0000 UTC m=+11.425316449,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.574761 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Feb 26 15:41:58 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-apiserver-crc.1897d631c29dffab openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:41594->192.168.126.11:17697: read: connection reset by peer
Feb 26 15:41:58 crc kubenswrapper[5033]: body:
Feb 26 15:41:58 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:27.600455595 +0000 UTC m=+14.813054245,LastTimestamp:2026-02-26 15:41:27.600455595 +0000 UTC m=+14.813054245,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 15:41:58 crc kubenswrapper[5033]: >
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.581132 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d631c29eb046 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41594->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:27.600500806 +0000 UTC m=+14.813099446,LastTimestamp:2026-02-26 15:41:27.600500806 +0000 UTC m=+14.813099446,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.591084 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897d62f3b0e063a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f3b0e063a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.736161338 +0000 UTC m=+3.948759988,LastTimestamp:2026-02-26 15:41:27.729765955 +0000 UTC m=+14.942364605,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.598493 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897d62f46332dd6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f46332dd6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.923145686 +0000 UTC m=+4.135744336,LastTimestamp:2026-02-26 15:41:27.948449931 +0000 UTC m=+15.161048581,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.605133 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897d62f46f5c574\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d62f46f5c574 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:16.935898484 +0000 UTC m=+4.148497124,LastTimestamp:2026-02-26 15:41:27.961684761 +0000 UTC m=+15.174283411,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.611407 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Feb 26 15:41:58 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-apiserver-crc.1897d631dc2be573 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Feb 26 15:41:58 crc kubenswrapper[5033]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 26 15:41:58 crc kubenswrapper[5033]:
Feb 26 15:41:58 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:28.029185395 +0000 UTC m=+15.241784065,LastTimestamp:2026-02-26 15:41:28.029185395 +0000 UTC m=+15.241784065,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 15:41:58 crc kubenswrapper[5033]: >
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.618446 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d631dc2c7944 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:28.029223236 +0000 UTC m=+15.241821906,LastTimestamp:2026-02-26 15:41:28.029223236 +0000 UTC m=+15.241821906,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.629282 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d630f8aff2d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 26 15:41:58 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-controller-manager-crc.1897d630f8aff2d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 26 15:41:58 crc kubenswrapper[5033]: body:
Feb 26 15:41:58 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:24.212634326 +0000 UTC m=+11.425233016,LastTimestamp:2026-02-26 15:41:34.212483961 +0000 UTC m=+21.425082611,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 15:41:58 crc kubenswrapper[5033]: >
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.635394 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d630f8b138bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d630f8b138bf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:24.212717759 +0000 UTC m=+11.425316449,LastTimestamp:2026-02-26 15:41:34.212537123 +0000 UTC m=+21.425135773,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.642129 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d630f8aff2d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 26 15:41:58 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-controller-manager-crc.1897d630f8aff2d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 26 15:41:58 crc kubenswrapper[5033]: body:
Feb 26 15:41:58 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:24.212634326 +0000 UTC m=+11.425233016,LastTimestamp:2026-02-26 15:41:44.213452707 +0000 UTC m=+31.426051397,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 15:41:58 crc kubenswrapper[5033]: >
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.649128 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d630f8b138bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d630f8b138bf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:24.212717759 +0000 UTC m=+11.425316449,LastTimestamp:2026-02-26 15:41:44.213670422 +0000 UTC m=+31.426269112,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.656951 5033 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d635a13dd174 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:44.220373364 +0000 UTC m=+31.432972044,LastTimestamp:2026-02-26 15:41:44.220373364 +0000 UTC m=+31.432972044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.664515 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d62ec8f7c6a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62ec8f7c6a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:14.822100646 +0000 UTC m=+2.034699326,LastTimestamp:2026-02-26 15:41:44.344812287 +0000 UTC m=+31.557410977,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.670661 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d62edffabc6e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62edffabc6e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.208170606 +0000 UTC m=+2.420769286,LastTimestamp:2026-02-26 15:41:44.539110292 +0000 UTC m=+31.751708942,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.677797 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d62ee107a446\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d62ee107a446 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:15.225793606 +0000 UTC m=+2.438392296,LastTimestamp:2026-02-26 15:41:44.551334268 +0000 UTC m=+31.763932958,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.686449 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d630f8aff2d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 26 15:41:58 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-controller-manager-crc.1897d630f8aff2d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 26 15:41:58 crc kubenswrapper[5033]: body:
Feb 26 15:41:58 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:24.212634326 +0000 UTC m=+11.425233016,LastTimestamp:2026-02-26 15:41:54.212786124 +0000 UTC m=+41.425384774,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 15:41:58 crc kubenswrapper[5033]: >
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.692303 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d630f8b138bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d630f8b138bf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:24.212717759 +0000 UTC m=+11.425316449,LastTimestamp:2026-02-26 15:41:54.212838245 +0000 UTC m=+41.425436895,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:41:58 crc kubenswrapper[5033]: I0226 15:41:58.919543 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:41:58 crc kubenswrapper[5033]: I0226 15:41:58.919855 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:41:58 crc kubenswrapper[5033]: I0226 15:41:58.921564 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:41:58 crc kubenswrapper[5033]: I0226 15:41:58.921610 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:41:58 crc kubenswrapper[5033]: I0226 15:41:58.921632 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:41:58 crc kubenswrapper[5033]: I0226 15:41:58.922475 5033 scope.go:117] "RemoveContainer" containerID="17705066b77b164631d24a0da155579caf31cca306b32335e97812c02508c416"
Feb 26 15:41:58 crc kubenswrapper[5033]: E0226 15:41:58.922839 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 26 15:41:59 crc kubenswrapper[5033]: I0226 15:41:59.570046 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:00 crc kubenswrapper[5033]: I0226 15:42:00.569817 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:01 crc kubenswrapper[5033]: I0226 15:42:01.473585 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:42:01 crc kubenswrapper[5033]: I0226 15:42:01.473796 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:01 crc kubenswrapper[5033]: I0226 15:42:01.475435 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:01 crc kubenswrapper[5033]: I0226 15:42:01.475518 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:01 crc kubenswrapper[5033]: I0226 15:42:01.475545 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:01 crc kubenswrapper[5033]: I0226 15:42:01.569665 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:02 crc kubenswrapper[5033]: I0226 15:42:02.451361 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:02 crc kubenswrapper[5033]: I0226 15:42:02.453972 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:02 crc kubenswrapper[5033]: I0226 15:42:02.454031 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:02 crc kubenswrapper[5033]: I0226 15:42:02.454054 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:02 crc kubenswrapper[5033]: I0226 15:42:02.454136 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 15:42:02 crc kubenswrapper[5033]: E0226 15:42:02.460056 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 26 15:42:02 crc kubenswrapper[5033]: E0226 15:42:02.460178 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 26 15:42:02 crc kubenswrapper[5033]: I0226 15:42:02.567861 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:03 crc kubenswrapper[5033]: I0226 15:42:03.569927 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:03 crc kubenswrapper[5033]: E0226 15:42:03.724362 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 26 15:42:04 crc kubenswrapper[5033]: I0226 15:42:04.212488 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 15:42:04 crc kubenswrapper[5033]: I0226 15:42:04.212680 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 15:42:04 crc kubenswrapper[5033]: E0226 15:42:04.219995 5033 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d630f8aff2d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 26 15:42:04 crc kubenswrapper[5033]: &Event{ObjectMeta:{kube-controller-manager-crc.1897d630f8aff2d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 26 15:42:04 crc kubenswrapper[5033]: body:
Feb 26 15:42:04 crc kubenswrapper[5033]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:41:24.212634326 +0000 UTC m=+11.425233016,LastTimestamp:2026-02-26 15:42:04.212650094 +0000 UTC m=+51.425248774,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 15:42:04 crc kubenswrapper[5033]: >
Feb 26 15:42:04 crc kubenswrapper[5033]: I0226 15:42:04.567944 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:05 crc kubenswrapper[5033]: I0226 15:42:05.570623 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:06 crc kubenswrapper[5033]: I0226 15:42:06.570012 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:07 crc kubenswrapper[5033]: I0226 15:42:07.569777 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:08 crc kubenswrapper[5033]: I0226 15:42:08.569671 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:09 crc kubenswrapper[5033]: I0226 15:42:09.460424 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:09 crc kubenswrapper[5033]: I0226 15:42:09.462976 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:09 crc kubenswrapper[5033]: I0226 15:42:09.463034 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:09 crc kubenswrapper[5033]: I0226 15:42:09.463053 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:09 crc kubenswrapper[5033]: I0226 15:42:09.463087 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 15:42:09 crc kubenswrapper[5033]: E0226 15:42:09.467689 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 26 15:42:09 crc kubenswrapper[5033]: E0226 15:42:09.468963 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 26 15:42:09 crc kubenswrapper[5033]: I0226 15:42:09.568551 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:10 crc kubenswrapper[5033]: I0226 15:42:10.570631 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:11 crc kubenswrapper[5033]: I0226 15:42:11.570523 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:11 crc kubenswrapper[5033]: I0226 15:42:11.627425 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:11 crc kubenswrapper[5033]: I0226 15:42:11.629046 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:11 crc kubenswrapper[5033]: I0226 15:42:11.629109 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:11 crc kubenswrapper[5033]: I0226 15:42:11.629133 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:11 crc kubenswrapper[5033]: I0226 15:42:11.629962 5033 scope.go:117] "RemoveContainer" containerID="17705066b77b164631d24a0da155579caf31cca306b32335e97812c02508c416"
Feb 26 15:42:11 crc kubenswrapper[5033]: E0226 15:42:11.630240 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 26 15:42:12 crc kubenswrapper[5033]: I0226 15:42:12.569099 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:13 crc kubenswrapper[5033]: I0226 15:42:13.571532 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 15:42:13 crc kubenswrapper[5033]: E0226 15:42:13.725299 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 26 15:42:14 crc kubenswrapper[5033]: I0226 15:42:14.014734 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:14 crc kubenswrapper[5033]: I0226 15:42:14.014990 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:14 crc kubenswrapper[5033]: I0226 15:42:14.017406 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:14 crc kubenswrapper[5033]: I0226 15:42:14.017484 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:14 crc kubenswrapper[5033]: I0226 15:42:14.017508 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:14 crc kubenswrapper[5033]: I0226 15:42:14.021593 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:14 crc kubenswrapper[5033]: I0226 15:42:14.022810 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:14 crc kubenswrapper[5033]: I0226 15:42:14.024464 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:14 crc kubenswrapper[5033]: I0226 15:42:14.024517 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:14 crc
kubenswrapper[5033]: I0226 15:42:14.024542 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:14 crc kubenswrapper[5033]: I0226 15:42:14.570846 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:15 crc kubenswrapper[5033]: I0226 15:42:15.570774 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:16 crc kubenswrapper[5033]: I0226 15:42:16.468211 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:16 crc kubenswrapper[5033]: I0226 15:42:16.471097 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:16 crc kubenswrapper[5033]: I0226 15:42:16.471160 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:16 crc kubenswrapper[5033]: I0226 15:42:16.471183 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:16 crc kubenswrapper[5033]: I0226 15:42:16.471223 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:42:16 crc kubenswrapper[5033]: E0226 15:42:16.479546 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 15:42:16 crc kubenswrapper[5033]: E0226 15:42:16.479807 5033 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 15:42:16 crc kubenswrapper[5033]: I0226 15:42:16.566999 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:17 crc kubenswrapper[5033]: I0226 15:42:17.567017 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:18 crc kubenswrapper[5033]: I0226 15:42:18.566914 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:19 crc kubenswrapper[5033]: I0226 15:42:19.566825 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:20 crc kubenswrapper[5033]: I0226 15:42:20.569483 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:21 crc kubenswrapper[5033]: I0226 15:42:21.566226 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at 
the cluster scope Feb 26 15:42:22 crc kubenswrapper[5033]: I0226 15:42:22.568555 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:22 crc kubenswrapper[5033]: I0226 15:42:22.627021 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:22 crc kubenswrapper[5033]: I0226 15:42:22.628343 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:22 crc kubenswrapper[5033]: I0226 15:42:22.628389 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:22 crc kubenswrapper[5033]: I0226 15:42:22.628399 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:22 crc kubenswrapper[5033]: W0226 15:42:22.695795 5033 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:22 crc kubenswrapper[5033]: E0226 15:42:22.695890 5033 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 26 15:42:23 crc kubenswrapper[5033]: I0226 15:42:23.480595 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:23 crc kubenswrapper[5033]: I0226 15:42:23.482067 5033 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:23 crc kubenswrapper[5033]: I0226 15:42:23.482122 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:23 crc kubenswrapper[5033]: I0226 15:42:23.482143 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:23 crc kubenswrapper[5033]: I0226 15:42:23.482183 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:42:23 crc kubenswrapper[5033]: E0226 15:42:23.486908 5033 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 15:42:23 crc kubenswrapper[5033]: E0226 15:42:23.487487 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 15:42:23 crc kubenswrapper[5033]: I0226 15:42:23.570160 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:23 crc kubenswrapper[5033]: E0226 15:42:23.725815 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:42:24 crc kubenswrapper[5033]: I0226 15:42:24.567551 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:24 crc kubenswrapper[5033]: I0226 15:42:24.628007 5033 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:24 crc kubenswrapper[5033]: I0226 15:42:24.629235 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:24 crc kubenswrapper[5033]: I0226 15:42:24.629284 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:24 crc kubenswrapper[5033]: I0226 15:42:24.629296 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:24 crc kubenswrapper[5033]: I0226 15:42:24.629807 5033 scope.go:117] "RemoveContainer" containerID="17705066b77b164631d24a0da155579caf31cca306b32335e97812c02508c416" Feb 26 15:42:24 crc kubenswrapper[5033]: I0226 15:42:24.803784 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 15:42:24 crc kubenswrapper[5033]: I0226 15:42:24.821877 5033 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 26 15:42:25 crc kubenswrapper[5033]: I0226 15:42:25.049959 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 15:42:25 crc kubenswrapper[5033]: I0226 15:42:25.052202 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b"} Feb 26 15:42:25 crc kubenswrapper[5033]: I0226 15:42:25.052386 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:25 crc kubenswrapper[5033]: I0226 15:42:25.053305 5033 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:25 crc kubenswrapper[5033]: I0226 15:42:25.053331 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:25 crc kubenswrapper[5033]: I0226 15:42:25.053343 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:25 crc kubenswrapper[5033]: I0226 15:42:25.567864 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.059033 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.060060 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.064430 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b"} Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.065164 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b" exitCode=255 Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.065221 5033 scope.go:117] "RemoveContainer" containerID="17705066b77b164631d24a0da155579caf31cca306b32335e97812c02508c416" Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.065406 5033 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.067448 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.067521 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.067543 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.068381 5033 scope.go:117] "RemoveContainer" containerID="3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b" Feb 26 15:42:26 crc kubenswrapper[5033]: E0226 15:42:26.068686 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.526693 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:42:26 crc kubenswrapper[5033]: I0226 15:42:26.570211 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:27 crc kubenswrapper[5033]: I0226 15:42:27.070977 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 
15:42:27 crc kubenswrapper[5033]: I0226 15:42:27.073987 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:27 crc kubenswrapper[5033]: I0226 15:42:27.075292 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:27 crc kubenswrapper[5033]: I0226 15:42:27.075338 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:27 crc kubenswrapper[5033]: I0226 15:42:27.075357 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:27 crc kubenswrapper[5033]: I0226 15:42:27.076391 5033 scope.go:117] "RemoveContainer" containerID="3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b" Feb 26 15:42:27 crc kubenswrapper[5033]: E0226 15:42:27.076702 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:27 crc kubenswrapper[5033]: I0226 15:42:27.568307 5033 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:42:28 crc kubenswrapper[5033]: I0226 15:42:28.423557 5033 csr.go:261] certificate signing request csr-568gm is approved, waiting to be issued Feb 26 15:42:28 crc kubenswrapper[5033]: I0226 15:42:28.433136 5033 csr.go:257] certificate signing request csr-568gm is issued Feb 26 15:42:28 crc kubenswrapper[5033]: I0226 15:42:28.513625 5033 reconstruct.go:205] 
"DevicePaths of reconstructed volumes updated" Feb 26 15:42:28 crc kubenswrapper[5033]: I0226 15:42:28.919578 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:42:28 crc kubenswrapper[5033]: I0226 15:42:28.919826 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:28 crc kubenswrapper[5033]: I0226 15:42:28.921819 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:28 crc kubenswrapper[5033]: I0226 15:42:28.921872 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:28 crc kubenswrapper[5033]: I0226 15:42:28.921890 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:28 crc kubenswrapper[5033]: I0226 15:42:28.922909 5033 scope.go:117] "RemoveContainer" containerID="3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b" Feb 26 15:42:28 crc kubenswrapper[5033]: E0226 15:42:28.923204 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:29 crc kubenswrapper[5033]: I0226 15:42:29.413475 5033 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 26 15:42:29 crc kubenswrapper[5033]: I0226 15:42:29.434901 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-06 14:08:58.040130989 +0000 UTC 
Feb 26 15:42:29 crc kubenswrapper[5033]: I0226 15:42:29.434943 5033 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7534h26m28.60519252s for next certificate rotation Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.487454 5033 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.489684 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.489876 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.490016 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.490332 5033 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.504121 5033 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.504888 5033 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 26 15:42:30 crc kubenswrapper[5033]: E0226 15:42:30.505066 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.509729 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.509785 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.509802 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.509828 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.509846 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:30Z","lastTransitionTime":"2026-02-26T15:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:30 crc kubenswrapper[5033]: E0226 15:42:30.531140 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.542972 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.543033 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.543050 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.543075 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.543093 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:30Z","lastTransitionTime":"2026-02-26T15:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:30 crc kubenswrapper[5033]: E0226 15:42:30.559302 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.569950 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.569992 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.570009 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.570034 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.570052 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:30Z","lastTransitionTime":"2026-02-26T15:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:30 crc kubenswrapper[5033]: E0226 15:42:30.585105 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.594959 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.595016 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.595039 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.595067 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:30 crc kubenswrapper[5033]: I0226 15:42:30.595093 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:30Z","lastTransitionTime":"2026-02-26T15:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:30 crc kubenswrapper[5033]: E0226 15:42:30.609956 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:30 crc kubenswrapper[5033]: E0226 15:42:30.610310 5033 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:42:30 crc kubenswrapper[5033]: E0226 15:42:30.610358 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:30 crc kubenswrapper[5033]: E0226 15:42:30.711356 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:30 crc kubenswrapper[5033]: E0226 15:42:30.812019 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:30 crc kubenswrapper[5033]: E0226 15:42:30.912933 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:31 crc kubenswrapper[5033]: E0226 15:42:31.013559 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:31 crc kubenswrapper[5033]: E0226 15:42:31.114105 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:31 crc kubenswrapper[5033]: E0226 15:42:31.214461 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:31 crc kubenswrapper[5033]: E0226 15:42:31.315576 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:31 crc kubenswrapper[5033]: E0226 15:42:31.416113 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:31 crc kubenswrapper[5033]: E0226 15:42:31.516315 5033 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:31 crc kubenswrapper[5033]: E0226 15:42:31.617253 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:31 crc kubenswrapper[5033]: E0226 15:42:31.718021 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:31 crc kubenswrapper[5033]: E0226 15:42:31.818619 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:31 crc kubenswrapper[5033]: E0226 15:42:31.918789 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:32 crc kubenswrapper[5033]: E0226 15:42:32.019370 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:32 crc kubenswrapper[5033]: E0226 15:42:32.119982 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:32 crc kubenswrapper[5033]: E0226 15:42:32.220465 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:32 crc kubenswrapper[5033]: E0226 15:42:32.321045 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:32 crc kubenswrapper[5033]: E0226 15:42:32.422092 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:32 crc kubenswrapper[5033]: E0226 15:42:32.523136 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:32 crc kubenswrapper[5033]: E0226 15:42:32.623545 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:32 crc 
kubenswrapper[5033]: E0226 15:42:32.724501 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:32 crc kubenswrapper[5033]: E0226 15:42:32.825645 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:32 crc kubenswrapper[5033]: E0226 15:42:32.926080 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:33 crc kubenswrapper[5033]: E0226 15:42:33.027349 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:33 crc kubenswrapper[5033]: E0226 15:42:33.128088 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:33 crc kubenswrapper[5033]: E0226 15:42:33.228878 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:33 crc kubenswrapper[5033]: E0226 15:42:33.329436 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:33 crc kubenswrapper[5033]: E0226 15:42:33.430483 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:33 crc kubenswrapper[5033]: E0226 15:42:33.531158 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:33 crc kubenswrapper[5033]: E0226 15:42:33.631352 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:33 crc kubenswrapper[5033]: E0226 15:42:33.726864 5033 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:42:33 crc kubenswrapper[5033]: E0226 15:42:33.731727 5033 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 26 15:42:33 crc kubenswrapper[5033]: E0226 15:42:33.832491 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:33 crc kubenswrapper[5033]: E0226 15:42:33.933510 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.034097 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.135694 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.236564 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.337107 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.437639 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.537756 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.638071 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.739215 5033 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.761416 5033 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.771705 5033 apiserver.go:52] 
"Watching apiserver" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.777002 5033 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.777312 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.777785 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.777918 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.778002 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.778107 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.778278 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.778660 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.778746 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.778660 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.779513 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.779565 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.780057 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.780412 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.780635 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.782321 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.782523 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.782653 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.783815 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.783969 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.819097 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.833324 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.841886 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.841942 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.841961 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.841985 5033 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.842003 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:34Z","lastTransitionTime":"2026-02-26T15:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.847318 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.858210 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.864780 5033 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.870984 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.890438 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.906429 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.920444 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.944898 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.944957 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.944974 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.944997 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.945014 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:34Z","lastTransitionTime":"2026-02-26T15:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.950939 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.951085 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.951530 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.951933 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.952707 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 
15:42:34.953110 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.954149 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.955082 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.955986 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.956482 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.956879 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" 
(UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.958008 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.958805 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.959491 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.960004 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.960643 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.961339 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.962678 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.963010 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.963509 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.963615 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.963988 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.964421 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.964774 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.951488 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.952219 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.952481 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.952651 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.952962 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.954098 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.955028 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.955933 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.956429 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.956830 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.957955 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.958749 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.959438 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.959940 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.960576 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.961217 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.962961 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.963284 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.963431 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.963930 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.964198 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.964363 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.964730 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.965049 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.966895 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.967606 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968045 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968442 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968498 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968519 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968541 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968560 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968579 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968598 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968613 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968980 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969068 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969113 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969143 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969179 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969210 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969237 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969322 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969348 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969383 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969409 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969437 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969463 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969492 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969524 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.967544 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.967999 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969550 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969582 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969611 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969640 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969672 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969702 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969735 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969768 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969798 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969823 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969860 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969888 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969922 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969947 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969975 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969998 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970023 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970165 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970195 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970218 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970246 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970293 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970319 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970345 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970370 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970399 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970427 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970451 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970474 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970501 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970527 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970551 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970574 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970598 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970626 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970650 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970699 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970725 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970769 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970793 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970816 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970839 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970862 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970884 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970914 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970938 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970962 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971002 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971032 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971063 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971095 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971122 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971285 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971317 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971349 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971394 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971419 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971446 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971494 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971522 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971549 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971572 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971594 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971617 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971639 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971670 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971698 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971724 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971802 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971833 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971859 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971883 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971908 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971931 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971963 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971988 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972012 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID:
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972037 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972062 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972085 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972114 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972143 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972171 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972200 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972224 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972273 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972301 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972325 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 15:42:34 crc 
kubenswrapper[5033]: I0226 15:42:34.972347 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972373 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972405 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972431 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972456 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972483 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972506 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972529 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972553 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972577 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972600 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 15:42:34 crc 
kubenswrapper[5033]: I0226 15:42:34.972621 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972648 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972675 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972696 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972721 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972746 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972774 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972808 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972833 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972861 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972885 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 
15:42:34.972907 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972928 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972958 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972981 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973006 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973032 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973062 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973088 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973112 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973138 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973162 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 26 15:42:34 crc 
kubenswrapper[5033]: I0226 15:42:34.973186 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973213 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973241 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973294 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973320 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973344 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973374 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973402 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973428 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973451 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973479 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " 
Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973504 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973527 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973551 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973577 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973602 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973627 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973663 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973690 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973715 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973744 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973777 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 15:42:34 crc 
kubenswrapper[5033]: I0226 15:42:34.968360 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973811 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973843 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974000 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974026 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974090 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974096 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974011 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968948 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974144 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974192 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974231 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974300 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974347 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974387 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974426 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974466 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974502 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974539 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974576 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974623 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974712 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974738 5033 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974760 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: 
I0226 15:42:34.974782 5033 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974806 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974830 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974851 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974871 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974892 5033 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974914 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974934 5033 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974954 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974974 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974995 5033 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975015 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975086 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975109 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975129 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc 
kubenswrapper[5033]: I0226 15:42:34.975150 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975171 5033 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975194 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975216 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975236 5033 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975280 5033 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975302 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975322 5033 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975342 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975362 5033 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975382 5033 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975402 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.976464 5033 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969170 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969359 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969392 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969524 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969654 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969710 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969835 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.969907 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970215 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970242 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970329 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970376 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.970972 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971057 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971324 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971472 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971645 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971669 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.971677 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972072 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972086 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972114 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972712 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972739 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972824 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.972983 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973248 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973358 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973522 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973702 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.973770 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974560 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974575 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974776 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975040 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975093 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975394 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975439 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975461 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975821 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.975850 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.976171 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.976189 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.968856 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.974377 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.976377 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.976516 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.976719 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.977391 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.976719 5033 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.977465 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.977499 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:35.477477088 +0000 UTC m=+82.690075738 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.976941 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.978018 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.978241 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.978463 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.979135 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.979249 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:42:35.47923075 +0000 UTC m=+82.691829410 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.979523 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.979933 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.979942 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.979954 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.980054 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.980284 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.980933 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.981585 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.981825 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.982035 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.982471 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.982494 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.982804 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.982886 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.983147 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.983180 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.983548 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.983934 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.983752 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.984121 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.984442 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.984701 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.985056 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.985360 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.985373 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.985808 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.985397 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.985897 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.985933 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.986308 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.986649 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.987431 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.988009 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.988476 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.988889 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.989043 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.989138 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:35.489111176 +0000 UTC m=+82.701709846 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.989542 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.990334 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.990560 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.991046 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.991485 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.992281 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.992337 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.984552 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.992924 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.992985 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.993426 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.993529 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.993993 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.994082 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.994679 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.994782 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.995014 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.995197 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.995697 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.996021 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.996093 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.996453 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.996496 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.996646 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.996710 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.997045 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.997580 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.997651 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.997674 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.998874 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.998982 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:42:34 crc kubenswrapper[5033]: E0226 15:42:34.999014 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.999178 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.998811 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.999354 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:34 crc kubenswrapper[5033]: I0226 15:42:34.999521 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.001508 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:35.501464322 +0000 UTC m=+82.714063172 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.001527 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.001819 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.002375 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.002461 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.002679 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.003234 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.003395 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.003795 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.003820 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.003965 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.004041 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.004196 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.004242 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.004298 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.004315 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.004323 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.004357 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.004394 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:35.504374012 +0000 UTC m=+82.716972662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.004735 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.004762 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.004857 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.005043 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.005183 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.005736 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.007072 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.008548 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.007466 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.008607 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.008765 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.008780 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.008840 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.008557 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.009017 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.009368 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.012640 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.012855 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.012909 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.013734 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.013775 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.013864 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.014416 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.014877 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.016947 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.017074 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.017216 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.017574 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.019676 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.020239 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.021018 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.034913 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.039512 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.045337 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.047541 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.047583 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.047595 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.047615 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.047631 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:35Z","lastTransitionTime":"2026-02-26T15:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076655 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076699 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076788 5033 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076807 5033 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076822 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076835 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076848 5033 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076861 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076873 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076885 5033 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076897 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076909 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076924 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076938 5033 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076950 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076963 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076976 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076989 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077001 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077012 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077023 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077034 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077048 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077031 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077060 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077147 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077173 5033 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077195 5033 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077216 5033 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.076954 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077238 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077405 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077423 5033 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077452 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077465 5033 
reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077476 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077485 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077494 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077503 5033 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077512 5033 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077522 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077548 5033 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077558 5033 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077570 5033 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077582 5033 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077593 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077604 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077614 5033 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077624 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") 
on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077633 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077643 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077653 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077662 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077686 5033 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077696 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077706 5033 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 
15:42:35.077716 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077725 5033 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077735 5033 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077744 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077754 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077764 5033 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077773 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077785 5033 
reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077795 5033 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077804 5033 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077813 5033 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077823 5033 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077831 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077840 5033 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077849 5033 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077858 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077869 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077878 5033 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077900 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077910 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077918 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077927 5033 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" 
DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077937 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077946 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077954 5033 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077963 5033 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077972 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.077983 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078010 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 
15:42:35.078020 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078031 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078039 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078048 5033 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078056 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078065 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078073 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078081 5033 reconciler_common.go:293] "Volume detached for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078091 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078099 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078108 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078117 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078126 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078135 5033 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078145 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078153 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078162 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078172 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078181 5033 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078191 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078201 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078210 5033 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 
26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078219 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078230 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078239 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078248 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078287 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078296 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078306 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078314 5033 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078321 5033 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078330 5033 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078339 5033 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078348 5033 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078355 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078363 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078371 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078378 5033 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078387 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078396 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078403 5033 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078411 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078419 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078428 5033 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc 
kubenswrapper[5033]: I0226 15:42:35.078436 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078444 5033 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078453 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078461 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078468 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078477 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078486 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078495 5033 reconciler_common.go:293] "Volume detached 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078503 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078512 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078520 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078529 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078539 5033 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078547 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078556 5033 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078565 5033 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078574 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078582 5033 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078593 5033 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078601 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078610 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078618 5033 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on 
node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078659 5033 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078696 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078706 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078714 5033 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078724 5033 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078735 5033 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078748 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078759 5033 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078771 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078779 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078790 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078798 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078807 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078815 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078824 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" 
(UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078832 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.078842 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.098317 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.107495 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.114012 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.121250 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 15:42:35 crc kubenswrapper[5033]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 26 15:42:35 crc kubenswrapper[5033]: set -o allexport Feb 26 15:42:35 crc kubenswrapper[5033]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 26 15:42:35 crc kubenswrapper[5033]: source /etc/kubernetes/apiserver-url.env Feb 26 15:42:35 crc kubenswrapper[5033]: else Feb 26 15:42:35 crc kubenswrapper[5033]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 26 15:42:35 crc kubenswrapper[5033]: exit 1 Feb 26 15:42:35 crc kubenswrapper[5033]: fi Feb 26 15:42:35 crc kubenswrapper[5033]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 26 15:42:35 crc kubenswrapper[5033]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 26 15:42:35 crc kubenswrapper[5033]: > logger="UnhandledError" Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.122559 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 26 15:42:35 crc kubenswrapper[5033]: W0226 15:42:35.132288 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e378f8f3fc84fab1810715b2f03a1a17070bffa9f0aaa0c4460b1ca691707d66 WatchSource:0}: Error finding container e378f8f3fc84fab1810715b2f03a1a17070bffa9f0aaa0c4460b1ca691707d66: Status 404 returned error can't find the container with id e378f8f3fc84fab1810715b2f03a1a17070bffa9f0aaa0c4460b1ca691707d66 Feb 26 15:42:35 crc kubenswrapper[5033]: W0226 15:42:35.133427 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d34d5a4d324a7d8e04489b738a489228c608fe2af1c581545c3de4da758efd1f WatchSource:0}: Error finding container d34d5a4d324a7d8e04489b738a489228c608fe2af1c581545c3de4da758efd1f: Status 404 returned error can't find the container with id d34d5a4d324a7d8e04489b738a489228c608fe2af1c581545c3de4da758efd1f Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.136250 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 15:42:35 crc kubenswrapper[5033]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 26 15:42:35 crc kubenswrapper[5033]: if [[ -f "/env/_master" ]]; then Feb 26 15:42:35 crc kubenswrapper[5033]: set -o allexport Feb 26 15:42:35 crc kubenswrapper[5033]: source "/env/_master" Feb 26 15:42:35 crc kubenswrapper[5033]: set +o allexport Feb 26 15:42:35 crc kubenswrapper[5033]: fi Feb 26 15:42:35 crc kubenswrapper[5033]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Feb 26 15:42:35 crc kubenswrapper[5033]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 26 15:42:35 crc kubenswrapper[5033]: ho_enable="--enable-hybrid-overlay" Feb 26 15:42:35 crc kubenswrapper[5033]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 26 15:42:35 crc kubenswrapper[5033]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 26 15:42:35 crc kubenswrapper[5033]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 26 15:42:35 crc kubenswrapper[5033]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 26 15:42:35 crc kubenswrapper[5033]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 26 15:42:35 crc kubenswrapper[5033]: --webhook-host=127.0.0.1 \ Feb 26 15:42:35 crc kubenswrapper[5033]: --webhook-port=9743 \ Feb 26 15:42:35 crc kubenswrapper[5033]: ${ho_enable} \ Feb 26 15:42:35 crc kubenswrapper[5033]: --enable-interconnect \ Feb 26 15:42:35 crc kubenswrapper[5033]: --disable-approver \ Feb 26 15:42:35 crc kubenswrapper[5033]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 26 15:42:35 crc kubenswrapper[5033]: --wait-for-kubernetes-api=200s \ Feb 26 15:42:35 crc kubenswrapper[5033]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 26 15:42:35 crc kubenswrapper[5033]: --loglevel="${LOGLEVEL}" Feb 26 15:42:35 crc kubenswrapper[5033]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 26 15:42:35 crc kubenswrapper[5033]: > logger="UnhandledError" Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.136631 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.137817 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.139660 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 15:42:35 crc kubenswrapper[5033]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 26 15:42:35 crc kubenswrapper[5033]: if [[ -f "/env/_master" ]]; then Feb 26 15:42:35 crc kubenswrapper[5033]: set -o allexport Feb 26 15:42:35 crc kubenswrapper[5033]: source "/env/_master" Feb 26 15:42:35 crc kubenswrapper[5033]: set +o allexport Feb 26 15:42:35 crc kubenswrapper[5033]: fi Feb 26 15:42:35 crc kubenswrapper[5033]: Feb 26 15:42:35 crc kubenswrapper[5033]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 26 15:42:35 crc kubenswrapper[5033]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 26 15:42:35 crc kubenswrapper[5033]: --disable-webhook \ Feb 26 15:42:35 crc kubenswrapper[5033]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 26 15:42:35 crc kubenswrapper[5033]: --loglevel="${LOGLEVEL}" Feb 26 15:42:35 crc kubenswrapper[5033]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 26 15:42:35 crc kubenswrapper[5033]: > logger="UnhandledError" Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.140878 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.149549 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.149883 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.150067 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.150227 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.150402 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:35Z","lastTransitionTime":"2026-02-26T15:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.252756 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.252839 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.252856 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.252877 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.252893 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:35Z","lastTransitionTime":"2026-02-26T15:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.356005 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.356083 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.356108 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.356136 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.356158 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:35Z","lastTransitionTime":"2026-02-26T15:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.459562 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.459617 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.459630 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.459648 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.459660 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:35Z","lastTransitionTime":"2026-02-26T15:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.484081 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.484223 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.484378 5033 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.484387 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:42:36.484329488 +0000 UTC m=+83.696928178 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.484469 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:36.484430341 +0000 UTC m=+83.697029031 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.563210 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.563301 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.563319 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.563349 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.563365 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:35Z","lastTransitionTime":"2026-02-26T15:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.585659 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.585707 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.585721 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.585784 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:36.585767638 +0000 UTC m=+83.798366288 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.585442 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.586198 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.586348 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.586396 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.586407 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.586478 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.586542 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:36.586500746 +0000 UTC m=+83.799099396 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.586620 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 26 15:42:35 crc kubenswrapper[5033]: E0226 15:42:35.586730 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:36.58670063 +0000 UTC m=+83.799299320 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.632059 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.633117 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.635558 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.636788 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.638796 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.639941 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.641177 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.643202 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.644640 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.646612 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.647655 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.649913 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.650900 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.651948 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.653956 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.655062 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.657061 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.658336 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.659826 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.662153 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.663345 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.665856 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.666794 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.666825 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.666833 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.666846 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.666858 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:35Z","lastTransitionTime":"2026-02-26T15:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.667659 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.670136 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.671139 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.672538 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.675007 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.676149 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.678425 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.679530 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.681492 5033 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.681876 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.685683 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.687399 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.687854 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.689394 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.690030 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.690948 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.691555 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.692626 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.693058 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.694048 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.694684 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.695618 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.696072 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.696939 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.697489 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.698625 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.699156 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.700120 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.700595 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.701534 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.702124 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.702620 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.768916 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.768994 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.769014 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.769039 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.769057 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:35Z","lastTransitionTime":"2026-02-26T15:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.872136 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.872185 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.872202 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.872219 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.872231 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:35Z","lastTransitionTime":"2026-02-26T15:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.974303 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.974383 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.974405 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.974438 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:35 crc kubenswrapper[5033]: I0226 15:42:35.974461 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:35Z","lastTransitionTime":"2026-02-26T15:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.077370 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.077445 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.077457 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.077472 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.077483 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:36Z","lastTransitionTime":"2026-02-26T15:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.104045 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"55e0bc43daa4b715f5188cbd8d0b72e9094994821595c6debd358a7f9b2061f5"}
Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.105985 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d34d5a4d324a7d8e04489b738a489228c608fe2af1c581545c3de4da758efd1f"}
Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.106739 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 26 15:42:36 crc kubenswrapper[5033]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash
Feb 26 15:42:36 crc kubenswrapper[5033]: set -o allexport
Feb 26 15:42:36 crc kubenswrapper[5033]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Feb 26 15:42:36 crc kubenswrapper[5033]: source /etc/kubernetes/apiserver-url.env
Feb 26 15:42:36 crc kubenswrapper[5033]: else
Feb 26 15:42:36 crc kubenswrapper[5033]: echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Feb 26 15:42:36 crc kubenswrapper[5033]: exit 1
Feb 26 15:42:36 crc kubenswrapper[5033]: fi
Feb 26 15:42:36 crc kubenswrapper[5033]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Feb 26 15:42:36 crc kubenswrapper[5033]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 26 15:42:36 crc kubenswrapper[5033]: > logger="UnhandledError" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.107179 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e378f8f3fc84fab1810715b2f03a1a17070bffa9f0aaa0c4460b1ca691707d66"} Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.108020 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.108166 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.109248 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 15:42:36 crc kubenswrapper[5033]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 26 15:42:36 crc kubenswrapper[5033]: if [[ -f "/env/_master" ]]; then Feb 26 15:42:36 crc kubenswrapper[5033]: set -o allexport Feb 26 15:42:36 crc kubenswrapper[5033]: source "/env/_master" Feb 26 15:42:36 crc kubenswrapper[5033]: set +o allexport Feb 26 15:42:36 crc kubenswrapper[5033]: fi Feb 26 15:42:36 crc kubenswrapper[5033]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Feb 26 15:42:36 crc kubenswrapper[5033]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 26 15:42:36 crc kubenswrapper[5033]: ho_enable="--enable-hybrid-overlay" Feb 26 15:42:36 crc kubenswrapper[5033]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 26 15:42:36 crc kubenswrapper[5033]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 26 15:42:36 crc kubenswrapper[5033]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 26 15:42:36 crc kubenswrapper[5033]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 26 15:42:36 crc kubenswrapper[5033]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 26 15:42:36 crc kubenswrapper[5033]: --webhook-host=127.0.0.1 \ Feb 26 15:42:36 crc kubenswrapper[5033]: --webhook-port=9743 \ Feb 26 15:42:36 crc kubenswrapper[5033]: ${ho_enable} \ Feb 26 15:42:36 crc kubenswrapper[5033]: --enable-interconnect \ Feb 26 15:42:36 crc kubenswrapper[5033]: --disable-approver \ Feb 26 15:42:36 crc kubenswrapper[5033]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 26 15:42:36 crc kubenswrapper[5033]: --wait-for-kubernetes-api=200s \ Feb 26 15:42:36 crc kubenswrapper[5033]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 26 15:42:36 crc kubenswrapper[5033]: --loglevel="${LOGLEVEL}" Feb 26 15:42:36 crc kubenswrapper[5033]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 26 15:42:36 crc kubenswrapper[5033]: > logger="UnhandledError" Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.109322 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" 
podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.111551 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 15:42:36 crc kubenswrapper[5033]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 26 15:42:36 crc kubenswrapper[5033]: if [[ -f "/env/_master" ]]; then Feb 26 15:42:36 crc kubenswrapper[5033]: set -o allexport Feb 26 15:42:36 crc kubenswrapper[5033]: source "/env/_master" Feb 26 15:42:36 crc kubenswrapper[5033]: set +o allexport Feb 26 15:42:36 crc kubenswrapper[5033]: fi Feb 26 15:42:36 crc kubenswrapper[5033]: Feb 26 15:42:36 crc kubenswrapper[5033]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 26 15:42:36 crc kubenswrapper[5033]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 26 15:42:36 crc kubenswrapper[5033]: --disable-webhook \ Feb 26 15:42:36 crc kubenswrapper[5033]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 26 15:42:36 crc kubenswrapper[5033]: --loglevel="${LOGLEVEL}" Feb 26 15:42:36 crc kubenswrapper[5033]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 26 15:42:36 crc kubenswrapper[5033]: > logger="UnhandledError" Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.112667 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.121223 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.137735 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.153551 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.168250 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.180308 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.180503 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.180705 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.180906 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.181101 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:36Z","lastTransitionTime":"2026-02-26T15:42:36Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.184906 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.199663 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.215820 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.230640 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.246457 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.261060 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.277682 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.283958 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.284145 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.284341 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.284642 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.284784 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:36Z","lastTransitionTime":"2026-02-26T15:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.293873 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.312422 5033 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.387664 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.387703 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.387718 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.387735 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:36 crc kubenswrapper[5033]: 
I0226 15:42:36.387748 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:36Z","lastTransitionTime":"2026-02-26T15:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.490513 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.490824 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.490911 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.490988 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.491069 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:36Z","lastTransitionTime":"2026-02-26T15:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.495979 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.496081 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.496323 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:42:38.496291037 +0000 UTC m=+85.708889707 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.496373 5033 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.496532 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:38.496520283 +0000 UTC m=+85.709118933 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.594016 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.594094 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.594114 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.594137 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.594154 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:36Z","lastTransitionTime":"2026-02-26T15:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.596560 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.596621 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.596673 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.596684 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.596772 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:38.596747543 +0000 UTC m=+85.809346223 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.596815 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.596840 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.596859 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.596920 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:38.596900017 +0000 UTC m=+85.809498697 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.596998 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.597015 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.597028 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.597067 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:38.597054841 +0000 UTC m=+85.809653531 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.627446 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.627474 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.627469 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.627653 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.627703 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:36 crc kubenswrapper[5033]: E0226 15:42:36.627781 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.697375 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.697436 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.697454 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.697478 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.697496 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:36Z","lastTransitionTime":"2026-02-26T15:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.800869 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.800924 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.800944 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.800971 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.801030 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:36Z","lastTransitionTime":"2026-02-26T15:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.903540 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.903584 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.903597 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.903614 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:36 crc kubenswrapper[5033]: I0226 15:42:36.903626 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:36Z","lastTransitionTime":"2026-02-26T15:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.006669 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.006964 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.007078 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.007186 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.007307 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:37Z","lastTransitionTime":"2026-02-26T15:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.111068 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.111133 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.111157 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.111187 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.111209 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:37Z","lastTransitionTime":"2026-02-26T15:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.213833 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.213879 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.213890 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.213908 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.213920 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:37Z","lastTransitionTime":"2026-02-26T15:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.316739 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.316784 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.316795 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.316810 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.316853 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:37Z","lastTransitionTime":"2026-02-26T15:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.419497 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.419571 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.419595 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.419624 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.419641 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:37Z","lastTransitionTime":"2026-02-26T15:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.522706 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.523230 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.523463 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.523622 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.523777 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:37Z","lastTransitionTime":"2026-02-26T15:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.627481 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.627557 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.627577 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.627603 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.627618 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:37Z","lastTransitionTime":"2026-02-26T15:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.731305 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.731374 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.731389 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.731412 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.731428 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:37Z","lastTransitionTime":"2026-02-26T15:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.835027 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.835089 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.835108 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.835136 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.835153 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:37Z","lastTransitionTime":"2026-02-26T15:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.938399 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.938692 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.938798 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.938895 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:37 crc kubenswrapper[5033]: I0226 15:42:37.938977 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:37Z","lastTransitionTime":"2026-02-26T15:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.041858 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.041932 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.041952 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.041979 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.041998 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:38Z","lastTransitionTime":"2026-02-26T15:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.145320 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.145401 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.145425 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.145460 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.145484 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:38Z","lastTransitionTime":"2026-02-26T15:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.248488 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.248556 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.248581 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.248611 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.248635 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:38Z","lastTransitionTime":"2026-02-26T15:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.351412 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.351790 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.351917 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.352050 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.352246 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:38Z","lastTransitionTime":"2026-02-26T15:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.455738 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.456458 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.456508 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.456542 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.456568 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:38Z","lastTransitionTime":"2026-02-26T15:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.514477 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.514640 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.514685 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:42:42.514656043 +0000 UTC m=+89.727254723 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.514775 5033 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.514852 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:42.514833077 +0000 UTC m=+89.727431767 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.559012 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.559103 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.559122 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.559154 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.559178 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:38Z","lastTransitionTime":"2026-02-26T15:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.615884 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.615936 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.615980 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.616045 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.616099 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:42.616086862 +0000 UTC m=+89.828685512 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.616162 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.616220 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.616233 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.616166 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.616310 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:42.616293868 +0000 UTC m=+89.828892518 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.616328 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.616349 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.617585 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:42.617554668 +0000 UTC m=+89.830153348 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.627241 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.627513 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.627727 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.627754 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.627866 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:38 crc kubenswrapper[5033]: E0226 15:42:38.628073 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.661554 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.661611 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.661627 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.661645 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.661656 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:38Z","lastTransitionTime":"2026-02-26T15:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.764920 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.764994 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.765016 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.765044 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.765065 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:38Z","lastTransitionTime":"2026-02-26T15:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.867833 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.867874 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.867886 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.867902 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.867911 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:38Z","lastTransitionTime":"2026-02-26T15:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.971140 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.971581 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.971840 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.972043 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:38 crc kubenswrapper[5033]: I0226 15:42:38.972215 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:38Z","lastTransitionTime":"2026-02-26T15:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.075905 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.076005 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.076030 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.076055 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.076073 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:39Z","lastTransitionTime":"2026-02-26T15:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.179456 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.179533 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.179553 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.179579 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.179597 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:39Z","lastTransitionTime":"2026-02-26T15:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.281388 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.281461 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.281473 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.281519 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.281532 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:39Z","lastTransitionTime":"2026-02-26T15:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.384655 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.384743 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.384769 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.384801 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.384823 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:39Z","lastTransitionTime":"2026-02-26T15:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.487532 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.487572 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.487580 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.487596 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.487609 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:39Z","lastTransitionTime":"2026-02-26T15:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.590659 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.590724 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.590740 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.590765 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.590781 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:39Z","lastTransitionTime":"2026-02-26T15:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.693687 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.693775 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.693792 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.693843 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.693863 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:39Z","lastTransitionTime":"2026-02-26T15:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.796373 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.796417 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.796427 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.796442 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.796453 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:39Z","lastTransitionTime":"2026-02-26T15:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.899578 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.899638 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.899656 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.899681 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:39 crc kubenswrapper[5033]: I0226 15:42:39.899701 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:39Z","lastTransitionTime":"2026-02-26T15:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.002552 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.002618 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.002632 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.002648 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.002659 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.105067 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.105114 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.105130 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.105150 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.105167 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.207307 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.207377 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.207399 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.207428 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.207446 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.310684 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.310738 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.310749 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.310766 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.310782 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.414435 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.414487 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.414509 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.414539 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.414561 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.517960 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.518021 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.518036 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.518060 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.518072 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.622217 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.622290 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.622305 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.622333 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.622348 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.627830 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.628018 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:40 crc kubenswrapper[5033]: E0226 15:42:40.628178 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:40 crc kubenswrapper[5033]: E0226 15:42:40.628700 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.629094 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:40 crc kubenswrapper[5033]: E0226 15:42:40.629312 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.645174 5033 scope.go:117] "RemoveContainer" containerID="3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b" Feb 26 15:42:40 crc kubenswrapper[5033]: E0226 15:42:40.645525 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.647511 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.725791 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.726163 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.726218 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.726251 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.726693 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.829173 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.829220 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.829234 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.829252 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.829291 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.897512 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.897548 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.897557 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.897573 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.897584 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: E0226 15:42:40.908521 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.912344 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.912390 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.912400 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.912416 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.912427 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: E0226 15:42:40.922286 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.925846 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.925872 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.925882 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.925896 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.925907 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: E0226 15:42:40.964948 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.969216 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.969249 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.969306 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.969321 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.969332 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:40 crc kubenswrapper[5033]: E0226 15:42:40.985637 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.990041 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.990089 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.990107 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.990132 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:40 crc kubenswrapper[5033]: I0226 15:42:40.990149 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:40Z","lastTransitionTime":"2026-02-26T15:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:41 crc kubenswrapper[5033]: E0226 15:42:41.003836 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:41 crc kubenswrapper[5033]: E0226 15:42:41.003947 5033 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.005225 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.005316 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.005335 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.005356 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.005380 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:41Z","lastTransitionTime":"2026-02-26T15:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.107013 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.107048 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.107060 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.107075 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.107086 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:41Z","lastTransitionTime":"2026-02-26T15:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.120726 5033 scope.go:117] "RemoveContainer" containerID="3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b" Feb 26 15:42:41 crc kubenswrapper[5033]: E0226 15:42:41.120854 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.209534 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.209598 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.209608 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.209621 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.209630 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:41Z","lastTransitionTime":"2026-02-26T15:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.312533 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.312568 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.312578 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.312596 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.312607 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:41Z","lastTransitionTime":"2026-02-26T15:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.415800 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.415923 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.415945 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.415973 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.415993 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:41Z","lastTransitionTime":"2026-02-26T15:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.518076 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.518121 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.518140 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.518162 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.518180 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:41Z","lastTransitionTime":"2026-02-26T15:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.621491 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.621574 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.621595 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.621653 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.621671 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:41Z","lastTransitionTime":"2026-02-26T15:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.724845 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.724905 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.724923 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.724947 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.724964 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:41Z","lastTransitionTime":"2026-02-26T15:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.827547 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.827580 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.827591 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.827608 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.827618 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:41Z","lastTransitionTime":"2026-02-26T15:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.929984 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.930097 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.930122 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.930150 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:41 crc kubenswrapper[5033]: I0226 15:42:41.930173 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:41Z","lastTransitionTime":"2026-02-26T15:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.032417 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.032466 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.032475 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.032490 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.032498 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:42Z","lastTransitionTime":"2026-02-26T15:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.033949 5033 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.134837 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.134904 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.134925 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.134947 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.134964 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:42Z","lastTransitionTime":"2026-02-26T15:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.237932 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.237977 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.237988 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.238003 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.238017 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:42Z","lastTransitionTime":"2026-02-26T15:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.339971 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.340006 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.340014 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.340028 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.340036 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:42Z","lastTransitionTime":"2026-02-26T15:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.442513 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.442549 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.442560 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.442575 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.442585 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:42Z","lastTransitionTime":"2026-02-26T15:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.544869 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.544920 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.544932 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.544949 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.544962 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:42Z","lastTransitionTime":"2026-02-26T15:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.549415 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.549518 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.549590 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:42:50.54956707 +0000 UTC m=+97.762165790 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.549737 5033 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.549853 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:50.549824946 +0000 UTC m=+97.762423626 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.627623 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.627670 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.627700 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.627798 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.627950 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.628072 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.648612 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.648732 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.648753 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.648777 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.648794 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:42Z","lastTransitionTime":"2026-02-26T15:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.650151 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.650218 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.650286 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.650450 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.650476 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.650494 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.650559 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:50.650538399 +0000 UTC m=+97.863137089 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.650694 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.650727 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.650745 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.650809 5033 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:50.650787054 +0000 UTC m=+97.863385744 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.650869 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:42:42 crc kubenswrapper[5033]: E0226 15:42:42.650911 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:42:50.650897927 +0000 UTC m=+97.863496607 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.752106 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.752193 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.752213 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.752235 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.752253 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:42Z","lastTransitionTime":"2026-02-26T15:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.855537 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.855600 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.855619 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.855643 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.855661 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:42Z","lastTransitionTime":"2026-02-26T15:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.957532 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.957597 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.957618 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.957648 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:42 crc kubenswrapper[5033]: I0226 15:42:42.957671 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:42Z","lastTransitionTime":"2026-02-26T15:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.059726 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.059767 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.059780 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.059798 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.059810 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:43Z","lastTransitionTime":"2026-02-26T15:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.162101 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.162144 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.162156 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.162172 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.162188 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:43Z","lastTransitionTime":"2026-02-26T15:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.264105 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.264156 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.264168 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.264187 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.264198 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:43Z","lastTransitionTime":"2026-02-26T15:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.367096 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.367179 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.367212 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.367246 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.367557 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:43Z","lastTransitionTime":"2026-02-26T15:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.471125 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.471172 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.471182 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.471196 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.471210 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:43Z","lastTransitionTime":"2026-02-26T15:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.573768 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.573818 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.573831 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.573854 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.573867 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:43Z","lastTransitionTime":"2026-02-26T15:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.642748 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.655133 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.670619 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.676932 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.676991 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.677004 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.677019 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.677032 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:43Z","lastTransitionTime":"2026-02-26T15:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.682781 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.694463 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.708685 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.724044 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127cc120-a042-4dd7-8513-f6ee67014a46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:42:25.355394 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:42:25.355540 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:42:25.356824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1398027493/tls.crt::/tmp/serving-cert-1398027493/tls.key\\\\\\\"\\\\nI0226 15:42:25.948466 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:42:25.950731 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:42:25.950748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:42:25.950769 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:42:25.950774 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:42:25.958513 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 15:42:25.958527 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 15:42:25.958563 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958574 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958583 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:42:25.958589 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:42:25.958595 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:42:25.958600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 15:42:25.960644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d9
4728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.778682 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.778731 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.778748 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.778771 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.778793 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:43Z","lastTransitionTime":"2026-02-26T15:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.881111 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.881936 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.882069 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.882208 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.882370 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:43Z","lastTransitionTime":"2026-02-26T15:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.985756 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.985805 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.985821 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.985844 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:43 crc kubenswrapper[5033]: I0226 15:42:43.985861 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:43Z","lastTransitionTime":"2026-02-26T15:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.087964 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.088004 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.088017 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.088031 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.088042 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:44Z","lastTransitionTime":"2026-02-26T15:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.190691 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.190732 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.190744 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.190759 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.190771 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:44Z","lastTransitionTime":"2026-02-26T15:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.293655 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.293705 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.293721 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.293745 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.293761 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:44Z","lastTransitionTime":"2026-02-26T15:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.396476 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.396538 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.396562 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.396587 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.396604 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:44Z","lastTransitionTime":"2026-02-26T15:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.499667 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.499748 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.499771 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.499801 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.499822 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:44Z","lastTransitionTime":"2026-02-26T15:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.602904 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.602947 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.602962 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.602979 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.602990 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:44Z","lastTransitionTime":"2026-02-26T15:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.627646 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.627752 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:44 crc kubenswrapper[5033]: E0226 15:42:44.627925 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.628043 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:44 crc kubenswrapper[5033]: E0226 15:42:44.628120 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:44 crc kubenswrapper[5033]: E0226 15:42:44.628326 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.705642 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.705729 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.705756 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.705794 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.705816 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:44Z","lastTransitionTime":"2026-02-26T15:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.808596 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.808704 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.808724 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.808756 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.808777 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:44Z","lastTransitionTime":"2026-02-26T15:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.911665 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.911767 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.911778 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.911800 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:44 crc kubenswrapper[5033]: I0226 15:42:44.911814 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:44Z","lastTransitionTime":"2026-02-26T15:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.014395 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.014455 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.014465 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.014483 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.014494 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:45Z","lastTransitionTime":"2026-02-26T15:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.117092 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.117166 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.117188 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.117215 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.117237 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:45Z","lastTransitionTime":"2026-02-26T15:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.219754 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.219823 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.219839 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.219863 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.219882 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:45Z","lastTransitionTime":"2026-02-26T15:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.322336 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.322719 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.322739 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.322762 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.322779 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:45Z","lastTransitionTime":"2026-02-26T15:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.425781 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.425816 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.425826 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.425842 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.425854 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:45Z","lastTransitionTime":"2026-02-26T15:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.529774 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.529856 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.529877 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.529904 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.529932 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:45Z","lastTransitionTime":"2026-02-26T15:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.632009 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.632056 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.632073 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.632095 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.632111 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:45Z","lastTransitionTime":"2026-02-26T15:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.735664 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.735791 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.735818 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.735852 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.735978 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:45Z","lastTransitionTime":"2026-02-26T15:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.838690 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.838766 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.838788 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.838816 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.838839 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:45Z","lastTransitionTime":"2026-02-26T15:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.941648 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.941699 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.941715 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.941739 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:45 crc kubenswrapper[5033]: I0226 15:42:45.941755 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:45Z","lastTransitionTime":"2026-02-26T15:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.045397 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.045456 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.045476 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.045508 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.045532 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:46Z","lastTransitionTime":"2026-02-26T15:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.148353 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.148413 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.148429 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.148456 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.148473 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:46Z","lastTransitionTime":"2026-02-26T15:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.251568 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.251661 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.251685 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.251715 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.251735 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:46Z","lastTransitionTime":"2026-02-26T15:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.354747 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.354834 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.354864 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.354889 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.354906 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:46Z","lastTransitionTime":"2026-02-26T15:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.457613 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.457673 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.457690 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.457714 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.457730 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:46Z","lastTransitionTime":"2026-02-26T15:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.560640 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.560707 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.560731 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.560759 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.560782 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:46Z","lastTransitionTime":"2026-02-26T15:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.627178 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.627200 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.627312 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 15:42:46 crc kubenswrapper[5033]: E0226 15:42:46.627511 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 15:42:46 crc kubenswrapper[5033]: E0226 15:42:46.627659 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 15:42:46 crc kubenswrapper[5033]: E0226 15:42:46.627795 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.662942 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.662996 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.663020 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.663047 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.663068 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:46Z","lastTransitionTime":"2026-02-26T15:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.765824 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.765884 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.765904 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.765931 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.765951 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:46Z","lastTransitionTime":"2026-02-26T15:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.869028 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.869088 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.869105 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.869128 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.869144 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:46Z","lastTransitionTime":"2026-02-26T15:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.971688 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.971724 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.971735 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.971751 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:46 crc kubenswrapper[5033]: I0226 15:42:46.971761 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:46Z","lastTransitionTime":"2026-02-26T15:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.074484 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.074551 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.074569 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.074594 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.074611 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:47Z","lastTransitionTime":"2026-02-26T15:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.177911 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.178001 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.178024 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.178053 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.178081 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:47Z","lastTransitionTime":"2026-02-26T15:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.281541 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.281602 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.281620 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.281644 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.281662 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:47Z","lastTransitionTime":"2026-02-26T15:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.384590 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.384653 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.384688 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.384726 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.384749 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:47Z","lastTransitionTime":"2026-02-26T15:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.488243 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.488359 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.488382 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.488411 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.488431 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:47Z","lastTransitionTime":"2026-02-26T15:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.591146 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.591185 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.591198 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.591214 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.591227 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:47Z","lastTransitionTime":"2026-02-26T15:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.693915 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.693994 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.694019 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.694054 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.694075 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:47Z","lastTransitionTime":"2026-02-26T15:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.796288 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.796340 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.796357 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.796382 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.796399 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:47Z","lastTransitionTime":"2026-02-26T15:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.899228 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.899299 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.899312 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.899329 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:47 crc kubenswrapper[5033]: I0226 15:42:47.899345 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:47Z","lastTransitionTime":"2026-02-26T15:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.001455 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.001493 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.001501 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.001513 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.001521 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:48Z","lastTransitionTime":"2026-02-26T15:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.103637 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.103665 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.103673 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.103684 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.103694 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:48Z","lastTransitionTime":"2026-02-26T15:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.139745 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"748cc274375fda9c75beb1d4155d07ba9b488c176d3f3c8bf69c667511898b12"}
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.152099 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.165841 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748cc274375fda9c75beb1d4155d07ba9b488c176d3f3c8bf69c667511898b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.177472 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.189985 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.203354 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.206239 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.206355 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.206399 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.206425 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.206444 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:48Z","lastTransitionTime":"2026-02-26T15:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.216006 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.228881 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127cc120-a042-4dd7-8513-f6ee67014a46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:42:25.355394 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:42:25.355540 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:42:25.356824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1398027493/tls.crt::/tmp/serving-cert-1398027493/tls.key\\\\\\\"\\\\nI0226 15:42:25.948466 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:42:25.950731 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:42:25.950748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:42:25.950769 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:42:25.950774 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:42:25.958513 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 15:42:25.958527 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 15:42:25.958563 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958574 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958583 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:42:25.958589 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:42:25.958595 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:42:25.958600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 15:42:25.960644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d9
4728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.308745 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.308787 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.308796 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.308815 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.308825 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:48Z","lastTransitionTime":"2026-02-26T15:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.411288 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.411329 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.411340 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.411356 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.411367 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:48Z","lastTransitionTime":"2026-02-26T15:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.512984 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.513023 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.513035 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.513050 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.513062 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:48Z","lastTransitionTime":"2026-02-26T15:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.619766 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.619817 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.619831 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.619852 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.619867 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:48Z","lastTransitionTime":"2026-02-26T15:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.627670 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.627685 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.627937 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:48 crc kubenswrapper[5033]: E0226 15:42:48.627923 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:48 crc kubenswrapper[5033]: E0226 15:42:48.628101 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:48 crc kubenswrapper[5033]: E0226 15:42:48.628182 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.645134 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.722999 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.723033 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.723043 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.723061 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.723072 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:48Z","lastTransitionTime":"2026-02-26T15:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.826029 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.826089 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.826113 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.826145 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.826204 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:48Z","lastTransitionTime":"2026-02-26T15:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.928450 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.928497 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.928511 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.928527 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:48 crc kubenswrapper[5033]: I0226 15:42:48.928537 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:48Z","lastTransitionTime":"2026-02-26T15:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.031784 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.031852 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.031865 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.031885 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.032229 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:49Z","lastTransitionTime":"2026-02-26T15:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.135199 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.135248 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.135284 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.135302 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.135316 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:49Z","lastTransitionTime":"2026-02-26T15:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.238319 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.238371 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.238379 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.238393 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.238403 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:49Z","lastTransitionTime":"2026-02-26T15:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.340629 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.340676 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.340687 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.340708 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.340721 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:49Z","lastTransitionTime":"2026-02-26T15:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.442937 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.442978 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.442987 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.443001 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.443011 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:49Z","lastTransitionTime":"2026-02-26T15:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.488110 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-n59mx"] Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.488472 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-n59mx" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.490820 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.490896 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.490821 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.502146 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748cc274375fda9c75beb1d4155d07ba9b488c176d3f3c8bf69c667511898b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.512290 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.524923 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.534282 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.543388 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-n59mx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6e80f78-4fca-4d88-b133-4da7db46c9b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdd7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-n59mx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.544697 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.544722 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.544730 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.544750 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 
15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.544759 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:49Z","lastTransitionTime":"2026-02-26T15:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.565285 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"050af11c-f014-4b1e-b033-63526858a53e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f55baa5845d1d93e807193c728c2adc29ac1396a864f97c8445094ba944820e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0737283a2d944dc412534ebe53bc180bf3005fb66e164e72abb7113a86452c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236bd9859aa14e2adf418323c049fdd9ba63540adc8d44feab7db2deb72af262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2be63f3aeb64760fc286645929a8863c2794930849e4d662a65a0ec04aee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c5f32b45ecafcde39a72bf4595128654203e513cb5660bdf8c5536dd0dd6fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSt
atuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.582483 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127cc120-a042-4dd7-8513-f6ee67014a46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:42:25.355394 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:42:25.355540 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:42:25.356824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1398027493/tls.crt::/tmp/serving-cert-1398027493/tls.key\\\\\\\"\\\\nI0226 15:42:25.948466 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:42:25.950731 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:42:25.950748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:42:25.950769 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:42:25.950774 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:42:25.958513 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 15:42:25.958527 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 15:42:25.958563 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958574 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958583 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:42:25.958589 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:42:25.958595 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:42:25.958600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 15:42:25.960644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d9
4728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.593524 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.603344 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.618270 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6e80f78-4fca-4d88-b133-4da7db46c9b6-hosts-file\") pod \"node-resolver-n59mx\" (UID: \"b6e80f78-4fca-4d88-b133-4da7db46c9b6\") " pod="openshift-dns/node-resolver-n59mx" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.618316 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdd7s\" (UniqueName: \"kubernetes.io/projected/b6e80f78-4fca-4d88-b133-4da7db46c9b6-kube-api-access-bdd7s\") pod \"node-resolver-n59mx\" (UID: \"b6e80f78-4fca-4d88-b133-4da7db46c9b6\") " pod="openshift-dns/node-resolver-n59mx" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.646729 
5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.646774 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.646784 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.646798 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.646807 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:49Z","lastTransitionTime":"2026-02-26T15:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.718990 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6e80f78-4fca-4d88-b133-4da7db46c9b6-hosts-file\") pod \"node-resolver-n59mx\" (UID: \"b6e80f78-4fca-4d88-b133-4da7db46c9b6\") " pod="openshift-dns/node-resolver-n59mx" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.719035 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdd7s\" (UniqueName: \"kubernetes.io/projected/b6e80f78-4fca-4d88-b133-4da7db46c9b6-kube-api-access-bdd7s\") pod \"node-resolver-n59mx\" (UID: \"b6e80f78-4fca-4d88-b133-4da7db46c9b6\") " pod="openshift-dns/node-resolver-n59mx" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.719138 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6e80f78-4fca-4d88-b133-4da7db46c9b6-hosts-file\") pod \"node-resolver-n59mx\" (UID: \"b6e80f78-4fca-4d88-b133-4da7db46c9b6\") " pod="openshift-dns/node-resolver-n59mx" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.735887 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdd7s\" (UniqueName: \"kubernetes.io/projected/b6e80f78-4fca-4d88-b133-4da7db46c9b6-kube-api-access-bdd7s\") pod \"node-resolver-n59mx\" (UID: \"b6e80f78-4fca-4d88-b133-4da7db46c9b6\") " pod="openshift-dns/node-resolver-n59mx" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.748846 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.748910 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.748927 5033 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.748950 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.748967 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:49Z","lastTransitionTime":"2026-02-26T15:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.809997 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-n59mx" Feb 26 15:42:49 crc kubenswrapper[5033]: W0226 15:42:49.824649 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6e80f78_4fca_4d88_b133_4da7db46c9b6.slice/crio-368e392a8dd3257294328c9ac20a2d1eec86a1c368aa8bf84107ef3a437c2654 WatchSource:0}: Error finding container 368e392a8dd3257294328c9ac20a2d1eec86a1c368aa8bf84107ef3a437c2654: Status 404 returned error can't find the container with id 368e392a8dd3257294328c9ac20a2d1eec86a1c368aa8bf84107ef3a437c2654 Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.850309 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-zxss5"] Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.850791 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.852008 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-hnlmd"] Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.852862 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-2h9gn"] Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.853242 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.853753 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.853919 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.853971 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.854697 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.854733 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.854752 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.854775 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.854793 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:49Z","lastTransitionTime":"2026-02-26T15:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.855452 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.855639 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.855752 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.855999 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.856046 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.856668 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.858791 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.859067 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.860673 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-copy-resources" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.861229 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.866748 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.876099 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.888501 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.897087 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b30da39-cf26-480f-82fc-dbb869e9acbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7xh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7xh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxss5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.913410 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748cc274375fda9c75beb1d4155d07ba9b488c176d3f3c8bf69c667511898b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.921363 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-os-release\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.921431 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-var-lib-cni-bin\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.921477 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-run-multus-certs\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.921520 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9afdf68c-a47a-4700-abc9-b877d7220836-cni-binary-copy\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:49 crc 
kubenswrapper[5033]: I0226 15:42:49.921590 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-run-k8s-cni-cncf-io\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.921633 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-var-lib-kubelet\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.921708 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqqbn\" (UniqueName: \"kubernetes.io/projected/9afdf68c-a47a-4700-abc9-b877d7220836-kube-api-access-wqqbn\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.921804 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-multus-socket-dir-parent\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.921866 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e974dc06-da1a-49ae-af6e-80b3436df208-multus-daemon-config\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " 
pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.921912 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b30da39-cf26-480f-82fc-dbb869e9acbd-proxy-tls\") pod \"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.921954 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-var-lib-cni-multus\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.921995 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-etc-kubernetes\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922053 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e974dc06-da1a-49ae-af6e-80b3436df208-cni-binary-copy\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922086 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-run-netns\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " 
pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922135 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b30da39-cf26-480f-82fc-dbb869e9acbd-mcd-auth-proxy-config\") pod \"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922167 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7xh9\" (UniqueName: \"kubernetes.io/projected/3b30da39-cf26-480f-82fc-dbb869e9acbd-kube-api-access-d7xh9\") pod \"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922215 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-multus-cni-dir\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922293 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-system-cni-dir\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922342 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9afdf68c-a47a-4700-abc9-b877d7220836-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922374 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-os-release\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922421 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-hostroot\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922453 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9frmg\" (UniqueName: \"kubernetes.io/projected/e974dc06-da1a-49ae-af6e-80b3436df208-kube-api-access-9frmg\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922498 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-cnibin\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922531 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3b30da39-cf26-480f-82fc-dbb869e9acbd-rootfs\") pod 
\"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922567 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922602 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-system-cni-dir\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922633 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-cnibin\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.922664 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-multus-conf-dir\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.927214 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.939300 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.945424 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-n59mx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6e80f78-4fca-4d88-b133-4da7db46c9b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdd7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-n59mx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.957064 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.957116 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.957135 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.957163 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 
15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.957182 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:49Z","lastTransitionTime":"2026-02-26T15:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.964344 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"050af11c-f014-4b1e-b033-63526858a53e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f55baa5845d1d93e807193c728c2adc29ac1396a864f97c8445094ba944820e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0737283a2d944dc412534ebe53bc180bf3005fb66e164e72abb7113a86452c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236bd9859aa14e2adf418323c049fdd9ba63540adc8d44feab7db2deb72af262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2be63f3aeb64760fc286645929a8863c2794930849e4d662a65a0ec04aee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c5f32b45ecafcde39a72bf4595128654203e513cb5660bdf8c5536dd0dd6fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSt
atuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.980485 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127cc120-a042-4dd7-8513-f6ee67014a46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:42:25.355394 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:42:25.355540 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:42:25.356824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1398027493/tls.crt::/tmp/serving-cert-1398027493/tls.key\\\\\\\"\\\\nI0226 15:42:25.948466 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:42:25.950731 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:42:25.950748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:42:25.950769 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:42:25.950774 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:42:25.958513 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 15:42:25.958527 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 15:42:25.958563 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958574 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958583 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:42:25.958589 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:42:25.958595 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:42:25.958600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 15:42:25.960644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d9
4728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:49 crc kubenswrapper[5033]: I0226 15:42:49.993868 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2h9gn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e974dc06-da1a-49ae-af6e-80b3436df208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9frmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2h9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.007112 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748cc274375fda9c75beb1d4155d07ba9b488c176d3f3c8bf69c667511898b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.023310 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.023562 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-run-netns\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.023608 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b30da39-cf26-480f-82fc-dbb869e9acbd-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.023639 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7xh9\" (UniqueName: \"kubernetes.io/projected/3b30da39-cf26-480f-82fc-dbb869e9acbd-kube-api-access-d7xh9\") pod \"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.023682 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e974dc06-da1a-49ae-af6e-80b3436df208-cni-binary-copy\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.023724 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-multus-cni-dir\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.023751 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-system-cni-dir\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.023792 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9afdf68c-a47a-4700-abc9-b877d7220836-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: 
\"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024004 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-run-netns\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024385 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-os-release\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024450 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-hostroot\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024471 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9frmg\" (UniqueName: \"kubernetes.io/projected/e974dc06-da1a-49ae-af6e-80b3436df208-kube-api-access-9frmg\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024509 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-cnibin\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 
15:42:50.024525 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3b30da39-cf26-480f-82fc-dbb869e9acbd-rootfs\") pod \"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024513 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-multus-cni-dir\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024541 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024624 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-system-cni-dir\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024659 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-cnibin\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024692 5033 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-hostroot\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024703 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-multus-conf-dir\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024740 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-os-release\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024770 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-var-lib-cni-bin\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024800 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-run-multus-certs\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024836 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9afdf68c-a47a-4700-abc9-b877d7220836-cni-binary-copy\") 
pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024857 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-os-release\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024865 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-run-k8s-cni-cncf-io\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024897 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-var-lib-kubelet\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024927 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e974dc06-da1a-49ae-af6e-80b3436df208-multus-daemon-config\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024938 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-multus-conf-dir\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 
26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024980 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-cnibin\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025047 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-cnibin\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025047 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-os-release\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025087 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-system-cni-dir\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025088 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-run-multus-certs\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024957 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b30da39-cf26-480f-82fc-dbb869e9acbd-proxy-tls\") pod \"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025107 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-var-lib-cni-bin\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024854 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-system-cni-dir\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025087 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9afdf68c-a47a-4700-abc9-b877d7220836-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025152 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqqbn\" (UniqueName: \"kubernetes.io/projected/9afdf68c-a47a-4700-abc9-b877d7220836-kube-api-access-wqqbn\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025156 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3b30da39-cf26-480f-82fc-dbb869e9acbd-rootfs\") pod \"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.024658 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b30da39-cf26-480f-82fc-dbb869e9acbd-mcd-auth-proxy-config\") pod \"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025214 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-multus-socket-dir-parent\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025335 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-multus-socket-dir-parent\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025341 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-var-lib-kubelet\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025359 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-run-k8s-cni-cncf-io\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025396 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-var-lib-cni-multus\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025419 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-etc-kubernetes\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025396 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e974dc06-da1a-49ae-af6e-80b3436df208-cni-binary-copy\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025473 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9afdf68c-a47a-4700-abc9-b877d7220836-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025474 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-etc-kubernetes\") pod \"multus-2h9gn\" (UID: 
\"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.025521 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e974dc06-da1a-49ae-af6e-80b3436df208-host-var-lib-cni-multus\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.026100 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e974dc06-da1a-49ae-af6e-80b3436df208-multus-daemon-config\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.026456 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9afdf68c-a47a-4700-abc9-b877d7220836-cni-binary-copy\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.030317 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b30da39-cf26-480f-82fc-dbb869e9acbd-proxy-tls\") pod \"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.044631 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.049324 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqqbn\" (UniqueName: \"kubernetes.io/projected/9afdf68c-a47a-4700-abc9-b877d7220836-kube-api-access-wqqbn\") pod \"multus-additional-cni-plugins-hnlmd\" (UID: \"9afdf68c-a47a-4700-abc9-b877d7220836\") " pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.054056 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9frmg\" (UniqueName: \"kubernetes.io/projected/e974dc06-da1a-49ae-af6e-80b3436df208-kube-api-access-9frmg\") pod \"multus-2h9gn\" (UID: \"e974dc06-da1a-49ae-af6e-80b3436df208\") " pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.055925 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7xh9\" (UniqueName: 
\"kubernetes.io/projected/3b30da39-cf26-480f-82fc-dbb869e9acbd-kube-api-access-d7xh9\") pod \"machine-config-daemon-zxss5\" (UID: \"3b30da39-cf26-480f-82fc-dbb869e9acbd\") " pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.058367 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.060193 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.060240 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.060286 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.060310 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.060327 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:50Z","lastTransitionTime":"2026-02-26T15:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.072486 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b30da39-cf26-480f-82fc-dbb869e9acbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7xh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7xh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxss5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.090953 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9afdf68c-a47a-4700-abc9-b877d7220836\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hnlmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.105676 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.112981 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-n59mx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6e80f78-4fca-4d88-b133-4da7db46c9b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdd7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-n59mx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.130741 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"050af11c-f014-4b1e-b033-63526858a53e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f55baa5845d1d93e807193c728c2adc29ac1396a864f97c8445094ba944820e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0737283a2d944dc412534ebe53bc180bf3005fb66e164e72abb7113a86452c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236bd9859aa14e2adf418323c049fdd9ba63540adc8d44feab7db2deb72af262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2be63f3aeb64760fc286645929a8863c2794930849e4d662a65a0ec04aee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c5f32b45ecafcde39a72bf4595128654203e513cb5660bdf8c5536dd0dd6fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.139977 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"127cc120-a042-4dd7-8513-f6ee67014a46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:42:25.355394 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:42:25.355540 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:42:25.356824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1398027493/tls.crt::/tmp/serving-cert-1398027493/tls.key\\\\\\\"\\\\nI0226 15:42:25.948466 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:42:25.950731 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:42:25.950748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:42:25.950769 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:42:25.950774 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:42:25.958513 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 15:42:25.958527 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 15:42:25.958563 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958574 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958583 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:42:25.958589 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:42:25.958595 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:42:25.958600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 15:42:25.960644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.146354 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n59mx" event={"ID":"b6e80f78-4fca-4d88-b133-4da7db46c9b6","Type":"ContainerStarted","Data":"b8e96e01354bf98216deb1139db97f85cc29a10c11ca3214f19d58ae83b71502"} Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.146399 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n59mx" event={"ID":"b6e80f78-4fca-4d88-b133-4da7db46c9b6","Type":"ContainerStarted","Data":"368e392a8dd3257294328c9ac20a2d1eec86a1c368aa8bf84107ef3a437c2654"} Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.150400 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.162130 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.162167 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.162177 5033 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.162191 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.162201 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:50Z","lastTransitionTime":"2026-02-26T15:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.164677 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"050af11c-f014-4b1e-b033-63526858a53e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f55baa5845d1d93e807193c728c2adc29ac1396a864f97c8445094ba944820e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900922
72e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0737283a2d944dc412534ebe53bc180bf3005fb66e164e72abb7113a86452c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236bd9859aa14e2adf418323c049fdd9ba63540adc8d44feab7db2deb72af262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2be63f3aeb64760fc286645929a8863c2794930849e4d662a65a0ec04aee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c5f32b45ecafcde39a72bf4595128654203e513cb5660bdf8c5536dd0dd6fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430
955642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.179878 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127cc120-a042-4dd7-8513-f6ee67014a46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:42:25.355394 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:42:25.355540 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:42:25.356824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1398027493/tls.crt::/tmp/serving-cert-1398027493/tls.key\\\\\\\"\\\\nI0226 15:42:25.948466 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:42:25.950731 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:42:25.950748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:42:25.950769 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:42:25.950774 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:42:25.958513 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 15:42:25.958527 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 15:42:25.958563 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958574 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958583 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 
15:42:25.958589 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:42:25.958595 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:42:25.958600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 15:42:25.960644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.187871 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.191703 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.202395 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2h9gn" Feb 26 15:42:50 crc kubenswrapper[5033]: W0226 15:42:50.202598 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b30da39_cf26_480f_82fc_dbb869e9acbd.slice/crio-6d2c79e49481fd610ae4b4972c0cb804bd8b0d85ada5a246831f22fbd6713347 WatchSource:0}: Error finding container 6d2c79e49481fd610ae4b4972c0cb804bd8b0d85ada5a246831f22fbd6713347: Status 404 returned error can't find the container with id 6d2c79e49481fd610ae4b4972c0cb804bd8b0d85ada5a246831f22fbd6713347 Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.203443 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.209864 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.213715 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-n59mx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6e80f78-4fca-4d88-b133-4da7db46c9b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e96e01354bf98216deb1139db97f85cc29a10c11ca3214f19d58ae83b71502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-bdd7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-n59mx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: W0226 15:42:50.216613 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode974dc06_da1a_49ae_af6e_80b3436df208.slice/crio-7c890f692627898d8eeb5b3560b01daf8faf01e0481c698e29d77dfe738171f1 WatchSource:0}: Error finding container 7c890f692627898d8eeb5b3560b01daf8faf01e0481c698e29d77dfe738171f1: Status 404 returned error can't find the container with id 7c890f692627898d8eeb5b3560b01daf8faf01e0481c698e29d77dfe738171f1 Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.230467 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748cc274375fda9c75beb1d4155d07ba9b488c176d3f3c8bf69c667511898b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.234613 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fth8v"] Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.236138 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: W0226 15:42:50.237535 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9afdf68c_a47a_4700_abc9_b877d7220836.slice/crio-0c306057fe1830f6b07525e91b951d272c7e3475b714eebeb18e9eaf799c1cbf WatchSource:0}: Error finding container 0c306057fe1830f6b07525e91b951d272c7e3475b714eebeb18e9eaf799c1cbf: Status 404 returned error can't find the container with id 0c306057fe1830f6b07525e91b951d272c7e3475b714eebeb18e9eaf799c1cbf Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.238350 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.238687 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.238770 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.239194 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.239343 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.239477 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.239681 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.250306 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.263179 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.265766 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.265799 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.265810 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 
15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.265830 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.265841 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:50Z","lastTransitionTime":"2026-02-26T15:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.282367 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.292565 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b30da39-cf26-480f-82fc-dbb869e9acbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7xh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7xh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxss5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.304216 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2h9gn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e974dc06-da1a-49ae-af6e-80b3436df208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9frmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2h9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.326531 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9afdf68c-a47a-4700-abc9-b877d7220836\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hnlmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.328644 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovn-node-metrics-cert\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.329004 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-node-log\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.329368 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-bin\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.329542 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-netns\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.329698 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-env-overrides\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.329879 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-netd\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.330104 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-slash\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.330269 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-var-lib-openvswitch\") pod \"ovnkube-node-fth8v\" (UID: 
\"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.330392 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-script-lib\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.330663 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4m9j\" (UniqueName: \"kubernetes.io/projected/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-kube-api-access-m4m9j\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.330872 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-kubelet\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.331020 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-etc-openvswitch\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.331204 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-systemd\") pod 
\"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.331342 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-config\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.331492 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-ovn\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.331668 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-ovn-kubernetes\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.331853 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.332032 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-systemd-units\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.332173 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-log-socket\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.332287 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-openvswitch\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.341021 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9afdf68c-a47a-4700-abc9-b877d7220836\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hnlmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.355564 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.369620 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.370369 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.370390 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.370400 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.370414 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.370424 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:50Z","lastTransitionTime":"2026-02-26T15:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.379437 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-n59mx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6e80f78-4fca-4d88-b133-4da7db46c9b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e96e01354bf98216deb1139db97f85cc29a10c11ca3214f19d58ae83b71502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdd7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-n59mx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.402859 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fth8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.423482 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"050af11c-f014-4b1e-b033-63526858a53e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f55baa5845d1d93e807193c728c2adc29ac1396a864f97c8445094ba944820e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0737283a2d944dc412534ebe53bc180bf3005fb66e164e72abb7113a86452c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236bd9859aa14e2adf418323c049fdd9ba63540adc8d44feab7db2deb72af262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2be63f3aeb64760fc286645929a8863c2794930849e4d662a65a0ec04aee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c5f32b45ecafcde39a72bf4595128654203e513cb5660bdf8c5536dd0dd6fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.432845 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-ovn\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433014 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-systemd-units\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433310 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-log-socket\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433333 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-ovn-kubernetes\") pod \"ovnkube-node-fth8v\" (UID: 
\"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433348 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433366 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-openvswitch\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433381 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-node-log\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433395 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-bin\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433410 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovn-node-metrics-cert\") pod \"ovnkube-node-fth8v\" (UID: 
\"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433427 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-netns\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433445 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-env-overrides\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433467 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-netd\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433484 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-slash\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433498 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-script-lib\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433500 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433513 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-var-lib-openvswitch\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433561 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-var-lib-openvswitch\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433569 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4m9j\" (UniqueName: \"kubernetes.io/projected/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-kube-api-access-m4m9j\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433591 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-kubelet\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433602 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-openvswitch\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433623 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-etc-openvswitch\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433697 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-kubelet\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433713 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-node-log\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433726 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-bin\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433752 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-systemd\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433779 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-config\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.433911 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-log-socket\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.434001 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-etc-openvswitch\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.434074 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127cc120-a042-4dd7-8513-f6ee67014a46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:42:25.355394 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:42:25.355540 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:42:25.356824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1398027493/tls.crt::/tmp/serving-cert-1398027493/tls.key\\\\\\\"\\\\nI0226 15:42:25.948466 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:42:25.950731 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:42:25.950748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:42:25.950769 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:42:25.950774 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:42:25.958513 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 15:42:25.958527 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 15:42:25.958563 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958574 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958583 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 
15:42:25.958589 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:42:25.958595 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:42:25.958600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 15:42:25.960644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.434278 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-netns\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.434310 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-ovn\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.434310 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-systemd-units\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.434347 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-slash\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.434368 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-netd\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.434387 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-ovn-kubernetes\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.434693 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-env-overrides\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.434822 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-systemd\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.435004 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-config\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.435129 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-script-lib\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.438749 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovn-node-metrics-cert\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.448338 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.448882 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4m9j\" (UniqueName: \"kubernetes.io/projected/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-kube-api-access-m4m9j\") pod \"ovnkube-node-fth8v\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.456661 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.463858 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b30da39-cf26-480f-82fc-dbb869e9acbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7xh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7xh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxss5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.472490 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2h9gn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e974dc06-da1a-49ae-af6e-80b3436df208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9frmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2h9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.473359 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.473387 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.473396 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.473408 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.473417 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:50Z","lastTransitionTime":"2026-02-26T15:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.487554 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748cc274375fda9c75beb1d4155d07ba9b488c176d3f3c8bf69c667511898b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.497324 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.548858 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:50 crc kubenswrapper[5033]: W0226 15:42:50.570025 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e439a58_bd20_4da1_96e5_7e6a4329fd6b.slice/crio-37123b5dc05c886f85c0200969c75d459f4a40773fdfb69634d0c93a0f232ae2 WatchSource:0}: Error finding container 37123b5dc05c886f85c0200969c75d459f4a40773fdfb69634d0c93a0f232ae2: Status 404 returned error can't find the container with id 37123b5dc05c886f85c0200969c75d459f4a40773fdfb69634d0c93a0f232ae2 Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.575475 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.575510 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.575522 5033 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.575540 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.575552 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:50Z","lastTransitionTime":"2026-02-26T15:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.627860 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.627858 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.628105 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.628777 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.627777 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.631386 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.635769 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.635878 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:06.635863109 +0000 UTC m=+113.848461759 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.636031 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.636156 5033 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.636209 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:43:06.636197716 +0000 UTC m=+113.848796366 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.677314 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.677344 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.677355 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.677369 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.677380 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:50Z","lastTransitionTime":"2026-02-26T15:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.736884 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.737174 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.737350 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.737536 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.737607 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.737664 5033 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.737758 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:43:06.737744779 +0000 UTC m=+113.950343429 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.738417 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.738448 5033 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.738460 5033 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.738506 5033 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:43:06.738492237 +0000 UTC m=+113.951090877 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.739348 5033 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:42:50 crc kubenswrapper[5033]: E0226 15:42:50.739385 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:43:06.739376788 +0000 UTC m=+113.951975438 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.779873 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.779900 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.779912 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.779930 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.779941 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:50Z","lastTransitionTime":"2026-02-26T15:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.882327 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.882554 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.882623 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.882831 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.882902 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:50Z","lastTransitionTime":"2026-02-26T15:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.985445 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.985678 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.985749 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.985810 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:50 crc kubenswrapper[5033]: I0226 15:42:50.985866 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:50Z","lastTransitionTime":"2026-02-26T15:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.089028 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.090460 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.090857 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.091240 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.091728 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.150538 5033 generic.go:334] "Generic (PLEG): container finished" podID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerID="496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b" exitCode=0 Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.150843 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerDied","Data":"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.151006 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerStarted","Data":"37123b5dc05c886f85c0200969c75d459f4a40773fdfb69634d0c93a0f232ae2"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.155247 5033 generic.go:334] "Generic (PLEG): container finished" podID="9afdf68c-a47a-4700-abc9-b877d7220836" containerID="37aa74141ff0c21adf4dcb2e868e3bad1712ce985ea4457ef6bcd25e4e9d102f" exitCode=0 Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.155574 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" event={"ID":"9afdf68c-a47a-4700-abc9-b877d7220836","Type":"ContainerDied","Data":"37aa74141ff0c21adf4dcb2e868e3bad1712ce985ea4457ef6bcd25e4e9d102f"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.155767 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" event={"ID":"9afdf68c-a47a-4700-abc9-b877d7220836","Type":"ContainerStarted","Data":"0c306057fe1830f6b07525e91b951d272c7e3475b714eebeb18e9eaf799c1cbf"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.161301 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"63dbac392f65aa7fa6b20969129eb8d33a62c42c83a861f90e12cc4c58fd2bd9"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.161370 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"8c3b531c4becd39d370617a6d65865b0e8eb7fbdb2814420616e198b3e7ab4da"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.161392 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"6d2c79e49481fd610ae4b4972c0cb804bd8b0d85ada5a246831f22fbd6713347"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.166645 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2h9gn" event={"ID":"e974dc06-da1a-49ae-af6e-80b3436df208","Type":"ContainerStarted","Data":"1cbfbc700e95c4db0e198395ac967657336eadd3ff9df7a4f96fdfbe66689b9c"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.166701 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2h9gn" event={"ID":"e974dc06-da1a-49ae-af6e-80b3436df208","Type":"ContainerStarted","Data":"7c890f692627898d8eeb5b3560b01daf8faf01e0481c698e29d77dfe738171f1"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.170771 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2b96494183ff20706e32abc2049debdf1a9a5f66d5a8f5192c3a085d6bd0c274"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.171000 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"72f36dcd285d76edcd841d7c5c4e0dcd7ccde9dadeb595703aa01f1e8373f4e0"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.178406 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.198253 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.198813 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.199002 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.199171 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.199496 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.200160 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.224684 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b30da39-cf26-480f-82fc-dbb869e9acbd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7xh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7xh9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxss5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.238842 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.238904 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.238920 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.238943 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.238960 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.242140 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2h9gn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e974dc06-da1a-49ae-af6e-80b3436df208\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9frmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2h9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: E0226 15:42:51.252145 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",
\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6f
b6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\
"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.256942 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://748cc274375fda9c75beb1d4155d07ba9b488c176d3f3c8bf69c667511898b12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"netwo
rk-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.259854 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.260034 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.260184 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.260349 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.260493 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: E0226 15:42:51.272041 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.277094 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.282013 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.282160 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.282244 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.282379 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.282488 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.291631 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9afdf68c-a47a-4700-abc9-b877d7220836\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hnlmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: E0226 15:42:51.294349 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.298627 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.298824 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.298941 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.299029 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.299103 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.303624 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: E0226 15:42:51.307506 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.311993 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.312046 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.312061 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.312083 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.312101 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.316595 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: E0226 15:42:51.321608 5033 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e74ecb8-0e4e-4659-99ac-3abfb809185b\\\",\\\"systemUUID\\\":\\\"8ea4d39b-2c74-4703-9738-58e0d85cfdf9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: E0226 15:42:51.321748 5033 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.325054 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-n59mx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6e80f78-4fca-4d88-b133-4da7db46c9b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e96e01354bf98216deb1139db97f85cc29a10c11ca3214f19d58ae83b71502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolve
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdd7s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-n59mx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.327933 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.327966 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.327976 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.327992 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.328004 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.341195 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fth8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.376419 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"050af11c-f014-4b1e-b033-63526858a53e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f55baa5845d1d93e807193c728c2adc29ac1396a864f97c8445094ba944820e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0737283a2d944dc412534ebe53bc180bf3005fb66e164e72abb7113a86452c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236bd9859aa14e2adf418323c049fdd9ba63540adc8d44feab7db2deb72af262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2be63f3aeb64760fc286645929a8863c2794930849e4d662a65a0ec04aee78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c5f32b45ecafcde39a72bf4595128654203e513cb5660bdf8c5536dd0dd6fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7671b2d5f720afeebed5828ecdf1691376e4b27089a2a14097e6f9537f4cc33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea79bab3887c09558261b23e7211cfc9f05e635956c19869f432ab538876b68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cceca3e01729a4eb81f92e0c9663d5a43407600cf42cda772b912dc430955642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.387862 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"127cc120-a042-4dd7-8513-f6ee67014a46\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:41:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:42:25.355394 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:42:25.355540 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:42:25.356824 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1398027493/tls.crt::/tmp/serving-cert-1398027493/tls.key\\\\\\\"\\\\nI0226 15:42:25.948466 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:42:25.950731 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:42:25.950748 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:42:25.950769 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:42:25.950774 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:42:25.958513 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 15:42:25.958527 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 15:42:25.958563 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958574 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:42:25.958583 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 
15:42:25.958589 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:42:25.958595 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:42:25.958600 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 15:42:25.960644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:41:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:41:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:41:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:41:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.395516 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9afdf68c-a47a-4700-abc9-b877d7220836\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37aa74141ff0c21adf4dcb2e868e3bad1712ce985ea4457ef6bcd25e4e9d102f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37aa74141ff0c21adf4dcb2e868e3bad1712ce985ea4457ef6bcd25e4e9d102f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqqbn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hnlmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.407487 5033 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4m9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fth8v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.431038 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.431462 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.431483 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.431505 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.431521 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.474833 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.474815064 podStartE2EDuration="3.474815064s" podCreationTimestamp="2026-02-26 15:42:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:42:51.4496129 +0000 UTC m=+98.662211620" watchObservedRunningTime="2026-02-26 15:42:51.474815064 +0000 UTC m=+98.687413724" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.535234 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.535284 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.535294 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.535310 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.535321 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.562856 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-n59mx" podStartSLOduration=36.562830992 podStartE2EDuration="36.562830992s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:42:51.562334341 +0000 UTC m=+98.774932991" watchObservedRunningTime="2026-02-26 15:42:51.562830992 +0000 UTC m=+98.775429662" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.637068 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.637101 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.637110 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.637121 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.637131 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.647606 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2h9gn" podStartSLOduration=36.647586623 podStartE2EDuration="36.647586623s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:42:51.647319566 +0000 UTC m=+98.859918206" watchObservedRunningTime="2026-02-26 15:42:51.647586623 +0000 UTC m=+98.860185273" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.648374 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podStartSLOduration=36.648365041 podStartE2EDuration="36.648365041s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:42:51.632676105 +0000 UTC m=+98.845274755" watchObservedRunningTime="2026-02-26 15:42:51.648365041 +0000 UTC m=+98.860963691" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.739123 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.739179 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.739190 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.739209 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.739222 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.783123 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zm5lz"] Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.783457 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zm5lz" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.785139 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.785151 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.786152 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.786249 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.841589 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.841641 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.841656 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 
15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.841674 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.841686 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.851086 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eda1781b-337a-4379-a3d1-7a54208b969c-serviceca\") pod \"node-ca-zm5lz\" (UID: \"eda1781b-337a-4379-a3d1-7a54208b969c\") " pod="openshift-image-registry/node-ca-zm5lz" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.851160 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eda1781b-337a-4379-a3d1-7a54208b969c-host\") pod \"node-ca-zm5lz\" (UID: \"eda1781b-337a-4379-a3d1-7a54208b969c\") " pod="openshift-image-registry/node-ca-zm5lz" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.851186 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztfcs\" (UniqueName: \"kubernetes.io/projected/eda1781b-337a-4379-a3d1-7a54208b969c-kube-api-access-ztfcs\") pod \"node-ca-zm5lz\" (UID: \"eda1781b-337a-4379-a3d1-7a54208b969c\") " pod="openshift-image-registry/node-ca-zm5lz" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.943323 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb"] Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.944126 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.944207 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.944300 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.944314 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.944330 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.944341 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:51Z","lastTransitionTime":"2026-02-26T15:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.946046 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.946532 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.952576 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eda1781b-337a-4379-a3d1-7a54208b969c-serviceca\") pod \"node-ca-zm5lz\" (UID: \"eda1781b-337a-4379-a3d1-7a54208b969c\") " pod="openshift-image-registry/node-ca-zm5lz" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.952645 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eda1781b-337a-4379-a3d1-7a54208b969c-host\") pod \"node-ca-zm5lz\" (UID: \"eda1781b-337a-4379-a3d1-7a54208b969c\") " pod="openshift-image-registry/node-ca-zm5lz" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.952677 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztfcs\" (UniqueName: \"kubernetes.io/projected/eda1781b-337a-4379-a3d1-7a54208b969c-kube-api-access-ztfcs\") pod \"node-ca-zm5lz\" (UID: \"eda1781b-337a-4379-a3d1-7a54208b969c\") " pod="openshift-image-registry/node-ca-zm5lz" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.952765 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eda1781b-337a-4379-a3d1-7a54208b969c-host\") pod \"node-ca-zm5lz\" (UID: \"eda1781b-337a-4379-a3d1-7a54208b969c\") " pod="openshift-image-registry/node-ca-zm5lz" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.953429 5033 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eda1781b-337a-4379-a3d1-7a54208b969c-serviceca\") pod \"node-ca-zm5lz\" (UID: \"eda1781b-337a-4379-a3d1-7a54208b969c\") " pod="openshift-image-registry/node-ca-zm5lz" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.959605 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4bq8z"] Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.960027 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:51 crc kubenswrapper[5033]: E0226 15:42:51.960087 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4bq8z" podUID="857e0324-95f6-4524-b676-8919c08e97ca" Feb 26 15:42:51 crc kubenswrapper[5033]: I0226 15:42:51.970101 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztfcs\" (UniqueName: \"kubernetes.io/projected/eda1781b-337a-4379-a3d1-7a54208b969c-kube-api-access-ztfcs\") pod \"node-ca-zm5lz\" (UID: \"eda1781b-337a-4379-a3d1-7a54208b969c\") " pod="openshift-image-registry/node-ca-zm5lz" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.046305 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.046333 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.046341 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[5033]: 
I0226 15:42:52.046354 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.046363 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:52Z","lastTransitionTime":"2026-02-26T15:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.053545 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f3a4def-6ccf-4d49-bfb9-69906decb852-env-overrides\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.053596 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f3a4def-6ccf-4d49-bfb9-69906decb852-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.053620 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f3a4def-6ccf-4d49-bfb9-69906decb852-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 
crc kubenswrapper[5033]: I0226 15:42:52.053644 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs\") pod \"network-metrics-daemon-4bq8z\" (UID: \"857e0324-95f6-4524-b676-8919c08e97ca\") " pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.053689 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkgrg\" (UniqueName: \"kubernetes.io/projected/857e0324-95f6-4524-b676-8919c08e97ca-kube-api-access-nkgrg\") pod \"network-metrics-daemon-4bq8z\" (UID: \"857e0324-95f6-4524-b676-8919c08e97ca\") " pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.053771 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2c28\" (UniqueName: \"kubernetes.io/projected/6f3a4def-6ccf-4d49-bfb9-69906decb852-kube-api-access-h2c28\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.149383 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.149658 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.149666 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.149680 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:52 crc 
kubenswrapper[5033]: I0226 15:42:52.149689 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:52Z","lastTransitionTime":"2026-02-26T15:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.154940 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2c28\" (UniqueName: \"kubernetes.io/projected/6f3a4def-6ccf-4d49-bfb9-69906decb852-kube-api-access-h2c28\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.155021 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f3a4def-6ccf-4d49-bfb9-69906decb852-env-overrides\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.155048 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f3a4def-6ccf-4d49-bfb9-69906decb852-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.155072 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6f3a4def-6ccf-4d49-bfb9-69906decb852-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.155095 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs\") pod \"network-metrics-daemon-4bq8z\" (UID: \"857e0324-95f6-4524-b676-8919c08e97ca\") " pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.155129 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkgrg\" (UniqueName: \"kubernetes.io/projected/857e0324-95f6-4524-b676-8919c08e97ca-kube-api-access-nkgrg\") pod \"network-metrics-daemon-4bq8z\" (UID: \"857e0324-95f6-4524-b676-8919c08e97ca\") " pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:52 crc kubenswrapper[5033]: E0226 15:42:52.155509 5033 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:42:52 crc kubenswrapper[5033]: E0226 15:42:52.155553 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs podName:857e0324-95f6-4524-b676-8919c08e97ca nodeName:}" failed. No retries permitted until 2026-02-26 15:42:52.6555386 +0000 UTC m=+99.868137250 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs") pod "network-metrics-daemon-4bq8z" (UID: "857e0324-95f6-4524-b676-8919c08e97ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.155888 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f3a4def-6ccf-4d49-bfb9-69906decb852-env-overrides\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.155923 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f3a4def-6ccf-4d49-bfb9-69906decb852-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.163058 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f3a4def-6ccf-4d49-bfb9-69906decb852-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.169962 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zm5lz" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.177912 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2c28\" (UniqueName: \"kubernetes.io/projected/6f3a4def-6ccf-4d49-bfb9-69906decb852-kube-api-access-h2c28\") pod \"ovnkube-control-plane-749d76644c-drlbb\" (UID: \"6f3a4def-6ccf-4d49-bfb9-69906decb852\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.177940 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkgrg\" (UniqueName: \"kubernetes.io/projected/857e0324-95f6-4524-b676-8919c08e97ca-kube-api-access-nkgrg\") pod \"network-metrics-daemon-4bq8z\" (UID: \"857e0324-95f6-4524-b676-8919c08e97ca\") " pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.180427 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerStarted","Data":"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.180536 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerStarted","Data":"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.180561 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerStarted","Data":"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.180580 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerStarted","Data":"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.183794 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" event={"ID":"9afdf68c-a47a-4700-abc9-b877d7220836","Type":"ContainerStarted","Data":"f17dc7c11c406489b0e6f6c0a3f9e83252eca11f83dfbfc6003ddae6c2bddd42"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.251607 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.251646 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.251660 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.251677 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.251691 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:52Z","lastTransitionTime":"2026-02-26T15:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.270671 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" Feb 26 15:42:52 crc kubenswrapper[5033]: W0226 15:42:52.281599 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3a4def_6ccf_4d49_bfb9_69906decb852.slice/crio-8b8384968e7db5b8dbda0d756f9e2876f848e1dc315faf35debb868e92b84822 WatchSource:0}: Error finding container 8b8384968e7db5b8dbda0d756f9e2876f848e1dc315faf35debb868e92b84822: Status 404 returned error can't find the container with id 8b8384968e7db5b8dbda0d756f9e2876f848e1dc315faf35debb868e92b84822 Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.355037 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.355326 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.355338 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.355353 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.355365 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:52Z","lastTransitionTime":"2026-02-26T15:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.457951 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.457982 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.457989 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.458002 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.458011 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:52Z","lastTransitionTime":"2026-02-26T15:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.560483 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.560513 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.560521 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.560535 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.560543 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:52Z","lastTransitionTime":"2026-02-26T15:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.627871 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.627939 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:52 crc kubenswrapper[5033]: E0226 15:42:52.627985 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:52 crc kubenswrapper[5033]: E0226 15:42:52.628186 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.628396 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:52 crc kubenswrapper[5033]: E0226 15:42:52.628708 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.660786 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs\") pod \"network-metrics-daemon-4bq8z\" (UID: \"857e0324-95f6-4524-b676-8919c08e97ca\") " pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:52 crc kubenswrapper[5033]: E0226 15:42:52.660914 5033 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:42:52 crc kubenswrapper[5033]: E0226 15:42:52.660962 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs podName:857e0324-95f6-4524-b676-8919c08e97ca nodeName:}" failed. No retries permitted until 2026-02-26 15:42:53.660950206 +0000 UTC m=+100.873548856 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs") pod "network-metrics-daemon-4bq8z" (UID: "857e0324-95f6-4524-b676-8919c08e97ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.662673 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.662702 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.662713 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.662727 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.662736 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:52Z","lastTransitionTime":"2026-02-26T15:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.765305 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.765374 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.765392 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.765416 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.765433 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:52Z","lastTransitionTime":"2026-02-26T15:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.868555 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.868850 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.868990 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.869086 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.869176 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:52Z","lastTransitionTime":"2026-02-26T15:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.971938 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.972444 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.972683 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.972864 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:52 crc kubenswrapper[5033]: I0226 15:42:52.973012 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:52Z","lastTransitionTime":"2026-02-26T15:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.076457 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.076934 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.077183 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.077433 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.077638 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:53Z","lastTransitionTime":"2026-02-26T15:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.180552 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.181386 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.181423 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.181455 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.181476 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:53Z","lastTransitionTime":"2026-02-26T15:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.191658 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerStarted","Data":"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.191754 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerStarted","Data":"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.193902 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zm5lz" event={"ID":"eda1781b-337a-4379-a3d1-7a54208b969c","Type":"ContainerStarted","Data":"4a872a0d63c68f59f222dab9f68271a709b3efa446531f0a6f115084ed8e713b"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.193949 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zm5lz" event={"ID":"eda1781b-337a-4379-a3d1-7a54208b969c","Type":"ContainerStarted","Data":"77c878b26b6c004e41cb9208be55ff0b64d08e8d7f13e6ac5d2cafd2bf8c3ba3"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.197621 5033 generic.go:334] "Generic (PLEG): container finished" podID="9afdf68c-a47a-4700-abc9-b877d7220836" containerID="f17dc7c11c406489b0e6f6c0a3f9e83252eca11f83dfbfc6003ddae6c2bddd42" exitCode=0 Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.197733 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" event={"ID":"9afdf68c-a47a-4700-abc9-b877d7220836","Type":"ContainerDied","Data":"f17dc7c11c406489b0e6f6c0a3f9e83252eca11f83dfbfc6003ddae6c2bddd42"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.204194 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" event={"ID":"6f3a4def-6ccf-4d49-bfb9-69906decb852","Type":"ContainerStarted","Data":"78729245a2abccb9fb46120bd06c40d911cbe24853aa957c2a634d519f92b140"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.204298 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" event={"ID":"6f3a4def-6ccf-4d49-bfb9-69906decb852","Type":"ContainerStarted","Data":"d67c719d7bf039c32a50d740e03a266ae2c7c3b0440e54f53011f8b2e1af7f13"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.204336 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" event={"ID":"6f3a4def-6ccf-4d49-bfb9-69906decb852","Type":"ContainerStarted","Data":"8b8384968e7db5b8dbda0d756f9e2876f848e1dc315faf35debb868e92b84822"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.215108 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zm5lz" podStartSLOduration=38.215080008 podStartE2EDuration="38.215080008s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:42:53.21431058 +0000 UTC m=+100.426909290" watchObservedRunningTime="2026-02-26 15:42:53.215080008 +0000 UTC m=+100.427678698" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.273815 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-drlbb" podStartSLOduration=38.273787444 podStartE2EDuration="38.273787444s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:42:53.272538234 +0000 UTC m=+100.485136884" 
watchObservedRunningTime="2026-02-26 15:42:53.273787444 +0000 UTC m=+100.486386134" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.284459 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.284766 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.284963 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.285096 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.285345 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:53Z","lastTransitionTime":"2026-02-26T15:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.389753 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.389786 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.389794 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.389807 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.389816 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:53Z","lastTransitionTime":"2026-02-26T15:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.492252 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.492318 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.492330 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.492346 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.492357 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:53Z","lastTransitionTime":"2026-02-26T15:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.594890 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.594932 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.594941 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.594955 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.594966 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:53Z","lastTransitionTime":"2026-02-26T15:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.627298 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:53 crc kubenswrapper[5033]: E0226 15:42:53.627773 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4bq8z" podUID="857e0324-95f6-4524-b676-8919c08e97ca" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.671544 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs\") pod \"network-metrics-daemon-4bq8z\" (UID: \"857e0324-95f6-4524-b676-8919c08e97ca\") " pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:53 crc kubenswrapper[5033]: E0226 15:42:53.671765 5033 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:42:53 crc kubenswrapper[5033]: E0226 15:42:53.671901 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs podName:857e0324-95f6-4524-b676-8919c08e97ca nodeName:}" failed. No retries permitted until 2026-02-26 15:42:55.67186599 +0000 UTC m=+102.884464680 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs") pod "network-metrics-daemon-4bq8z" (UID: "857e0324-95f6-4524-b676-8919c08e97ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.700592 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.700677 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.700702 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.700731 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.700764 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:53Z","lastTransitionTime":"2026-02-26T15:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.803366 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.803426 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.803434 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.803448 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.803457 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:53Z","lastTransitionTime":"2026-02-26T15:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.905897 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.905951 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.905967 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.905991 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:53 crc kubenswrapper[5033]: I0226 15:42:53.906009 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:53Z","lastTransitionTime":"2026-02-26T15:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.009424 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.009486 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.009513 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.009542 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.009565 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:54Z","lastTransitionTime":"2026-02-26T15:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.112481 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.112527 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.112538 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.112557 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.112569 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:54Z","lastTransitionTime":"2026-02-26T15:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.212670 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"53929e9745cda4663c2d2c54bca1d58bb917263266d737f010f88528a414d443"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.214740 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.214795 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.214810 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.214829 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.214845 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:54Z","lastTransitionTime":"2026-02-26T15:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.221612 5033 generic.go:334] "Generic (PLEG): container finished" podID="9afdf68c-a47a-4700-abc9-b877d7220836" containerID="fb293f71b8a4ed0d8aef6293461cac09191f470ad21892e815246d3519f782ef" exitCode=0 Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.221670 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" event={"ID":"9afdf68c-a47a-4700-abc9-b877d7220836","Type":"ContainerDied","Data":"fb293f71b8a4ed0d8aef6293461cac09191f470ad21892e815246d3519f782ef"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.317869 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.317899 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.317906 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.317919 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.317927 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:54Z","lastTransitionTime":"2026-02-26T15:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.426462 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.426756 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.426769 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.426786 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.426797 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:54Z","lastTransitionTime":"2026-02-26T15:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.529939 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.530023 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.530039 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.530063 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.530080 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:54Z","lastTransitionTime":"2026-02-26T15:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.627648 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.627738 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.627850 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:54 crc kubenswrapper[5033]: E0226 15:42:54.627930 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:54 crc kubenswrapper[5033]: E0226 15:42:54.627781 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:54 crc kubenswrapper[5033]: E0226 15:42:54.628077 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.632698 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.632737 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.632746 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.632760 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.632771 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:54Z","lastTransitionTime":"2026-02-26T15:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.734744 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.734821 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.734847 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.734913 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.734939 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:54Z","lastTransitionTime":"2026-02-26T15:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.837893 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.837959 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.837977 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.838003 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.838026 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:54Z","lastTransitionTime":"2026-02-26T15:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.941087 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.941148 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.941164 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.941188 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:54 crc kubenswrapper[5033]: I0226 15:42:54.941205 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:54Z","lastTransitionTime":"2026-02-26T15:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.044061 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.044102 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.044113 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.044129 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.044140 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:55Z","lastTransitionTime":"2026-02-26T15:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.146989 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.147068 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.147091 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.147124 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.147150 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:55Z","lastTransitionTime":"2026-02-26T15:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.230390 5033 generic.go:334] "Generic (PLEG): container finished" podID="9afdf68c-a47a-4700-abc9-b877d7220836" containerID="eddc60c13476ab71eeb24fedf8968dc0aed0b5480d7979a41771b876b9e5ec5d" exitCode=0 Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.230494 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" event={"ID":"9afdf68c-a47a-4700-abc9-b877d7220836","Type":"ContainerDied","Data":"eddc60c13476ab71eeb24fedf8968dc0aed0b5480d7979a41771b876b9e5ec5d"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.236441 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerStarted","Data":"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.249830 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.249894 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.249918 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.249948 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.249996 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:55Z","lastTransitionTime":"2026-02-26T15:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.352880 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.352938 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.352952 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.352971 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.352983 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:55Z","lastTransitionTime":"2026-02-26T15:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.455566 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.455604 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.455616 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.455634 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.455647 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:55Z","lastTransitionTime":"2026-02-26T15:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.558174 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.558247 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.558315 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.558348 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.558368 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:55Z","lastTransitionTime":"2026-02-26T15:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.628039 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:55 crc kubenswrapper[5033]: E0226 15:42:55.628215 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4bq8z" podUID="857e0324-95f6-4524-b676-8919c08e97ca" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.661298 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.661365 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.661381 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.661406 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.661427 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:55Z","lastTransitionTime":"2026-02-26T15:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.696196 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs\") pod \"network-metrics-daemon-4bq8z\" (UID: \"857e0324-95f6-4524-b676-8919c08e97ca\") " pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:55 crc kubenswrapper[5033]: E0226 15:42:55.696482 5033 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:42:55 crc kubenswrapper[5033]: E0226 15:42:55.696657 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs podName:857e0324-95f6-4524-b676-8919c08e97ca nodeName:}" failed. No retries permitted until 2026-02-26 15:42:59.696624288 +0000 UTC m=+106.909223138 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs") pod "network-metrics-daemon-4bq8z" (UID: "857e0324-95f6-4524-b676-8919c08e97ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.764624 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.764691 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.764700 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.764720 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.764732 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:55Z","lastTransitionTime":"2026-02-26T15:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.867600 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.867690 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.867708 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.867776 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.867799 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:55Z","lastTransitionTime":"2026-02-26T15:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.971860 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.971944 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.972031 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.972156 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:55 crc kubenswrapper[5033]: I0226 15:42:55.972177 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:55Z","lastTransitionTime":"2026-02-26T15:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.075341 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.075418 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.075444 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.075474 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.075498 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:56Z","lastTransitionTime":"2026-02-26T15:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.179541 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.179598 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.179617 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.179641 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.179660 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:56Z","lastTransitionTime":"2026-02-26T15:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.246165 5033 generic.go:334] "Generic (PLEG): container finished" podID="9afdf68c-a47a-4700-abc9-b877d7220836" containerID="65941a1656aab4a4994ca6010c01d1512e622ef05cfd102fe054f57e60174c8b" exitCode=0 Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.246233 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" event={"ID":"9afdf68c-a47a-4700-abc9-b877d7220836","Type":"ContainerDied","Data":"65941a1656aab4a4994ca6010c01d1512e622ef05cfd102fe054f57e60174c8b"} Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.282766 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.282801 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.282813 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.282827 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.282840 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:56Z","lastTransitionTime":"2026-02-26T15:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.386744 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.386809 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.386828 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.386857 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.386877 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:56Z","lastTransitionTime":"2026-02-26T15:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.490072 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.490693 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.490717 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.490736 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.490748 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:56Z","lastTransitionTime":"2026-02-26T15:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.594567 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.594632 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.594657 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.594691 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.594778 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:56Z","lastTransitionTime":"2026-02-26T15:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.627554 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.627602 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.627722 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:56 crc kubenswrapper[5033]: E0226 15:42:56.627718 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:56 crc kubenswrapper[5033]: E0226 15:42:56.627813 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:56 crc kubenswrapper[5033]: E0226 15:42:56.628172 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.628419 5033 scope.go:117] "RemoveContainer" containerID="3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b" Feb 26 15:42:56 crc kubenswrapper[5033]: E0226 15:42:56.628582 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.697491 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.697561 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.697574 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.697599 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.697613 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:56Z","lastTransitionTime":"2026-02-26T15:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.800741 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.800786 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.800799 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.800819 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.800836 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:56Z","lastTransitionTime":"2026-02-26T15:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.903190 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.903215 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.903223 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.903237 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:56 crc kubenswrapper[5033]: I0226 15:42:56.903247 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:56Z","lastTransitionTime":"2026-02-26T15:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.005600 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.005660 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.005668 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.005681 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.005691 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:57Z","lastTransitionTime":"2026-02-26T15:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.108887 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.108987 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.109007 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.109032 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.109050 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:57Z","lastTransitionTime":"2026-02-26T15:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.214317 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.214383 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.214403 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.214435 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.214457 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:57Z","lastTransitionTime":"2026-02-26T15:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.256356 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerStarted","Data":"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.261668 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.270941 5033 generic.go:334] "Generic (PLEG): container finished" podID="9afdf68c-a47a-4700-abc9-b877d7220836" containerID="d090bbce4493ee8ef8d6e3a488d1ecb93560d67be0045a6ec50b92506a33bf59" exitCode=0 Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.271010 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" event={"ID":"9afdf68c-a47a-4700-abc9-b877d7220836","Type":"ContainerDied","Data":"d090bbce4493ee8ef8d6e3a488d1ecb93560d67be0045a6ec50b92506a33bf59"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.302454 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" podStartSLOduration=42.302424041 podStartE2EDuration="42.302424041s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:42:57.301178921 +0000 UTC m=+104.513777621" watchObservedRunningTime="2026-02-26 15:42:57.302424041 +0000 UTC m=+104.515022691" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.318066 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.318130 5033 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.318153 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.318185 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.318208 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:57Z","lastTransitionTime":"2026-02-26T15:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.321907 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.425248 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.425973 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.426060 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.426129 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.426146 5033 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:57Z","lastTransitionTime":"2026-02-26T15:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.530604 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.530665 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.530678 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.530695 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.530707 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:57Z","lastTransitionTime":"2026-02-26T15:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.629101 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:57 crc kubenswrapper[5033]: E0226 15:42:57.629249 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4bq8z" podUID="857e0324-95f6-4524-b676-8919c08e97ca" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.634968 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.635005 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.635016 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.635031 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.635044 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:57Z","lastTransitionTime":"2026-02-26T15:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.737286 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.737592 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.737723 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.737857 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.737991 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:57Z","lastTransitionTime":"2026-02-26T15:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.840963 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.841379 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.841597 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.841837 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.842050 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:57Z","lastTransitionTime":"2026-02-26T15:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.945494 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.945796 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.945936 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.946070 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:57 crc kubenswrapper[5033]: I0226 15:42:57.946199 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:57Z","lastTransitionTime":"2026-02-26T15:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.048646 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.048865 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.048975 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.049106 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.049222 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:58Z","lastTransitionTime":"2026-02-26T15:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.152670 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.152744 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.152766 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.152793 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.152813 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:58Z","lastTransitionTime":"2026-02-26T15:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.256561 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.256631 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.256649 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.256737 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.256764 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:58Z","lastTransitionTime":"2026-02-26T15:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.280747 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" event={"ID":"9afdf68c-a47a-4700-abc9-b877d7220836","Type":"ContainerStarted","Data":"a16cc3055617a946a74129ae1aaf90cdadaaed1dd96fd0ac164d05108511df23"} Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.281111 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.281162 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.313825 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.315086 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hnlmd" podStartSLOduration=43.315024436 podStartE2EDuration="43.315024436s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:42:58.313194511 +0000 UTC m=+105.525793191" watchObservedRunningTime="2026-02-26 15:42:58.315024436 +0000 UTC m=+105.527623126" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.359856 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.359947 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.359973 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.360004 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.360088 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:58Z","lastTransitionTime":"2026-02-26T15:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.469184 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.469226 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.469235 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.469251 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.469275 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:58Z","lastTransitionTime":"2026-02-26T15:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.571904 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.571937 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.571946 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.571960 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.571969 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:58Z","lastTransitionTime":"2026-02-26T15:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.627010 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:42:58 crc kubenswrapper[5033]: E0226 15:42:58.627127 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.627453 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:42:58 crc kubenswrapper[5033]: E0226 15:42:58.627502 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.627524 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:42:58 crc kubenswrapper[5033]: E0226 15:42:58.627561 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.674709 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.674761 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.674773 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.674792 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.674804 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:58Z","lastTransitionTime":"2026-02-26T15:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.777548 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.777582 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.777590 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.777605 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.777615 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:58Z","lastTransitionTime":"2026-02-26T15:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.879832 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.879873 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.879885 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.879903 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.879915 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:58Z","lastTransitionTime":"2026-02-26T15:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.983075 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.983121 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.983135 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.983152 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:58 crc kubenswrapper[5033]: I0226 15:42:58.983164 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:58Z","lastTransitionTime":"2026-02-26T15:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.086165 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.086231 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.086249 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.086299 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.086317 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:59Z","lastTransitionTime":"2026-02-26T15:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.122608 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4bq8z"] Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.123164 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:59 crc kubenswrapper[5033]: E0226 15:42:59.123612 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4bq8z" podUID="857e0324-95f6-4524-b676-8919c08e97ca" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.189770 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.189826 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.189843 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.189869 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.189888 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:59Z","lastTransitionTime":"2026-02-26T15:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.293116 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.294010 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.294171 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.294370 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.294528 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:59Z","lastTransitionTime":"2026-02-26T15:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.397618 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.397702 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.397730 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.397762 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.397790 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:59Z","lastTransitionTime":"2026-02-26T15:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.500971 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.501041 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.501064 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.501092 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.501113 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:59Z","lastTransitionTime":"2026-02-26T15:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.603489 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.603546 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.603562 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.603585 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.603600 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:59Z","lastTransitionTime":"2026-02-26T15:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.705714 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.705772 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.705786 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.705804 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.705817 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:59Z","lastTransitionTime":"2026-02-26T15:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.753418 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs\") pod \"network-metrics-daemon-4bq8z\" (UID: \"857e0324-95f6-4524-b676-8919c08e97ca\") " pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:42:59 crc kubenswrapper[5033]: E0226 15:42:59.753549 5033 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:42:59 crc kubenswrapper[5033]: E0226 15:42:59.753623 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs podName:857e0324-95f6-4524-b676-8919c08e97ca nodeName:}" failed. No retries permitted until 2026-02-26 15:43:07.753603033 +0000 UTC m=+114.966201683 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs") pod "network-metrics-daemon-4bq8z" (UID: "857e0324-95f6-4524-b676-8919c08e97ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.808135 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.808176 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.808188 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.808203 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.808213 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:59Z","lastTransitionTime":"2026-02-26T15:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.910801 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.910833 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.910841 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.910858 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:42:59 crc kubenswrapper[5033]: I0226 15:42:59.910869 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:42:59Z","lastTransitionTime":"2026-02-26T15:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.013972 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.014031 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.014044 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.014060 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.014071 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:00Z","lastTransitionTime":"2026-02-26T15:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.117046 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.117099 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.117110 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.117124 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.117134 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:00Z","lastTransitionTime":"2026-02-26T15:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.219716 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.219782 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.219801 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.219827 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.219846 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:00Z","lastTransitionTime":"2026-02-26T15:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.322935 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.322977 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.322986 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.323001 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.323014 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:00Z","lastTransitionTime":"2026-02-26T15:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.425534 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.425617 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.425630 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.425649 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.425663 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:00Z","lastTransitionTime":"2026-02-26T15:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.529063 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.529120 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.529133 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.529153 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.529168 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:00Z","lastTransitionTime":"2026-02-26T15:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.627565 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:43:00 crc kubenswrapper[5033]: E0226 15:43:00.627723 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.627803 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.627850 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:43:00 crc kubenswrapper[5033]: E0226 15:43:00.628062 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.627868 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:43:00 crc kubenswrapper[5033]: E0226 15:43:00.628154 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:43:00 crc kubenswrapper[5033]: E0226 15:43:00.628190 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4bq8z" podUID="857e0324-95f6-4524-b676-8919c08e97ca" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.632771 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.632820 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.632837 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.632858 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.632878 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:00Z","lastTransitionTime":"2026-02-26T15:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.735751 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.735798 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.735807 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.735821 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.735830 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:00Z","lastTransitionTime":"2026-02-26T15:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.839043 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.839109 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.839135 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.839209 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.839235 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:00Z","lastTransitionTime":"2026-02-26T15:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.942443 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.942500 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.942517 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.942542 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:00 crc kubenswrapper[5033]: I0226 15:43:00.942558 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:00Z","lastTransitionTime":"2026-02-26T15:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.054443 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.054515 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.054533 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.054558 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.054582 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:01Z","lastTransitionTime":"2026-02-26T15:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.157320 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.157630 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.157729 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.157826 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.157919 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:01Z","lastTransitionTime":"2026-02-26T15:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.267164 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.267233 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.267249 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.267310 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.267323 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:01Z","lastTransitionTime":"2026-02-26T15:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.370809 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.372085 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.372242 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.372447 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.372614 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:01Z","lastTransitionTime":"2026-02-26T15:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.475738 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.475799 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.475817 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.475842 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.475861 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:01Z","lastTransitionTime":"2026-02-26T15:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.578738 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.578817 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.578842 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.578874 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.578896 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:01Z","lastTransitionTime":"2026-02-26T15:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.681043 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.681078 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.681086 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.681102 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.681111 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:01Z","lastTransitionTime":"2026-02-26T15:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.684467 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.684577 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.684673 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.684745 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.684816 5033 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:01Z","lastTransitionTime":"2026-02-26T15:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.746507 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv"] Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.747078 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.749624 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.749826 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.749894 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.750063 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.776146 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d47c7563-042b-4112-97d5-cd6bde999c8e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.776235 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d47c7563-042b-4112-97d5-cd6bde999c8e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.776373 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d47c7563-042b-4112-97d5-cd6bde999c8e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.776410 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d47c7563-042b-4112-97d5-cd6bde999c8e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.776448 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d47c7563-042b-4112-97d5-cd6bde999c8e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.877726 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d47c7563-042b-4112-97d5-cd6bde999c8e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.878066 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d47c7563-042b-4112-97d5-cd6bde999c8e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.878184 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d47c7563-042b-4112-97d5-cd6bde999c8e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.878405 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d47c7563-042b-4112-97d5-cd6bde999c8e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.878548 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d47c7563-042b-4112-97d5-cd6bde999c8e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.877889 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d47c7563-042b-4112-97d5-cd6bde999c8e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.878304 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/d47c7563-042b-4112-97d5-cd6bde999c8e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.915209 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d47c7563-042b-4112-97d5-cd6bde999c8e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.921442 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d47c7563-042b-4112-97d5-cd6bde999c8e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.922410 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d47c7563-042b-4112-97d5-cd6bde999c8e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kd5sv\" (UID: \"d47c7563-042b-4112-97d5-cd6bde999c8e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.975368 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 26 15:43:01 crc kubenswrapper[5033]: I0226 15:43:01.985887 5033 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.067120 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" Feb 26 15:43:02 crc kubenswrapper[5033]: W0226 15:43:02.083093 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd47c7563_042b_4112_97d5_cd6bde999c8e.slice/crio-693c07f34563f7880d0042d9f9357ad15aae987fe41b501ff48502bfa30e9e05 WatchSource:0}: Error finding container 693c07f34563f7880d0042d9f9357ad15aae987fe41b501ff48502bfa30e9e05: Status 404 returned error can't find the container with id 693c07f34563f7880d0042d9f9357ad15aae987fe41b501ff48502bfa30e9e05 Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.296077 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" event={"ID":"d47c7563-042b-4112-97d5-cd6bde999c8e","Type":"ContainerStarted","Data":"693c07f34563f7880d0042d9f9357ad15aae987fe41b501ff48502bfa30e9e05"} Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.505946 5033 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.506193 5033 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.561734 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rgjzx"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.562674 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.565020 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.565589 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.570720 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.571845 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.573404 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.573536 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.573588 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.574419 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.574481 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.574522 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.574755 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.574938 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.575101 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.575122 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.575157 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.577974 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7l2wv"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.578772 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.580164 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bdlcv"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.580688 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.580948 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.581193 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.585225 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gs8vj"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.593667 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.587068 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-image-import-ca\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595068 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba76c4b-90a7-4c05-9994-ef6275950675-config\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595107 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ba76c4b-90a7-4c05-9994-ef6275950675-etcd-service-ca\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595150 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/76657875-63c5-479a-86ef-abd020c2e4ed-auth-proxy-config\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595176 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-etcd-serving-ca\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595217 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aaf15fb-2839-42d5-8628-341eed11eee8-serving-cert\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595273 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44efad7-1dfe-42f1-9b37-483b29e3e4b0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lgwp2\" (UID: \"a44efad7-1dfe-42f1-9b37-483b29e3e4b0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595312 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-config\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595342 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76657875-63c5-479a-86ef-abd020c2e4ed-config\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595371 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp8xr\" (UniqueName: \"kubernetes.io/projected/0ba76c4b-90a7-4c05-9994-ef6275950675-kube-api-access-pp8xr\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595403 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4aaf15fb-2839-42d5-8628-341eed11eee8-node-pullsecrets\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595437 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595485 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-audit\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595508 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4aaf15fb-2839-42d5-8628-341eed11eee8-etcd-client\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 
crc kubenswrapper[5033]: I0226 15:43:02.595561 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/76657875-63c5-479a-86ef-abd020c2e4ed-machine-approver-tls\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595604 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ba76c4b-90a7-4c05-9994-ef6275950675-serving-cert\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595629 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm2cw\" (UniqueName: \"kubernetes.io/projected/a44efad7-1dfe-42f1-9b37-483b29e3e4b0-kube-api-access-rm2cw\") pod \"openshift-apiserver-operator-796bbdcf4f-lgwp2\" (UID: \"a44efad7-1dfe-42f1-9b37-483b29e3e4b0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595655 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4aaf15fb-2839-42d5-8628-341eed11eee8-encryption-config\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595714 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4aaf15fb-2839-42d5-8628-341eed11eee8-audit-dir\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595777 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0ba76c4b-90a7-4c05-9994-ef6275950675-etcd-client\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595804 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8d8h\" (UniqueName: \"kubernetes.io/projected/76657875-63c5-479a-86ef-abd020c2e4ed-kube-api-access-p8d8h\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595829 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44efad7-1dfe-42f1-9b37-483b29e3e4b0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lgwp2\" (UID: \"a44efad7-1dfe-42f1-9b37-483b29e3e4b0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.595857 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgqvb\" (UniqueName: \"kubernetes.io/projected/4aaf15fb-2839-42d5-8628-341eed11eee8-kube-api-access-rgqvb\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc 
kubenswrapper[5033]: I0226 15:43:02.595884 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0ba76c4b-90a7-4c05-9994-ef6275950675-etcd-ca\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.599010 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.599602 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-b6z8t"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.590336 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.586606 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.599995 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.594948 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.600037 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.601354 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.601402 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.601812 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.601691 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.601756 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.605552 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.605909 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.606182 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.606465 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.606772 5033 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.606825 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.606869 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.607126 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.607168 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.609952 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.610200 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.610767 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.611654 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.612540 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.613422 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cnhpr"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.614094 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.628425 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.641628 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.642218 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.642371 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nq49v"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.642659 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.642864 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.642951 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.643173 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 26 15:43:02 crc 
kubenswrapper[5033]: I0226 15:43:02.643321 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.643404 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.643435 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.643495 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.643700 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.643873 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.644541 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.644578 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.644797 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.644817 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.649250 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.649507 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.649568 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.649646 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.649681 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.649696 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.649814 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.651390 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.652540 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.652989 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 
15:43:02.664284 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.664519 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.664608 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.665082 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.666286 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8l772"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.666792 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8l772" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.666876 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.666964 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.667030 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.667088 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.668167 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" 
Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.672241 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.672317 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.672508 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.672587 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.672718 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.672857 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.672989 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.673002 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.673086 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.673171 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.673184 5033 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.673318 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.673455 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.673534 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.673698 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.675727 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.675896 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.676003 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.676720 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.676760 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jdtrx"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.677208 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.677438 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.677435 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.677487 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.677519 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.677561 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.678185 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.678403 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.678571 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.678870 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.681631 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.681999 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p4fzm"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.682484 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.682850 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.684464 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.684696 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.685079 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.685102 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.685499 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.685678 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.687249 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.687906 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.688105 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.689174 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.689203 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.689400 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.689421 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.689645 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.689756 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.689850 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.689934 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.689816 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.690118 5033 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.690541 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.690696 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.691034 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.691081 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.691106 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.691166 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.691168 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.703819 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-l4bg7"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.710002 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.711964 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.713498 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.713630 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ba76c4b-90a7-4c05-9994-ef6275950675-serving-cert\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.713779 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.713991 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm2cw\" (UniqueName: \"kubernetes.io/projected/a44efad7-1dfe-42f1-9b37-483b29e3e4b0-kube-api-access-rm2cw\") pod \"openshift-apiserver-operator-796bbdcf4f-lgwp2\" (UID: \"a44efad7-1dfe-42f1-9b37-483b29e3e4b0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.714096 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4aaf15fb-2839-42d5-8628-341eed11eee8-encryption-config\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.714192 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4aaf15fb-2839-42d5-8628-341eed11eee8-audit-dir\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.714317 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.714418 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0ba76c4b-90a7-4c05-9994-ef6275950675-etcd-client\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.714560 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8d8h\" (UniqueName: \"kubernetes.io/projected/76657875-63c5-479a-86ef-abd020c2e4ed-kube-api-access-p8d8h\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.714643 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc 
kubenswrapper[5033]: I0226 15:43:02.714745 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44efad7-1dfe-42f1-9b37-483b29e3e4b0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lgwp2\" (UID: \"a44efad7-1dfe-42f1-9b37-483b29e3e4b0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.714828 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgqvb\" (UniqueName: \"kubernetes.io/projected/4aaf15fb-2839-42d5-8628-341eed11eee8-kube-api-access-rgqvb\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.718472 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.724715 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0ba76c4b-90a7-4c05-9994-ef6275950675-etcd-ca\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.724780 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e8507b4-c6ad-476a-b2a6-71c17ad0014c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7lvqc\" (UID: \"0e8507b4-c6ad-476a-b2a6-71c17ad0014c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.725653 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4aaf15fb-2839-42d5-8628-341eed11eee8-audit-dir\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.726278 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0ba76c4b-90a7-4c05-9994-ef6275950675-etcd-ca\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.727130 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-image-import-ca\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.727227 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.727682 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ba76c4b-90a7-4c05-9994-ef6275950675-serving-cert\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.727736 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728020 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-g7xhm"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728419 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728465 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728599 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4aaf15fb-2839-42d5-8628-341eed11eee8-encryption-config\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728703 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba76c4b-90a7-4c05-9994-ef6275950675-config\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728770 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ba76c4b-90a7-4c05-9994-ef6275950675-etcd-service-ca\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728800 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-etcd-serving-ca\") pod \"apiserver-76f77b778f-rgjzx\" (UID: 
\"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728820 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/76657875-63c5-479a-86ef-abd020c2e4ed-auth-proxy-config\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728845 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k49w\" (UniqueName: \"kubernetes.io/projected/09c96f57-a56c-4519-8101-b3aa96065974-kube-api-access-7k49w\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728871 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh9wk\" (UniqueName: \"kubernetes.io/projected/ecb847f9-d250-4d91-b74c-104acafc26fd-kube-api-access-lh9wk\") pod \"migrator-59844c95c7-zndq6\" (UID: \"ecb847f9-d250-4d91-b74c-104acafc26fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728879 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729135 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729154 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba76c4b-90a7-4c05-9994-ef6275950675-config\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728803 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7dpkv"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729295 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729394 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ba76c4b-90a7-4c05-9994-ef6275950675-etcd-service-ca\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729434 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.728887 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729491 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729515 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aaf15fb-2839-42d5-8628-341eed11eee8-serving-cert\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729535 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44efad7-1dfe-42f1-9b37-483b29e3e4b0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lgwp2\" (UID: \"a44efad7-1dfe-42f1-9b37-483b29e3e4b0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729559 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-config\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729578 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76657875-63c5-479a-86ef-abd020c2e4ed-config\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729595 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09c96f57-a56c-4519-8101-b3aa96065974-audit-dir\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729617 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp8xr\" (UniqueName: \"kubernetes.io/projected/0ba76c4b-90a7-4c05-9994-ef6275950675-kube-api-access-pp8xr\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729636 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4aaf15fb-2839-42d5-8628-341eed11eee8-node-pullsecrets\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729657 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729674 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729696 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-audit\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729712 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4aaf15fb-2839-42d5-8628-341eed11eee8-etcd-client\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729730 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729733 
5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/76657875-63c5-479a-86ef-abd020c2e4ed-auth-proxy-config\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729746 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e8507b4-c6ad-476a-b2a6-71c17ad0014c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7lvqc\" (UID: \"0e8507b4-c6ad-476a-b2a6-71c17ad0014c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729784 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-audit-policies\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729832 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbfg2\" (UniqueName: \"kubernetes.io/projected/0e8507b4-c6ad-476a-b2a6-71c17ad0014c-kube-api-access-vbfg2\") pod \"kube-storage-version-migrator-operator-b67b599dd-7lvqc\" (UID: \"0e8507b4-c6ad-476a-b2a6-71c17ad0014c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.729967 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.730178 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76657875-63c5-479a-86ef-abd020c2e4ed-config\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.730250 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.730286 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.730312 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.730330 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.730380 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/76657875-63c5-479a-86ef-abd020c2e4ed-machine-approver-tls\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.730334 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-image-import-ca\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.730534 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44efad7-1dfe-42f1-9b37-483b29e3e4b0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lgwp2\" (UID: \"a44efad7-1dfe-42f1-9b37-483b29e3e4b0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.730594 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-audit\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " 
pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.730642 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4aaf15fb-2839-42d5-8628-341eed11eee8-node-pullsecrets\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.731213 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44efad7-1dfe-42f1-9b37-483b29e3e4b0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lgwp2\" (UID: \"a44efad7-1dfe-42f1-9b37-483b29e3e4b0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.731353 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.731559 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-etcd-serving-ca\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.731659 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8mgvk"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.731822 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4aaf15fb-2839-42d5-8628-341eed11eee8-config\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.732322 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.732493 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/76657875-63c5-479a-86ef-abd020c2e4ed-machine-approver-tls\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.732542 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.732992 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.733005 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.733368 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.733875 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aaf15fb-2839-42d5-8628-341eed11eee8-serving-cert\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.734193 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.734562 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.734612 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.735514 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wmk55"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.735977 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.736580 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.737053 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4aaf15fb-2839-42d5-8628-341eed11eee8-etcd-client\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.737365 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.737417 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.737519 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0ba76c4b-90a7-4c05-9994-ef6275950675-etcd-client\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.738096 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.742350 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sql8k"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.743141 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.744877 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-b6z8t"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.745083 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.746081 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rgjzx"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.747518 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7l2wv"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.748746 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.750023 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nq49v"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.751373 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gs8vj"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.752820 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.754082 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.755650 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bdlcv"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 
15:43:02.758054 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.759565 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-tz87k"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.760498 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.760734 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.761857 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.763203 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.764440 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.764447 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.765614 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p4fzm"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.767143 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.769406 5033 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jdtrx"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.770933 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.784617 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.790743 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.792554 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.797786 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.799881 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sql8k"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.800680 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.802587 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8l772"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.802655 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cnhpr"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.803664 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-g7xhm"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.804666 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7dpkv"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.805633 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wmk55"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.806237 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.806564 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.808272 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.809629 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8bjnz"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.810526 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.810637 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8bjnz" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.812643 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.815372 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.815434 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8bjnz"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.815451 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.816511 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.817497 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8mgvk"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.818587 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fb82g"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.819487 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.819871 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fkt29"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.821038 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fb82g"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.821195 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.821713 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fkt29"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.825214 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831193 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh9wk\" (UniqueName: \"kubernetes.io/projected/ecb847f9-d250-4d91-b74c-104acafc26fd-kube-api-access-lh9wk\") pod \"migrator-59844c95c7-zndq6\" (UID: \"ecb847f9-d250-4d91-b74c-104acafc26fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831326 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831430 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831531 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/09c96f57-a56c-4519-8101-b3aa96065974-audit-dir\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831613 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09c96f57-a56c-4519-8101-b3aa96065974-audit-dir\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831629 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831685 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831719 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e8507b4-c6ad-476a-b2a6-71c17ad0014c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7lvqc\" (UID: \"0e8507b4-c6ad-476a-b2a6-71c17ad0014c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" Feb 26 15:43:02 crc 
kubenswrapper[5033]: I0226 15:43:02.831763 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-audit-policies\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831787 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbfg2\" (UniqueName: \"kubernetes.io/projected/0e8507b4-c6ad-476a-b2a6-71c17ad0014c-kube-api-access-vbfg2\") pod \"kube-storage-version-migrator-operator-b67b599dd-7lvqc\" (UID: \"0e8507b4-c6ad-476a-b2a6-71c17ad0014c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831813 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831839 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831859 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831879 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831921 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.831985 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.832035 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.832065 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e8507b4-c6ad-476a-b2a6-71c17ad0014c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7lvqc\" (UID: \"0e8507b4-c6ad-476a-b2a6-71c17ad0014c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.832097 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k49w\" (UniqueName: \"kubernetes.io/projected/09c96f57-a56c-4519-8101-b3aa96065974-kube-api-access-7k49w\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.832924 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-audit-policies\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.842416 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.843619 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.843770 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.843838 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.845454 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.847200 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.857101 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.881048 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.887221 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.893745 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-rx9pn"] Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.896224 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rx9pn" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.904886 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.912964 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.916374 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.925701 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.925770 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.934004 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc 
kubenswrapper[5033]: I0226 15:43:02.957588 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.964530 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.965111 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 26 15:43:02 crc kubenswrapper[5033]: I0226 15:43:02.985791 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.005596 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.025367 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.045009 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.065246 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.084857 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.106135 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.125716 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.136101 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e8507b4-c6ad-476a-b2a6-71c17ad0014c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7lvqc\" (UID: \"0e8507b4-c6ad-476a-b2a6-71c17ad0014c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.146571 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.153513 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e8507b4-c6ad-476a-b2a6-71c17ad0014c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7lvqc\" (UID: \"0e8507b4-c6ad-476a-b2a6-71c17ad0014c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.165940 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.185448 5033 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.205467 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.226179 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.246069 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.265071 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.285837 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.301854 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" event={"ID":"d47c7563-042b-4112-97d5-cd6bde999c8e","Type":"ContainerStarted","Data":"c61839f3540004936e125f4994a4d629f85fbf1d918c6237ad7536c05130f776"} Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.305524 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.345931 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.378345 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 
15:43:03.386194 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.407795 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.426519 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.445798 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.505913 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm2cw\" (UniqueName: \"kubernetes.io/projected/a44efad7-1dfe-42f1-9b37-483b29e3e4b0-kube-api-access-rm2cw\") pod \"openshift-apiserver-operator-796bbdcf4f-lgwp2\" (UID: \"a44efad7-1dfe-42f1-9b37-483b29e3e4b0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.524377 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8d8h\" (UniqueName: \"kubernetes.io/projected/76657875-63c5-479a-86ef-abd020c2e4ed-kube-api-access-p8d8h\") pod \"machine-approver-56656f9798-pb8px\" (UID: \"76657875-63c5-479a-86ef-abd020c2e4ed\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.525807 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.525868 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.546035 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.546581 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: W0226 15:43:03.561997 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76657875_63c5_479a_86ef_abd020c2e4ed.slice/crio-aac8acb3a8dda3d2ba7e4d126e986f2d6e335497d6817230e4534d3de7c44238 WatchSource:0}: Error finding container aac8acb3a8dda3d2ba7e4d126e986f2d6e335497d6817230e4534d3de7c44238: Status 404 returned error can't find the container with id aac8acb3a8dda3d2ba7e4d126e986f2d6e335497d6817230e4534d3de7c44238 Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.565184 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.587302 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.605700 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.627042 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.645877 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.682295 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgqvb\" (UniqueName: \"kubernetes.io/projected/4aaf15fb-2839-42d5-8628-341eed11eee8-kube-api-access-rgqvb\") pod \"apiserver-76f77b778f-rgjzx\" (UID: \"4aaf15fb-2839-42d5-8628-341eed11eee8\") " pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.685399 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.705019 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.725230 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.746470 5033 request.go:700] Waited for 1.017697467s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver-operator/configmaps?fieldSelector=metadata.name%3Dkube-apiserver-operator-config&limit=500&resourceVersion=0 Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.748391 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.766231 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.785557 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.805641 5033 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.810400 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.812794 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2"] Feb 26 15:43:03 crc kubenswrapper[5033]: W0226 15:43:03.821806 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda44efad7_1dfe_42f1_9b37_483b29e3e4b0.slice/crio-0aa1ad97ebb6640708b1629fc0e1ab10b94072e8fc95208f1ec5b4e6c3ae40f8 WatchSource:0}: Error finding container 0aa1ad97ebb6640708b1629fc0e1ab10b94072e8fc95208f1ec5b4e6c3ae40f8: Status 404 returned error can't find the container with id 0aa1ad97ebb6640708b1629fc0e1ab10b94072e8fc95208f1ec5b4e6c3ae40f8 Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.824820 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.845531 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.865592 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.886701 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.908042 5033 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.927322 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.947393 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.966209 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 26 15:43:03 crc kubenswrapper[5033]: I0226 15:43:03.986043 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.005235 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.021725 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rgjzx"] Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.025157 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 26 15:43:04 crc kubenswrapper[5033]: W0226 15:43:04.033107 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aaf15fb_2839_42d5_8628_341eed11eee8.slice/crio-ed2a82a38b46723a73ab10bafb0e0ebd3b5a1c2fd42581946485f38e615b6ba9 WatchSource:0}: Error finding container ed2a82a38b46723a73ab10bafb0e0ebd3b5a1c2fd42581946485f38e615b6ba9: Status 404 returned error can't find the container with id ed2a82a38b46723a73ab10bafb0e0ebd3b5a1c2fd42581946485f38e615b6ba9 Feb 26 15:43:04 crc 
kubenswrapper[5033]: I0226 15:43:04.047179 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.065490 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.086190 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.124735 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.127998 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp8xr\" (UniqueName: \"kubernetes.io/projected/0ba76c4b-90a7-4c05-9994-ef6275950675-kube-api-access-pp8xr\") pod \"etcd-operator-b45778765-7l2wv\" (UID: \"0ba76c4b-90a7-4c05-9994-ef6275950675\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.151284 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.166037 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.188193 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.188745 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.207508 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.227595 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.246170 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.274044 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.286179 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.304961 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.313850 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" event={"ID":"4aaf15fb-2839-42d5-8628-341eed11eee8","Type":"ContainerStarted","Data":"9a69206ee4e1696a7c727da825531d5a4c08c718491d524e659c531b33467619"} Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.313888 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" 
event={"ID":"4aaf15fb-2839-42d5-8628-341eed11eee8","Type":"ContainerStarted","Data":"ed2a82a38b46723a73ab10bafb0e0ebd3b5a1c2fd42581946485f38e615b6ba9"}
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.315686 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" event={"ID":"a44efad7-1dfe-42f1-9b37-483b29e3e4b0","Type":"ContainerStarted","Data":"575b750d8a01620bbb4f9905ef6b5c24325e4c9e9844817668ebfbe1ac53b66a"}
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.315707 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" event={"ID":"a44efad7-1dfe-42f1-9b37-483b29e3e4b0","Type":"ContainerStarted","Data":"0aa1ad97ebb6640708b1629fc0e1ab10b94072e8fc95208f1ec5b4e6c3ae40f8"}
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.317351 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" event={"ID":"76657875-63c5-479a-86ef-abd020c2e4ed","Type":"ContainerStarted","Data":"116f4baf83a7a3dc5690ca4cca6460c9ffb87cf8432288129fcbd444227301f4"}
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.317385 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" event={"ID":"76657875-63c5-479a-86ef-abd020c2e4ed","Type":"ContainerStarted","Data":"b528fe4cad0e9e7dd23a6e49c09a4825fdaa7eb759ec6d5bd0e9f522b0b7f584"}
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.317403 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" event={"ID":"76657875-63c5-479a-86ef-abd020c2e4ed","Type":"ContainerStarted","Data":"aac8acb3a8dda3d2ba7e4d126e986f2d6e335497d6817230e4534d3de7c44238"}
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.326690 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.345836 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.365231 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.369823 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7l2wv"]
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.386100 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 26 15:43:04 crc kubenswrapper[5033]: W0226 15:43:04.387707 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ba76c4b_90a7_4c05_9994_ef6275950675.slice/crio-5e9bc5590eb664470e6432a947788f660badf3ec27a0c54b43b69b4ad7702cde WatchSource:0}: Error finding container 5e9bc5590eb664470e6432a947788f660badf3ec27a0c54b43b69b4ad7702cde: Status 404 returned error can't find the container with id 5e9bc5590eb664470e6432a947788f660badf3ec27a0c54b43b69b4ad7702cde
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.406273 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.425833 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.445764 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.465538 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.485176 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.505458 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.531409 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.546078 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.567428 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.586129 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.605217 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.625695 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.645806 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.666292 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.686001 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.706595 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.727807 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.748859 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.763478 5033 request.go:700] Waited for 1.941982226s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.764719 5033 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.785078 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.825852 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh9wk\" (UniqueName: \"kubernetes.io/projected/ecb847f9-d250-4d91-b74c-104acafc26fd-kube-api-access-lh9wk\") pod \"migrator-59844c95c7-zndq6\" (UID: \"ecb847f9-d250-4d91-b74c-104acafc26fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.850845 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbfg2\" (UniqueName: \"kubernetes.io/projected/0e8507b4-c6ad-476a-b2a6-71c17ad0014c-kube-api-access-vbfg2\") pod \"kube-storage-version-migrator-operator-b67b599dd-7lvqc\" (UID: \"0e8507b4-c6ad-476a-b2a6-71c17ad0014c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.858877 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k49w\" (UniqueName: \"kubernetes.io/projected/09c96f57-a56c-4519-8101-b3aa96065974-kube-api-access-7k49w\") pod \"oauth-openshift-558db77b4-jdtrx\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") " pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.865401 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.885574 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.905686 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.926351 5033 ???:1] "http: TLS handshake error from 192.168.126.11:53508: no serving certificate available for the kubelet"
Feb 26 15:43:04 crc kubenswrapper[5033]: I0226 15:43:04.965508 5033 ???:1] "http: TLS handshake error from 192.168.126.11:53510: no serving certificate available for the kubelet"
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.020138 5033 ???:1] "http: TLS handshake error from 192.168.126.11:53524: no serving certificate available for the kubelet"
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.118500 5033 ???:1] "http: TLS handshake error from 192.168.126.11:53526: no serving certificate available for the kubelet"
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.223896 5033 ???:1] "http: TLS handshake error from 192.168.126.11:53532: no serving certificate available for the kubelet"
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.324721 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" event={"ID":"0ba76c4b-90a7-4c05-9994-ef6275950675","Type":"ContainerStarted","Data":"693abf2c2dac69180128b4fdba636ec65d4a6ba5341b05058c7472efb40ae7bd"}
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.324761 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" event={"ID":"0ba76c4b-90a7-4c05-9994-ef6275950675","Type":"ContainerStarted","Data":"5e9bc5590eb664470e6432a947788f660badf3ec27a0c54b43b69b4ad7702cde"}
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.326539 5033 generic.go:334] "Generic (PLEG): container finished" podID="4aaf15fb-2839-42d5-8628-341eed11eee8" containerID="9a69206ee4e1696a7c727da825531d5a4c08c718491d524e659c531b33467619" exitCode=0
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.326889 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" event={"ID":"4aaf15fb-2839-42d5-8628-341eed11eee8","Type":"ContainerDied","Data":"9a69206ee4e1696a7c727da825531d5a4c08c718491d524e659c531b33467619"}
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.332396 5033 ???:1] "http: TLS handshake error from 192.168.126.11:53534: no serving certificate available for the kubelet"
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.422376 5033 ???:1] "http: TLS handshake error from 192.168.126.11:53550: no serving certificate available for the kubelet"
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.525116 5033 ???:1] "http: TLS handshake error from 192.168.126.11:53564: no serving certificate available for the kubelet"
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.540437 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgwp2" podStartSLOduration=50.540414123 podStartE2EDuration="50.540414123s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:05.538065828 +0000 UTC m=+112.750664488" watchObservedRunningTime="2026-02-26 15:43:05.540414123 +0000 UTC m=+112.753012783"
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.666221 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kd5sv" podStartSLOduration=50.666191636 podStartE2EDuration="50.666191636s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:05.658921032 +0000 UTC m=+112.871519732" watchObservedRunningTime="2026-02-26 15:43:05.666191636 +0000 UTC m=+112.878790306"
Feb 26 15:43:05 crc kubenswrapper[5033]: I0226 15:43:05.905375 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pb8px" podStartSLOduration=50.905347815 podStartE2EDuration="50.905347815s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:05.902794703 +0000 UTC m=+113.115393413" watchObservedRunningTime="2026-02-26 15:43:05.905347815 +0000 UTC m=+113.117946495"
Feb 26 15:43:06 crc kubenswrapper[5033]: I0226 15:43:06.194011 5033 ???:1] "http: TLS handshake error from 192.168.126.11:53580: no serving certificate available for the kubelet"
Feb 26 15:43:06 crc kubenswrapper[5033]: I0226 15:43:06.297547 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-7l2wv" podStartSLOduration=51.297529268 podStartE2EDuration="51.297529268s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:06.295167402 +0000 UTC m=+113.507766082" watchObservedRunningTime="2026-02-26 15:43:06.297529268 +0000 UTC m=+113.510127938"
Feb 26 15:43:07 crc kubenswrapper[5033]: I0226 15:43:07.495010 5033 ???:1] "http: TLS handshake error from 192.168.126.11:53584: no serving certificate available for the kubelet"
Feb 26 15:43:10 crc kubenswrapper[5033]: I0226 15:43:10.080876 5033 ???:1] "http: TLS handshake error from 192.168.126.11:51192: no serving certificate available for the kubelet"
Feb 26 15:43:11 crc kubenswrapper[5033]: I0226 15:43:11.638608 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Feb 26 15:43:13 crc kubenswrapper[5033]: I0226 15:43:13.656168 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=2.656141148 podStartE2EDuration="2.656141148s" podCreationTimestamp="2026-02-26 15:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:13.653418813 +0000 UTC m=+120.866017473" watchObservedRunningTime="2026-02-26 15:43:13.656141148 +0000 UTC m=+120.868739838"
Feb 26 15:43:15 crc kubenswrapper[5033]: I0226 15:43:15.228932 5033 ???:1] "http: TLS handshake error from 192.168.126.11:51196: no serving certificate available for the kubelet"
Feb 26 15:43:20 crc kubenswrapper[5033]: I0226 15:43:20.576729 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.422997 5033 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.431265 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.432487 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.433888 5033 scope.go:117] "RemoveContainer" containerID="3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.434205 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.437113 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:21 crc kubenswrapper[5033]: E0226 15:43:21.438855 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:53.438812414 +0000 UTC m=+160.651411074 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.438969 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.439096 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.439316 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.440688 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.440768 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs\") pod \"network-metrics-daemon-4bq8z\" (UID: \"857e0324-95f6-4524-b676-8919c08e97ca\") " pod="openshift-multus/network-metrics-daemon-4bq8z"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.440807 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.442318 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:21 crc kubenswrapper[5033]: E0226 15:43:21.443230 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:21.943217519 +0000 UTC m=+129.155816179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.443292 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.443441 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-tls\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.445072 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dcc382cb-23f7-44b0-89a2-43c41292c21e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.445603 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.445781 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.446152 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.446505 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-certificates\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.446555 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.450110 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.453239 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.454292 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.458482 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/857e0324-95f6-4524-b676-8919c08e97ca-metrics-certs\") pod \"network-metrics-daemon-4bq8z\" (UID: \"857e0324-95f6-4524-b676-8919c08e97ca\") " pod="openshift-multus/network-metrics-daemon-4bq8z"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.464059 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.468160 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.468867 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.548076 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:21 crc kubenswrapper[5033]: E0226 15:43:21.548171 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:22.048150923 +0000 UTC m=+129.260749573 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.548860 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-client-ca\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.548910 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x44jf\" (UniqueName: \"kubernetes.io/projected/ef433248-3b40-4b06-86ff-f9fd9377c779-kube-api-access-x44jf\") pod \"console-operator-58897d9998-gs8vj\" (UID: \"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.548940 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-images\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.548964 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-oauth-config\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.548987 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-metrics-tls\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549025 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-oauth-serving-cert\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549050 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0b3f9223-93d9-4270-bbc2-c3235636f93f-etcd-client\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549109 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw2bs\" (UniqueName: \"kubernetes.io/projected/c80d44f2-ac5d-41e2-8abb-773410cde2a6-kube-api-access-kw2bs\") pod \"dns-operator-744455d44c-g7xhm\" (UID: \"c80d44f2-ac5d-41e2-8abb-773410cde2a6\") " pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549138 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86sc6\" (UniqueName: \"kubernetes.io/projected/db1da5ad-1d5b-4c87-9d5d-340cf0536968-kube-api-access-86sc6\") pod \"catalog-operator-68c6474976-h294z\" (UID: \"db1da5ad-1d5b-4c87-9d5d-340cf0536968\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549217 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-config\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549319 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-service-ca\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549345 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549470 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0b3f9223-93d9-4270-bbc2-c3235636f93f-encryption-config\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549516 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef433248-3b40-4b06-86ff-f9fd9377c779-config\") pod \"console-operator-58897d9998-gs8vj\" (UID: \"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549544 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-apiservice-cert\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf"
Feb 26 15:43:21 crc kubenswrapper[5033]: E0226 15:43:21.549586 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:22.049577307 +0000 UTC m=+129.262175957 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549888 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-config\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549909 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lbng4\" (UID: \"93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.549926 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8mgvk\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550009 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/db1da5ad-1d5b-4c87-9d5d-340cf0536968-srv-cert\") pod \"catalog-operator-68c6474976-h294z\" (UID: \"db1da5ad-1d5b-4c87-9d5d-340cf0536968\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550117 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0b3f9223-93d9-4270-bbc2-c3235636f93f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550148 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-webhook-cert\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550220 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-config\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550241 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550324 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550349 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmgxf\" (UniqueName: \"kubernetes.io/projected/d80de8ac-f823-4cb8-93e7-f3a672b83729-kube-api-access-hmgxf\") pod \"machine-config-controller-84d6567774-zfxb8\" (UID: \"d80de8ac-f823-4cb8-93e7-f3a672b83729\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550379 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dcc382cb-23f7-44b0-89a2-43c41292c21e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550398 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0b3f9223-93d9-4270-bbc2-c3235636f93f-audit-policies\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7"
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550417 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tvn5\" (UniqueName:
\"kubernetes.io/projected/4b423453-fa78-454a-83c7-934c0fce8992-kube-api-access-5tvn5\") pod \"control-plane-machine-set-operator-78cbb6b69f-xtmb5\" (UID: \"4b423453-fa78-454a-83c7-934c0fce8992\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550433 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-config\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550455 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550672 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bx2x\" (UniqueName: \"kubernetes.io/projected/2371a2b2-ba78-48f3-839d-cd97b08f3243-kube-api-access-6bx2x\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550723 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c263b9f1-4544-4400-a1d0-484b593ceab3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hzp8j\" (UID: \"c263b9f1-4544-4400-a1d0-484b593ceab3\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550743 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdxsj\" (UniqueName: \"kubernetes.io/projected/c852046c-87e9-42a6-9082-2d7390d6f71e-kube-api-access-gdxsj\") pod \"downloads-7954f5f757-8l772\" (UID: \"c852046c-87e9-42a6-9082-2d7390d6f71e\") " pod="openshift-console/downloads-7954f5f757-8l772" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550762 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dcc382cb-23f7-44b0-89a2-43c41292c21e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550784 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-bound-sa-token\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550800 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e612fd1f-3961-484f-a94c-18a54b4bf1ab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5jgbd\" (UID: \"e612fd1f-3961-484f-a94c-18a54b4bf1ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550818 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6c2a8653-71cf-4142-9fe7-9294ac24e231-stats-auth\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550834 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8mgvk\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.550893 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6mhw\" (UniqueName: \"kubernetes.io/projected/3bdab87a-cacc-4b04-87d0-1439b3db3efb-kube-api-access-d6mhw\") pod \"multus-admission-controller-857f4d67dd-7dpkv\" (UID: \"3bdab87a-cacc-4b04-87d0-1439b3db3efb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551135 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dcc382cb-23f7-44b0-89a2-43c41292c21e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551183 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c63de0b0-6c5f-4ae1-960e-86a483b1ac68-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bxjw5\" (UID: 
\"c63de0b0-6c5f-4ae1-960e-86a483b1ac68\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551344 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnz8p\" (UniqueName: \"kubernetes.io/projected/0b3f9223-93d9-4270-bbc2-c3235636f93f-kube-api-access-bnz8p\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551389 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4547700c-d872-4fa8-bd19-d5d442dc38f7-serving-cert\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551482 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c2a8653-71cf-4142-9fe7-9294ac24e231-service-ca-bundle\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551519 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccchf\" (UniqueName: \"kubernetes.io/projected/e612fd1f-3961-484f-a94c-18a54b4bf1ab-kube-api-access-ccchf\") pod \"cluster-samples-operator-665b6dd947-5jgbd\" (UID: \"e612fd1f-3961-484f-a94c-18a54b4bf1ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551550 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c63de0b0-6c5f-4ae1-960e-86a483b1ac68-serving-cert\") pod \"openshift-config-operator-7777fb866f-bxjw5\" (UID: \"c63de0b0-6c5f-4ae1-960e-86a483b1ac68\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551591 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4547700c-d872-4fa8-bd19-d5d442dc38f7-config\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551615 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt8w6\" (UniqueName: \"kubernetes.io/projected/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-kube-api-access-wt8w6\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551661 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdkgp\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-kube-api-access-zdkgp\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551685 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df4d19a5-387f-464f-ad1f-373b31520bcf-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-f79xz\" (UID: \"df4d19a5-387f-464f-ad1f-373b31520bcf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551722 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5k2q\" (UniqueName: \"kubernetes.io/projected/bb3fd26d-006b-46ba-864b-b79db95a0bda-kube-api-access-s5k2q\") pod \"machine-config-server-rx9pn\" (UID: \"bb3fd26d-006b-46ba-864b-b79db95a0bda\") " pod="openshift-machine-config-operator/machine-config-server-rx9pn" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551744 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7875ee6c-fa47-4620-b515-aad264da7c5a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.551841 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef433248-3b40-4b06-86ff-f9fd9377c779-trusted-ca\") pod \"console-operator-58897d9998-gs8vj\" (UID: \"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.552126 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b3f9223-93d9-4270-bbc2-c3235636f93f-serving-cert\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.552301 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/db1da5ad-1d5b-4c87-9d5d-340cf0536968-profile-collector-cert\") pod \"catalog-operator-68c6474976-h294z\" (UID: \"db1da5ad-1d5b-4c87-9d5d-340cf0536968\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.552373 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vl5x\" (UniqueName: \"kubernetes.io/projected/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-kube-api-access-2vl5x\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.552395 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c-config\") pod \"kube-controller-manager-operator-78b949d7b-lbng4\" (UID: \"93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.552464 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/bb3fd26d-006b-46ba-864b-b79db95a0bda-node-bootstrap-token\") pod \"machine-config-server-rx9pn\" (UID: \"bb3fd26d-006b-46ba-864b-b79db95a0bda\") " pod="openshift-machine-config-operator/machine-config-server-rx9pn" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.552515 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql8t7\" (UniqueName: 
\"kubernetes.io/projected/df4d19a5-387f-464f-ad1f-373b31520bcf-kube-api-access-ql8t7\") pod \"olm-operator-6b444d44fb-f79xz\" (UID: \"df4d19a5-387f-464f-ad1f-373b31520bcf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.552547 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q442s\" (UniqueName: \"kubernetes.io/projected/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-kube-api-access-q442s\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.552955 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpm5v\" (UniqueName: \"kubernetes.io/projected/4547700c-d872-4fa8-bd19-d5d442dc38f7-kube-api-access-zpm5v\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.553030 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2371a2b2-ba78-48f3-839d-cd97b08f3243-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.553301 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b3f9223-93d9-4270-bbc2-c3235636f93f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.553368 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-serving-cert\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.553452 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bb3fd26d-006b-46ba-864b-b79db95a0bda-certs\") pod \"machine-config-server-rx9pn\" (UID: \"bb3fd26d-006b-46ba-864b-b79db95a0bda\") " pod="openshift-machine-config-operator/machine-config-server-rx9pn" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.553486 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df4d19a5-387f-464f-ad1f-373b31520bcf-srv-cert\") pod \"olm-operator-6b444d44fb-f79xz\" (UID: \"df4d19a5-387f-464f-ad1f-373b31520bcf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.553714 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c263b9f1-4544-4400-a1d0-484b593ceab3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hzp8j\" (UID: \"c263b9f1-4544-4400-a1d0-484b593ceab3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.554071 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/65c7bb02-dc66-4754-8623-8d1be2932557-serving-cert\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.554106 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2371a2b2-ba78-48f3-839d-cd97b08f3243-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.554129 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lbng4\" (UID: \"93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.554164 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7875ee6c-fa47-4620-b515-aad264da7c5a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.554195 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l52nw\" (UniqueName: \"kubernetes.io/projected/65c7bb02-dc66-4754-8623-8d1be2932557-kube-api-access-l52nw\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.554238 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-tls\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.554350 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vn2p\" (UniqueName: \"kubernetes.io/projected/f3517c8c-cdca-439b-84ee-57d1b41ebcd2-kube-api-access-9vn2p\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nmrx\" (UID: \"f3517c8c-cdca-439b-84ee-57d1b41ebcd2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.554593 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzj7\" (UniqueName: \"kubernetes.io/projected/7875ee6c-fa47-4620-b515-aad264da7c5a-kube-api-access-pdzj7\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.554652 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpqd7\" (UniqueName: \"kubernetes.io/projected/d6d762dc-b9e3-44fd-a835-78df146dc35a-kube-api-access-mpqd7\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.554686 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c80d44f2-ac5d-41e2-8abb-773410cde2a6-metrics-tls\") pod \"dns-operator-744455d44c-g7xhm\" (UID: \"c80d44f2-ac5d-41e2-8abb-773410cde2a6\") " pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.555465 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-trusted-ca\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.555507 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c263b9f1-4544-4400-a1d0-484b593ceab3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hzp8j\" (UID: \"c263b9f1-4544-4400-a1d0-484b593ceab3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.555534 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d80de8ac-f823-4cb8-93e7-f3a672b83729-proxy-tls\") pod \"machine-config-controller-84d6567774-zfxb8\" (UID: \"d80de8ac-f823-4cb8-93e7-f3a672b83729\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.555571 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef433248-3b40-4b06-86ff-f9fd9377c779-serving-cert\") pod \"console-operator-58897d9998-gs8vj\" (UID: 
\"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.555597 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-trusted-ca\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.555620 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2371a2b2-ba78-48f3-839d-cd97b08f3243-ready\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.555644 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3517c8c-cdca-439b-84ee-57d1b41ebcd2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nmrx\" (UID: \"f3517c8c-cdca-439b-84ee-57d1b41ebcd2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.555666 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c2a8653-71cf-4142-9fe7-9294ac24e231-metrics-certs\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.555716 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3517c8c-cdca-439b-84ee-57d1b41ebcd2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nmrx\" (UID: \"f3517c8c-cdca-439b-84ee-57d1b41ebcd2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.555753 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-serving-cert\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.555774 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28t87\" (UniqueName: \"kubernetes.io/projected/4fdf7757-4d06-43cc-be27-0be0a8155b11-kube-api-access-28t87\") pod \"marketplace-operator-79b997595-8mgvk\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.556180 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.556809 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-certificates\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.556888 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4547700c-d872-4fa8-bd19-d5d442dc38f7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.556984 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6c2a8653-71cf-4142-9fe7-9294ac24e231-default-certificate\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.557124 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-trusted-ca-bundle\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.557174 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-tmpfs\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.557363 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4547700c-d872-4fa8-bd19-d5d442dc38f7-service-ca-bundle\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.557394 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srqlr\" (UniqueName: \"kubernetes.io/projected/c63de0b0-6c5f-4ae1-960e-86a483b1ac68-kube-api-access-srqlr\") pod \"openshift-config-operator-7777fb866f-bxjw5\" (UID: \"c63de0b0-6c5f-4ae1-960e-86a483b1ac68\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.557442 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b423453-fa78-454a-83c7-934c0fce8992-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xtmb5\" (UID: \"4b423453-fa78-454a-83c7-934c0fce8992\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.557948 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p9jk\" (UniqueName: \"kubernetes.io/projected/6c2a8653-71cf-4142-9fe7-9294ac24e231-kube-api-access-7p9jk\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.558102 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0b3f9223-93d9-4270-bbc2-c3235636f93f-audit-dir\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.558153 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7875ee6c-fa47-4620-b515-aad264da7c5a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.558296 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-847nm\" (UniqueName: \"kubernetes.io/projected/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-kube-api-access-847nm\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.558415 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-client-ca\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.558436 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3bdab87a-cacc-4b04-87d0-1439b3db3efb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7dpkv\" (UID: \"3bdab87a-cacc-4b04-87d0-1439b3db3efb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.558501 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d80de8ac-f823-4cb8-93e7-f3a672b83729-mcc-auth-proxy-config\") 
pod \"machine-config-controller-84d6567774-zfxb8\" (UID: \"d80de8ac-f823-4cb8-93e7-f3a672b83729\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.559091 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-certificates\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.570486 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-tls\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.661448 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:21 crc kubenswrapper[5033]: E0226 15:43:21.661657 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:22.161628681 +0000 UTC m=+129.374227331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.661867 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86sc6\" (UniqueName: \"kubernetes.io/projected/db1da5ad-1d5b-4c87-9d5d-340cf0536968-kube-api-access-86sc6\") pod \"catalog-operator-68c6474976-h294z\" (UID: \"db1da5ad-1d5b-4c87-9d5d-340cf0536968\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.661893 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0b3f9223-93d9-4270-bbc2-c3235636f93f-etcd-client\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.661909 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw2bs\" (UniqueName: \"kubernetes.io/projected/c80d44f2-ac5d-41e2-8abb-773410cde2a6-kube-api-access-kw2bs\") pod \"dns-operator-744455d44c-g7xhm\" (UID: \"c80d44f2-ac5d-41e2-8abb-773410cde2a6\") " pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.661926 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-config\") pod \"console-f9d7485db-nq49v\" (UID: 
\"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.661942 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-service-ca\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.661958 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef433248-3b40-4b06-86ff-f9fd9377c779-config\") pod \"console-operator-58897d9998-gs8vj\" (UID: \"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.661982 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-apiservice-cert\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.662008 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gdld\" (UniqueName: \"kubernetes.io/projected/c42a886d-9392-4ba9-92d4-49763cfe4d10-kube-api-access-7gdld\") pod \"ingress-canary-8bjnz\" (UID: \"c42a886d-9392-4ba9-92d4-49763cfe4d10\") " pod="openshift-ingress-canary/ingress-canary-8bjnz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.662032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.662048 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0b3f9223-93d9-4270-bbc2-c3235636f93f-encryption-config\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.662065 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lbng4\" (UID: \"93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.662084 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-config\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.662099 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/db1da5ad-1d5b-4c87-9d5d-340cf0536968-srv-cert\") pod \"catalog-operator-68c6474976-h294z\" (UID: \"db1da5ad-1d5b-4c87-9d5d-340cf0536968\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.662114 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8mgvk\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.662129 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-webhook-cert\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.662144 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-config-volume\") pod \"collect-profiles-29535330-57d7w\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:21 crc kubenswrapper[5033]: E0226 15:43:21.662811 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:22.162800429 +0000 UTC m=+129.375399089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663331 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-mountpoint-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663433 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0b3f9223-93d9-4270-bbc2-c3235636f93f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663481 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-config\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663500 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: 
\"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663525 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663545 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb42c6ae-cf0c-43e8-95e8-ab56d339ba31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fmqqx\" (UID: \"bb42c6ae-cf0c-43e8-95e8-ab56d339ba31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663565 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0b3f9223-93d9-4270-bbc2-c3235636f93f-audit-policies\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663584 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmgxf\" (UniqueName: \"kubernetes.io/projected/d80de8ac-f823-4cb8-93e7-f3a672b83729-kube-api-access-hmgxf\") pod \"machine-config-controller-84d6567774-zfxb8\" (UID: \"d80de8ac-f823-4cb8-93e7-f3a672b83729\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663606 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xnh26\" (UniqueName: \"kubernetes.io/projected/b7c878e1-790a-487b-967e-bf476f6f22a9-kube-api-access-xnh26\") pod \"package-server-manager-789f6589d5-fghqv\" (UID: \"b7c878e1-790a-487b-967e-bf476f6f22a9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663625 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-secret-volume\") pod \"collect-profiles-29535330-57d7w\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663647 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-config\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663665 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tvn5\" (UniqueName: \"kubernetes.io/projected/4b423453-fa78-454a-83c7-934c0fce8992-kube-api-access-5tvn5\") pod \"control-plane-machine-set-operator-78cbb6b69f-xtmb5\" (UID: \"4b423453-fa78-454a-83c7-934c0fce8992\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663683 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663702 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea04ee0d-86a7-4603-9060-8852dd019a22-signing-cabundle\") pod \"service-ca-9c57cc56f-sql8k\" (UID: \"ea04ee0d-86a7-4603-9060-8852dd019a22\") " pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663721 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-socket-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663742 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bx2x\" (UniqueName: \"kubernetes.io/projected/2371a2b2-ba78-48f3-839d-cd97b08f3243-kube-api-access-6bx2x\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663759 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c263b9f1-4544-4400-a1d0-484b593ceab3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hzp8j\" (UID: \"c263b9f1-4544-4400-a1d0-484b593ceab3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663780 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdxsj\" (UniqueName: 
\"kubernetes.io/projected/c852046c-87e9-42a6-9082-2d7390d6f71e-kube-api-access-gdxsj\") pod \"downloads-7954f5f757-8l772\" (UID: \"c852046c-87e9-42a6-9082-2d7390d6f71e\") " pod="openshift-console/downloads-7954f5f757-8l772" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663806 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dcc382cb-23f7-44b0-89a2-43c41292c21e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663825 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6c2a8653-71cf-4142-9fe7-9294ac24e231-stats-auth\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663841 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8mgvk\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663866 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-bound-sa-token\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663883 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e612fd1f-3961-484f-a94c-18a54b4bf1ab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5jgbd\" (UID: \"e612fd1f-3961-484f-a94c-18a54b4bf1ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663899 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6mhw\" (UniqueName: \"kubernetes.io/projected/3bdab87a-cacc-4b04-87d0-1439b3db3efb-kube-api-access-d6mhw\") pod \"multus-admission-controller-857f4d67dd-7dpkv\" (UID: \"3bdab87a-cacc-4b04-87d0-1439b3db3efb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663921 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnz8p\" (UniqueName: \"kubernetes.io/projected/0b3f9223-93d9-4270-bbc2-c3235636f93f-kube-api-access-bnz8p\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663937 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4547700c-d872-4fa8-bd19-d5d442dc38f7-serving-cert\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663955 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c63de0b0-6c5f-4ae1-960e-86a483b1ac68-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bxjw5\" (UID: 
\"c63de0b0-6c5f-4ae1-960e-86a483b1ac68\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663975 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c2a8653-71cf-4142-9fe7-9294ac24e231-service-ca-bundle\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.663992 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c63de0b0-6c5f-4ae1-960e-86a483b1ac68-serving-cert\") pod \"openshift-config-operator-7777fb866f-bxjw5\" (UID: \"c63de0b0-6c5f-4ae1-960e-86a483b1ac68\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664010 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzs2j\" (UniqueName: \"kubernetes.io/projected/40828720-7437-44c4-993e-a757c56e571b-kube-api-access-mzs2j\") pod \"dns-default-fb82g\" (UID: \"40828720-7437-44c4-993e-a757c56e571b\") " pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664027 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccchf\" (UniqueName: \"kubernetes.io/projected/e612fd1f-3961-484f-a94c-18a54b4bf1ab-kube-api-access-ccchf\") pod \"cluster-samples-operator-665b6dd947-5jgbd\" (UID: \"e612fd1f-3961-484f-a94c-18a54b4bf1ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664044 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4547700c-d872-4fa8-bd19-d5d442dc38f7-config\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664062 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt8w6\" (UniqueName: \"kubernetes.io/projected/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-kube-api-access-wt8w6\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664080 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdkgp\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-kube-api-access-zdkgp\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664096 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-plugins-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664115 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5k2q\" (UniqueName: \"kubernetes.io/projected/bb3fd26d-006b-46ba-864b-b79db95a0bda-kube-api-access-s5k2q\") pod \"machine-config-server-rx9pn\" (UID: \"bb3fd26d-006b-46ba-864b-b79db95a0bda\") " pod="openshift-machine-config-operator/machine-config-server-rx9pn" Feb 26 15:43:21 crc 
kubenswrapper[5033]: I0226 15:43:21.664131 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/df4d19a5-387f-464f-ad1f-373b31520bcf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-f79xz\" (UID: \"df4d19a5-387f-464f-ad1f-373b31520bcf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664147 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfbhz\" (UniqueName: \"kubernetes.io/projected/ea04ee0d-86a7-4603-9060-8852dd019a22-kube-api-access-gfbhz\") pod \"service-ca-9c57cc56f-sql8k\" (UID: \"ea04ee0d-86a7-4603-9060-8852dd019a22\") " pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664171 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef433248-3b40-4b06-86ff-f9fd9377c779-trusted-ca\") pod \"console-operator-58897d9998-gs8vj\" (UID: \"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664194 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7875ee6c-fa47-4620-b515-aad264da7c5a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664228 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b3f9223-93d9-4270-bbc2-c3235636f93f-serving-cert\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: 
\"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664250 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vl5x\" (UniqueName: \"kubernetes.io/projected/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-kube-api-access-2vl5x\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664284 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c-config\") pod \"kube-controller-manager-operator-78b949d7b-lbng4\" (UID: \"93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664300 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/db1da5ad-1d5b-4c87-9d5d-340cf0536968-profile-collector-cert\") pod \"catalog-operator-68c6474976-h294z\" (UID: \"db1da5ad-1d5b-4c87-9d5d-340cf0536968\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664333 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/bb3fd26d-006b-46ba-864b-b79db95a0bda-node-bootstrap-token\") pod \"machine-config-server-rx9pn\" (UID: \"bb3fd26d-006b-46ba-864b-b79db95a0bda\") " pod="openshift-machine-config-operator/machine-config-server-rx9pn" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664350 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ql8t7\" (UniqueName: \"kubernetes.io/projected/df4d19a5-387f-464f-ad1f-373b31520bcf-kube-api-access-ql8t7\") pod \"olm-operator-6b444d44fb-f79xz\" (UID: \"df4d19a5-387f-464f-ad1f-373b31520bcf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664682 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q442s\" (UniqueName: \"kubernetes.io/projected/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-kube-api-access-q442s\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664702 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2371a2b2-ba78-48f3-839d-cd97b08f3243-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664719 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22eaaf47-a99c-484f-9c2f-1aa78da1111c-config\") pod \"service-ca-operator-777779d784-wmk55\" (UID: \"22eaaf47-a99c-484f-9c2f-1aa78da1111c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664736 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpm5v\" (UniqueName: \"kubernetes.io/projected/4547700c-d872-4fa8-bd19-d5d442dc38f7-kube-api-access-zpm5v\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664754 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b3f9223-93d9-4270-bbc2-c3235636f93f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664771 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-serving-cert\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664790 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c263b9f1-4544-4400-a1d0-484b593ceab3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hzp8j\" (UID: \"c263b9f1-4544-4400-a1d0-484b593ceab3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664811 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bb3fd26d-006b-46ba-864b-b79db95a0bda-certs\") pod \"machine-config-server-rx9pn\" (UID: \"bb3fd26d-006b-46ba-864b-b79db95a0bda\") " pod="openshift-machine-config-operator/machine-config-server-rx9pn" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664828 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df4d19a5-387f-464f-ad1f-373b31520bcf-srv-cert\") pod \"olm-operator-6b444d44fb-f79xz\" 
(UID: \"df4d19a5-387f-464f-ad1f-373b31520bcf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664845 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea04ee0d-86a7-4603-9060-8852dd019a22-signing-key\") pod \"service-ca-9c57cc56f-sql8k\" (UID: \"ea04ee0d-86a7-4603-9060-8852dd019a22\") " pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664865 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65c7bb02-dc66-4754-8623-8d1be2932557-serving-cert\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664882 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2371a2b2-ba78-48f3-839d-cd97b08f3243-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664901 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7875ee6c-fa47-4620-b515-aad264da7c5a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664918 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lbng4\" (UID: \"93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664936 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c42a886d-9392-4ba9-92d4-49763cfe4d10-cert\") pod \"ingress-canary-8bjnz\" (UID: \"c42a886d-9392-4ba9-92d4-49763cfe4d10\") " pod="openshift-ingress-canary/ingress-canary-8bjnz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664956 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l52nw\" (UniqueName: \"kubernetes.io/projected/65c7bb02-dc66-4754-8623-8d1be2932557-kube-api-access-l52nw\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664977 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vn2p\" (UniqueName: \"kubernetes.io/projected/f3517c8c-cdca-439b-84ee-57d1b41ebcd2-kube-api-access-9vn2p\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nmrx\" (UID: \"f3517c8c-cdca-439b-84ee-57d1b41ebcd2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.664992 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzj7\" (UniqueName: \"kubernetes.io/projected/7875ee6c-fa47-4620-b515-aad264da7c5a-kube-api-access-pdzj7\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665188 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpqd7\" (UniqueName: \"kubernetes.io/projected/d6d762dc-b9e3-44fd-a835-78df146dc35a-kube-api-access-mpqd7\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665238 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlgxn\" (UniqueName: \"kubernetes.io/projected/22eaaf47-a99c-484f-9c2f-1aa78da1111c-kube-api-access-dlgxn\") pod \"service-ca-operator-777779d784-wmk55\" (UID: \"22eaaf47-a99c-484f-9c2f-1aa78da1111c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665287 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c263b9f1-4544-4400-a1d0-484b593ceab3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hzp8j\" (UID: \"c263b9f1-4544-4400-a1d0-484b593ceab3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665313 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d80de8ac-f823-4cb8-93e7-f3a672b83729-proxy-tls\") pod \"machine-config-controller-84d6567774-zfxb8\" (UID: \"d80de8ac-f823-4cb8-93e7-f3a672b83729\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665385 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c80d44f2-ac5d-41e2-8abb-773410cde2a6-metrics-tls\") pod \"dns-operator-744455d44c-g7xhm\" (UID: \"c80d44f2-ac5d-41e2-8abb-773410cde2a6\") " pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665412 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4vbq\" (UniqueName: \"kubernetes.io/projected/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-kube-api-access-v4vbq\") pod \"collect-profiles-29535330-57d7w\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665444 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-trusted-ca\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665466 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-trusted-ca\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665490 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2371a2b2-ba78-48f3-839d-cd97b08f3243-ready\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665519 5033 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef433248-3b40-4b06-86ff-f9fd9377c779-serving-cert\") pod \"console-operator-58897d9998-gs8vj\" (UID: \"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665546 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3517c8c-cdca-439b-84ee-57d1b41ebcd2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nmrx\" (UID: \"f3517c8c-cdca-439b-84ee-57d1b41ebcd2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665568 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3517c8c-cdca-439b-84ee-57d1b41ebcd2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nmrx\" (UID: \"f3517c8c-cdca-439b-84ee-57d1b41ebcd2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665589 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c2a8653-71cf-4142-9fe7-9294ac24e231-metrics-certs\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665618 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-serving-cert\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665642 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28t87\" (UniqueName: \"kubernetes.io/projected/4fdf7757-4d06-43cc-be27-0be0a8155b11-kube-api-access-28t87\") pod \"marketplace-operator-79b997595-8mgvk\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665665 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7c878e1-790a-487b-967e-bf476f6f22a9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fghqv\" (UID: \"b7c878e1-790a-487b-967e-bf476f6f22a9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665735 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-images\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665760 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22eaaf47-a99c-484f-9c2f-1aa78da1111c-serving-cert\") pod \"service-ca-operator-777779d784-wmk55\" (UID: \"22eaaf47-a99c-484f-9c2f-1aa78da1111c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665791 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb42c6ae-cf0c-43e8-95e8-ab56d339ba31-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fmqqx\" (UID: \"bb42c6ae-cf0c-43e8-95e8-ab56d339ba31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665818 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4547700c-d872-4fa8-bd19-d5d442dc38f7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665869 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6c2a8653-71cf-4142-9fe7-9294ac24e231-default-certificate\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665905 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-trusted-ca-bundle\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665928 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-tmpfs\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665952 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4547700c-d872-4fa8-bd19-d5d442dc38f7-service-ca-bundle\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.665975 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srqlr\" (UniqueName: \"kubernetes.io/projected/c63de0b0-6c5f-4ae1-960e-86a483b1ac68-kube-api-access-srqlr\") pod \"openshift-config-operator-7777fb866f-bxjw5\" (UID: \"c63de0b0-6c5f-4ae1-960e-86a483b1ac68\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666003 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-registration-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666038 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b423453-fa78-454a-83c7-934c0fce8992-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xtmb5\" (UID: \"4b423453-fa78-454a-83c7-934c0fce8992\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666063 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb42c6ae-cf0c-43e8-95e8-ab56d339ba31-config\") pod \"kube-apiserver-operator-766d6c64bb-fmqqx\" (UID: \"bb42c6ae-cf0c-43e8-95e8-ab56d339ba31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666127 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40828720-7437-44c4-993e-a757c56e571b-config-volume\") pod \"dns-default-fb82g\" (UID: \"40828720-7437-44c4-993e-a757c56e571b\") " pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666151 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-csi-data-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666180 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p9jk\" (UniqueName: \"kubernetes.io/projected/6c2a8653-71cf-4142-9fe7-9294ac24e231-kube-api-access-7p9jk\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666203 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40828720-7437-44c4-993e-a757c56e571b-metrics-tls\") pod \"dns-default-fb82g\" (UID: \"40828720-7437-44c4-993e-a757c56e571b\") " pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 
15:43:21.666230 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7875ee6c-fa47-4620-b515-aad264da7c5a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666271 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0b3f9223-93d9-4270-bbc2-c3235636f93f-audit-dir\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666295 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-847nm\" (UniqueName: \"kubernetes.io/projected/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-kube-api-access-847nm\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666323 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3bdab87a-cacc-4b04-87d0-1439b3db3efb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7dpkv\" (UID: \"3bdab87a-cacc-4b04-87d0-1439b3db3efb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666344 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d80de8ac-f823-4cb8-93e7-f3a672b83729-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zfxb8\" (UID: 
\"d80de8ac-f823-4cb8-93e7-f3a672b83729\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666369 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-proxy-tls\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666398 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-client-ca\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666423 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxmcp\" (UniqueName: \"kubernetes.io/projected/de9d193e-d622-4bc0-8827-3d55d3ecf448-kube-api-access-xxmcp\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666448 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-client-ca\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666476 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x44jf\" (UniqueName: \"kubernetes.io/projected/ef433248-3b40-4b06-86ff-f9fd9377c779-kube-api-access-x44jf\") pod \"console-operator-58897d9998-gs8vj\" (UID: \"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666502 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666545 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-oauth-config\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666569 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-metrics-tls\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666592 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-images\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:21 crc kubenswrapper[5033]: 
I0226 15:43:21.666615 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-oauth-serving-cert\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.666651 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhvcb\" (UniqueName: \"kubernetes.io/projected/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-kube-api-access-dhvcb\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.668249 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2371a2b2-ba78-48f3-839d-cd97b08f3243-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.668309 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c63de0b0-6c5f-4ae1-960e-86a483b1ac68-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bxjw5\" (UID: \"c63de0b0-6c5f-4ae1-960e-86a483b1ac68\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.668390 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0b3f9223-93d9-4270-bbc2-c3235636f93f-audit-dir\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.668782 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2371a2b2-ba78-48f3-839d-cd97b08f3243-ready\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.677804 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.678132 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.678214 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.678316 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.678418 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.678547 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.678661 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.678791 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 
15:43:21.678883 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.679280 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-tmpfs\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.684145 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-config\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.684213 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc"] Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.685005 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef433248-3b40-4b06-86ff-f9fd9377c779-config\") pod \"console-operator-58897d9998-gs8vj\" (UID: \"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.685298 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-config\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.685832 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d80de8ac-f823-4cb8-93e7-f3a672b83729-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zfxb8\" (UID: \"d80de8ac-f823-4cb8-93e7-f3a672b83729\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.693281 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0b3f9223-93d9-4270-bbc2-c3235636f93f-encryption-config\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.693354 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-service-ca\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.693593 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4547700c-d872-4fa8-bd19-d5d442dc38f7-config\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.693937 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/db1da5ad-1d5b-4c87-9d5d-340cf0536968-srv-cert\") pod \"catalog-operator-68c6474976-h294z\" (UID: \"db1da5ad-1d5b-4c87-9d5d-340cf0536968\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" Feb 26 15:43:21 crc 
kubenswrapper[5033]: I0226 15:43:21.693981 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.694105 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lbng4\" (UID: \"93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.694672 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0b3f9223-93d9-4270-bbc2-c3235636f93f-etcd-client\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.699976 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.708999 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.719530 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-oauth-serving-cert\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.726477 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.726743 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.726994 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.727284 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.729640 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.730151 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.730972 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 15:43:21 crc 
kubenswrapper[5033]: I0226 15:43:21.731055 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.731171 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.730978 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.731372 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b3f9223-93d9-4270-bbc2-c3235636f93f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.731594 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.731638 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.731817 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.731932 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bx2x\" (UniqueName: \"kubernetes.io/projected/2371a2b2-ba78-48f3-839d-cd97b08f3243-kube-api-access-6bx2x\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.732063 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.732122 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c263b9f1-4544-4400-a1d0-484b593ceab3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hzp8j\" (UID: \"c263b9f1-4544-4400-a1d0-484b593ceab3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.732274 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.732480 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.733749 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.734052 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.734176 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.734682 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-images\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.734833 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.735144 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.735601 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4547700c-d872-4fa8-bd19-d5d442dc38f7-service-ca-bundle\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.735992 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0b3f9223-93d9-4270-bbc2-c3235636f93f-audit-policies\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.736059 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.736325 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.736444 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.742039 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.742311 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.742470 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.742562 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmgxf\" (UniqueName: \"kubernetes.io/projected/d80de8ac-f823-4cb8-93e7-f3a672b83729-kube-api-access-hmgxf\") pod \"machine-config-controller-84d6567774-zfxb8\" (UID: \"d80de8ac-f823-4cb8-93e7-f3a672b83729\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.743384 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df4d19a5-387f-464f-ad1f-373b31520bcf-srv-cert\") pod \"olm-operator-6b444d44fb-f79xz\" (UID: \"df4d19a5-387f-464f-ad1f-373b31520bcf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.744606 5033 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"metrics-tls" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.744984 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.745466 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2371a2b2-ba78-48f3-839d-cd97b08f3243-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-tz87k\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.745531 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-client-ca\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.746407 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c2a8653-71cf-4142-9fe7-9294ac24e231-service-ca-bundle\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.749346 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdkgp\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-kube-api-access-zdkgp\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.749378 5033 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"installation-pull-secrets" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.749771 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.750027 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.751293 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.751586 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzj7\" (UniqueName: \"kubernetes.io/projected/7875ee6c-fa47-4620-b515-aad264da7c5a-kube-api-access-pdzj7\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.752286 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bb3fd26d-006b-46ba-864b-b79db95a0bda-certs\") pod \"machine-config-server-rx9pn\" (UID: \"bb3fd26d-006b-46ba-864b-b79db95a0bda\") " pod="openshift-machine-config-operator/machine-config-server-rx9pn" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.756192 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6mhw\" (UniqueName: \"kubernetes.io/projected/3bdab87a-cacc-4b04-87d0-1439b3db3efb-kube-api-access-d6mhw\") pod \"multus-admission-controller-857f4d67dd-7dpkv\" (UID: \"3bdab87a-cacc-4b04-87d0-1439b3db3efb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.757649 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0b3f9223-93d9-4270-bbc2-c3235636f93f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.758422 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-serving-cert\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.758749 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c-config\") pod \"kube-controller-manager-operator-78b949d7b-lbng4\" (UID: \"93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.758855 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-webhook-cert\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.759361 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65c7bb02-dc66-4754-8623-8d1be2932557-serving-cert\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.759385 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7875ee6c-fa47-4620-b515-aad264da7c5a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.759673 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-apiservice-cert\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.759745 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7875ee6c-fa47-4620-b515-aad264da7c5a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.760068 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3517c8c-cdca-439b-84ee-57d1b41ebcd2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nmrx\" (UID: \"f3517c8c-cdca-439b-84ee-57d1b41ebcd2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.760096 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-bound-sa-token\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: 
\"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.760896 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.760980 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-serving-cert\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.762337 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.762676 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.762956 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.763066 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.763295 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.763761 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.763976 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 26 15:43:21 
crc kubenswrapper[5033]: I0226 15:43:21.765218 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.766624 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-config\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.766867 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef433248-3b40-4b06-86ff-f9fd9377c779-trusted-ca\") pod \"console-operator-58897d9998-gs8vj\" (UID: \"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.767016 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.767459 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.767539 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-oauth-config\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.768657 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.769411 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.769657 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-trusted-ca-bundle\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.769878 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.769985 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6c2a8653-71cf-4142-9fe7-9294ac24e231-stats-auth\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.770694 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.773100 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/df4d19a5-387f-464f-ad1f-373b31520bcf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-f79xz\" (UID: \"df4d19a5-387f-464f-ad1f-373b31520bcf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.773603 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.776136 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dcc382cb-23f7-44b0-89a2-43c41292c21e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.776741 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.776888 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3517c8c-cdca-439b-84ee-57d1b41ebcd2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nmrx\" (UID: \"f3517c8c-cdca-439b-84ee-57d1b41ebcd2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777104 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb42c6ae-cf0c-43e8-95e8-ab56d339ba31-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-fmqqx\" (UID: \"bb42c6ae-cf0c-43e8-95e8-ab56d339ba31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777322 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-registration-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777376 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb42c6ae-cf0c-43e8-95e8-ab56d339ba31-config\") pod \"kube-apiserver-operator-766d6c64bb-fmqqx\" (UID: \"bb42c6ae-cf0c-43e8-95e8-ab56d339ba31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777402 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40828720-7437-44c4-993e-a757c56e571b-config-volume\") pod \"dns-default-fb82g\" (UID: \"40828720-7437-44c4-993e-a757c56e571b\") " pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777424 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-csi-data-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777431 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c263b9f1-4544-4400-a1d0-484b593ceab3-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-hzp8j\" (UID: \"c263b9f1-4544-4400-a1d0-484b593ceab3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777485 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40828720-7437-44c4-993e-a757c56e571b-metrics-tls\") pod \"dns-default-fb82g\" (UID: \"40828720-7437-44c4-993e-a757c56e571b\") " pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777528 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-proxy-tls\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777558 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c80d44f2-ac5d-41e2-8abb-773410cde2a6-metrics-tls\") pod \"dns-operator-744455d44c-g7xhm\" (UID: \"c80d44f2-ac5d-41e2-8abb-773410cde2a6\") " pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777564 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxmcp\" (UniqueName: \"kubernetes.io/projected/de9d193e-d622-4bc0-8827-3d55d3ecf448-kube-api-access-xxmcp\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777627 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777657 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/db1da5ad-1d5b-4c87-9d5d-340cf0536968-profile-collector-cert\") pod \"catalog-operator-68c6474976-h294z\" (UID: \"db1da5ad-1d5b-4c87-9d5d-340cf0536968\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777664 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhvcb\" (UniqueName: \"kubernetes.io/projected/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-kube-api-access-dhvcb\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777763 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gdld\" (UniqueName: \"kubernetes.io/projected/c42a886d-9392-4ba9-92d4-49763cfe4d10-kube-api-access-7gdld\") pod \"ingress-canary-8bjnz\" (UID: \"c42a886d-9392-4ba9-92d4-49763cfe4d10\") " pod="openshift-ingress-canary/ingress-canary-8bjnz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777818 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-mountpoint-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 
15:43:21.778379 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5k2q\" (UniqueName: \"kubernetes.io/projected/bb3fd26d-006b-46ba-864b-b79db95a0bda-kube-api-access-s5k2q\") pod \"machine-config-server-rx9pn\" (UID: \"bb3fd26d-006b-46ba-864b-b79db95a0bda\") " pod="openshift-machine-config-operator/machine-config-server-rx9pn" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.778698 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3bdab87a-cacc-4b04-87d0-1439b3db3efb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7dpkv\" (UID: \"3bdab87a-cacc-4b04-87d0-1439b3db3efb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.778996 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4547700c-d872-4fa8-bd19-d5d442dc38f7-serving-cert\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.779269 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e612fd1f-3961-484f-a94c-18a54b4bf1ab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5jgbd\" (UID: \"e612fd1f-3961-484f-a94c-18a54b4bf1ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.780445 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-config\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.777848 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-config-volume\") pod \"collect-profiles-29535330-57d7w\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.783001 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b3f9223-93d9-4270-bbc2-c3235636f93f-serving-cert\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.791140 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8mgvk\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.791215 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.791576 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.794942 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-config\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.799622 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-trusted-ca\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.800129 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c63de0b0-6c5f-4ae1-960e-86a483b1ac68-serving-cert\") pod \"openshift-config-operator-7777fb866f-bxjw5\" (UID: \"c63de0b0-6c5f-4ae1-960e-86a483b1ac68\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.800210 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.801154 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef433248-3b40-4b06-86ff-f9fd9377c779-serving-cert\") pod \"console-operator-58897d9998-gs8vj\" (UID: \"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.801282 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.801437 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.801537 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" 
Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.801695 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.803884 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jdtrx"] Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.803998 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-client-ca\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.804039 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-mountpoint-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.804055 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-metrics-tls\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.804189 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6c2a8653-71cf-4142-9fe7-9294ac24e231-default-certificate\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc 
kubenswrapper[5033]: I0226 15:43:21.804251 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8mgvk\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.804408 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-registration-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.804430 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-csi-data-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: E0226 15:43:21.804515 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:22.304500883 +0000 UTC m=+129.517099533 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.804546 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb42c6ae-cf0c-43e8-95e8-ab56d339ba31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fmqqx\" (UID: \"bb42c6ae-cf0c-43e8-95e8-ab56d339ba31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.804593 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-config\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.804718 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.804771 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.804835 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnh26\" (UniqueName: \"kubernetes.io/projected/b7c878e1-790a-487b-967e-bf476f6f22a9-kube-api-access-xnh26\") pod \"package-server-manager-789f6589d5-fghqv\" (UID: 
\"b7c878e1-790a-487b-967e-bf476f6f22a9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.805030 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-secret-volume\") pod \"collect-profiles-29535330-57d7w\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.805085 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea04ee0d-86a7-4603-9060-8852dd019a22-signing-cabundle\") pod \"service-ca-9c57cc56f-sql8k\" (UID: \"ea04ee0d-86a7-4603-9060-8852dd019a22\") " pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.805109 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-socket-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.805210 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzs2j\" (UniqueName: \"kubernetes.io/projected/40828720-7437-44c4-993e-a757c56e571b-kube-api-access-mzs2j\") pod \"dns-default-fb82g\" (UID: \"40828720-7437-44c4-993e-a757c56e571b\") " pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.805249 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-plugins-dir\") pod \"csi-hostpathplugin-fkt29\" 
(UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.805299 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfbhz\" (UniqueName: \"kubernetes.io/projected/ea04ee0d-86a7-4603-9060-8852dd019a22-kube-api-access-gfbhz\") pod \"service-ca-9c57cc56f-sql8k\" (UID: \"ea04ee0d-86a7-4603-9060-8852dd019a22\") " pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.805411 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22eaaf47-a99c-484f-9c2f-1aa78da1111c-config\") pod \"service-ca-operator-777779d784-wmk55\" (UID: \"22eaaf47-a99c-484f-9c2f-1aa78da1111c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.805459 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea04ee0d-86a7-4603-9060-8852dd019a22-signing-key\") pod \"service-ca-9c57cc56f-sql8k\" (UID: \"ea04ee0d-86a7-4603-9060-8852dd019a22\") " pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.805507 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.805553 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c42a886d-9392-4ba9-92d4-49763cfe4d10-cert\") pod 
\"ingress-canary-8bjnz\" (UID: \"c42a886d-9392-4ba9-92d4-49763cfe4d10\") " pod="openshift-ingress-canary/ingress-canary-8bjnz" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.804871 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.805907 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.806026 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.807184 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/bb3fd26d-006b-46ba-864b-b79db95a0bda-node-bootstrap-token\") pod \"machine-config-server-rx9pn\" (UID: \"bb3fd26d-006b-46ba-864b-b79db95a0bda\") " pod="openshift-machine-config-operator/machine-config-server-rx9pn" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.807793 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.807934 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.808206 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.808245 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-plugins-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " 
pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.808443 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de9d193e-d622-4bc0-8827-3d55d3ecf448-socket-dir\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.808478 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlgxn\" (UniqueName: \"kubernetes.io/projected/22eaaf47-a99c-484f-9c2f-1aa78da1111c-kube-api-access-dlgxn\") pod \"service-ca-operator-777779d784-wmk55\" (UID: \"22eaaf47-a99c-484f-9c2f-1aa78da1111c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.808653 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4vbq\" (UniqueName: \"kubernetes.io/projected/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-kube-api-access-v4vbq\") pod \"collect-profiles-29535330-57d7w\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.808740 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7c878e1-790a-487b-967e-bf476f6f22a9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fghqv\" (UID: \"b7c878e1-790a-487b-967e-bf476f6f22a9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.808801 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-images\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.808820 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22eaaf47-a99c-484f-9c2f-1aa78da1111c-serving-cert\") pod \"service-ca-operator-777779d784-wmk55\" (UID: \"22eaaf47-a99c-484f-9c2f-1aa78da1111c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.809430 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-trusted-ca\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.810564 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7875ee6c-fa47-4620-b515-aad264da7c5a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2tf4r\" (UID: \"7875ee6c-fa47-4620-b515-aad264da7c5a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.810637 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-secret-volume\") pod \"collect-profiles-29535330-57d7w\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.810998 5033 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.811379 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.811759 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c2a8653-71cf-4142-9fe7-9294ac24e231-metrics-certs\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.811997 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6"] Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.814859 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw2bs\" (UniqueName: \"kubernetes.io/projected/c80d44f2-ac5d-41e2-8abb-773410cde2a6-kube-api-access-kw2bs\") pod \"dns-operator-744455d44c-g7xhm\" (UID: \"c80d44f2-ac5d-41e2-8abb-773410cde2a6\") " pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.817237 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.817917 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.818219 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.819377 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.819538 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d80de8ac-f823-4cb8-93e7-f3a672b83729-proxy-tls\") pod \"machine-config-controller-84d6567774-zfxb8\" (UID: \"d80de8ac-f823-4cb8-93e7-f3a672b83729\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.819985 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4547700c-d872-4fa8-bd19-d5d442dc38f7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:21 crc kubenswrapper[5033]: W0226 15:43:21.821210 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecb847f9_d250_4d91_b74c_104acafc26fd.slice/crio-79850ad3dfb0ef9c771b22e0f062c2c1ba44988068572a99ea2ac93cc952e4c3 WatchSource:0}: Error finding container 79850ad3dfb0ef9c771b22e0f062c2c1ba44988068572a99ea2ac93cc952e4c3: Status 404 returned error can't find the container with id 79850ad3dfb0ef9c771b22e0f062c2c1ba44988068572a99ea2ac93cc952e4c3 Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.821897 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.822411 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.822917 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c263b9f1-4544-4400-a1d0-484b593ceab3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hzp8j\" (UID: \"c263b9f1-4544-4400-a1d0-484b593ceab3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.823045 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4bq8z" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.825042 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccchf\" (UniqueName: \"kubernetes.io/projected/e612fd1f-3961-484f-a94c-18a54b4bf1ab-kube-api-access-ccchf\") pod \"cluster-samples-operator-665b6dd947-5jgbd\" (UID: \"e612fd1f-3961-484f-a94c-18a54b4bf1ab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.833111 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b423453-fa78-454a-83c7-934c0fce8992-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xtmb5\" (UID: \"4b423453-fa78-454a-83c7-934c0fce8992\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.844066 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.857133 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lbng4\" (UID: \"93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.865784 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.871938 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpqd7\" (UniqueName: \"kubernetes.io/projected/d6d762dc-b9e3-44fd-a835-78df146dc35a-kube-api-access-mpqd7\") pod \"console-f9d7485db-nq49v\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") " pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.881098 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdxsj\" (UniqueName: \"kubernetes.io/projected/c852046c-87e9-42a6-9082-2d7390d6f71e-kube-api-access-gdxsj\") pod \"downloads-7954f5f757-8l772\" (UID: \"c852046c-87e9-42a6-9082-2d7390d6f71e\") " pod="openshift-console/downloads-7954f5f757-8l772" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.895979 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.900867 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-847nm\" (UniqueName: \"kubernetes.io/projected/e773b05c-cf02-4b49-8a2d-6e9a5e8bea28-kube-api-access-847nm\") pod \"ingress-operator-5b745b69d9-6ggkg\" (UID: \"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.914400 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.914823 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 26 15:43:21 crc kubenswrapper[5033]: E0226 15:43:21.914888 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:22.414872607 +0000 UTC m=+129.627471257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.915962 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb42c6ae-cf0c-43e8-95e8-ab56d339ba31-config\") pod \"kube-apiserver-operator-766d6c64bb-fmqqx\" (UID: \"bb42c6ae-cf0c-43e8-95e8-ab56d339ba31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.925568 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.939170 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40828720-7437-44c4-993e-a757c56e571b-config-volume\") pod \"dns-default-fb82g\" (UID: \"40828720-7437-44c4-993e-a757c56e571b\") " pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.966530 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.976066 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-config-volume\") pod \"collect-profiles-29535330-57d7w\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.978213 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:21 crc kubenswrapper[5033]: I0226 15:43:21.994548 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.001882 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.009841 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-proxy-tls\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.013871 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb42c6ae-cf0c-43e8-95e8-ab56d339ba31-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fmqqx\" (UID: \"bb42c6ae-cf0c-43e8-95e8-ab56d339ba31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.015633 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:22 crc kubenswrapper[5033]: E0226 15:43:22.016375 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 15:43:22.516355638 +0000 UTC m=+129.728954288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:22 crc kubenswrapper[5033]: W0226 15:43:22.045473 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-5d85bb31365ab4113f08034d4b4b95d0067f4abb63542c4642fb74e137c3239c WatchSource:0}: Error finding container 5d85bb31365ab4113f08034d4b4b95d0067f4abb63542c4642fb74e137c3239c: Status 404 returned error can't find the container with id 5d85bb31365ab4113f08034d4b4b95d0067f4abb63542c4642fb74e137c3239c Feb 26 15:43:22 crc kubenswrapper[5033]: W0226 15:43:22.052141 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2371a2b2_ba78_48f3_839d_cd97b08f3243.slice/crio-ac27a332db38a10a59f56600b53805bfbde2300e428c82ab52ecd00966976c41 WatchSource:0}: Error finding container ac27a332db38a10a59f56600b53805bfbde2300e428c82ab52ecd00966976c41: Status 404 returned error can't find the container with id ac27a332db38a10a59f56600b53805bfbde2300e428c82ab52ecd00966976c41 Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.057585 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhvcb\" (UniqueName: \"kubernetes.io/projected/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-kube-api-access-dhvcb\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") 
" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.062043 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.067479 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40828720-7437-44c4-993e-a757c56e571b-metrics-tls\") pod \"dns-default-fb82g\" (UID: \"40828720-7437-44c4-993e-a757c56e571b\") " pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.104515 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.109043 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l52nw\" (UniqueName: \"kubernetes.io/projected/65c7bb02-dc66-4754-8623-8d1be2932557-kube-api-access-l52nw\") pod \"controller-manager-879f6c89f-bdlcv\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.117680 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:22 crc kubenswrapper[5033]: E0226 15:43:22.118077 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-26 15:43:22.618061994 +0000 UTC m=+129.830660644 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.122482 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.123680 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4bq8z"] Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.134146 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p9jk\" (UniqueName: \"kubernetes.io/projected/6c2a8653-71cf-4142-9fe7-9294ac24e231-kube-api-access-7p9jk\") pod \"router-default-5444994796-l4bg7\" (UID: \"6c2a8653-71cf-4142-9fe7-9294ac24e231\") " pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:22 crc kubenswrapper[5033]: W0226 15:43:22.141675 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod857e0324_95f6_4524_b676_8919c08e97ca.slice/crio-6eddf224d5155ea21b4030544b7e20f13599043223c448f33f8c9d8c8837f44e WatchSource:0}: Error finding container 6eddf224d5155ea21b4030544b7e20f13599043223c448f33f8c9d8c8837f44e: Status 404 returned error can't find the container with id 6eddf224d5155ea21b4030544b7e20f13599043223c448f33f8c9d8c8837f44e Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.162800 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.171029 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tvn5\" (UniqueName: \"kubernetes.io/projected/4b423453-fa78-454a-83c7-934c0fce8992-kube-api-access-5tvn5\") pod \"control-plane-machine-set-operator-78cbb6b69f-xtmb5\" (UID: \"4b423453-fa78-454a-83c7-934c0fce8992\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.177628 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vl5x\" (UniqueName: \"kubernetes.io/projected/9807d6e2-9450-4da2-a3f5-aea1a3b70cb8-kube-api-access-2vl5x\") pod \"machine-api-operator-5694c8668f-cnhpr\" (UID: \"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.184103 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.203275 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vn2p\" (UniqueName: \"kubernetes.io/projected/f3517c8c-cdca-439b-84ee-57d1b41ebcd2-kube-api-access-9vn2p\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nmrx\" (UID: \"f3517c8c-cdca-439b-84ee-57d1b41ebcd2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.219665 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzs2j\" (UniqueName: \"kubernetes.io/projected/40828720-7437-44c4-993e-a757c56e571b-kube-api-access-mzs2j\") pod \"dns-default-fb82g\" (UID: \"40828720-7437-44c4-993e-a757c56e571b\") " pod="openshift-dns/dns-default-fb82g" Feb 26 
15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.220329 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:22 crc kubenswrapper[5033]: E0226 15:43:22.220432 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:22.720412175 +0000 UTC m=+129.933010825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.220684 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:22 crc kubenswrapper[5033]: E0226 15:43:22.221163 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-26 15:43:22.721145143 +0000 UTC m=+129.933743793 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.243853 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.258286 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22eaaf47-a99c-484f-9c2f-1aa78da1111c-config\") pod \"service-ca-operator-777779d784-wmk55\" (UID: \"22eaaf47-a99c-484f-9c2f-1aa78da1111c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.262581 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.269225 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea04ee0d-86a7-4603-9060-8852dd019a22-signing-cabundle\") pod \"service-ca-9c57cc56f-sql8k\" (UID: \"ea04ee0d-86a7-4603-9060-8852dd019a22\") " pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.282311 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.292847 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea04ee0d-86a7-4603-9060-8852dd019a22-signing-key\") pod \"service-ca-9c57cc56f-sql8k\" (UID: \"ea04ee0d-86a7-4603-9060-8852dd019a22\") " pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.303667 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.314990 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c42a886d-9392-4ba9-92d4-49763cfe4d10-cert\") pod \"ingress-canary-8bjnz\" (UID: \"c42a886d-9392-4ba9-92d4-49763cfe4d10\") " pod="openshift-ingress-canary/ingress-canary-8bjnz" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.326797 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:22 crc kubenswrapper[5033]: E0226 15:43:22.327725 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:22.827703775 +0000 UTC m=+130.040302425 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.361864 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.373819 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7c878e1-790a-487b-967e-bf476f6f22a9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fghqv\" (UID: \"b7c878e1-790a-487b-967e-bf476f6f22a9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.382867 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.390195 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/73c0a4a0-c4d4-4167-ae1a-05ffd7291c71-images\") pod \"machine-config-operator-74547568cd-vrz4d\" (UID: \"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.395136 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4cddffa7ccaa5e24db04abf2674da7c5e81ef09a0103af42f13b9f3c5883a520"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.395219 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0218e4a79165b9795cb79273f414ecbbed2fc55a082e36c31514283cbe607f60"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.397751 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"108150a9b0e1130a876f4081bce7c5e126a143a28a4e017610a66988b75e4338"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.397812 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"647e71b8c9438fe942f6baf969997bf5c88a859b0654e0cafc899754a0dddb4a"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.400079 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4bq8z" event={"ID":"857e0324-95f6-4524-b676-8919c08e97ca","Type":"ContainerStarted","Data":"6eddf224d5155ea21b4030544b7e20f13599043223c448f33f8c9d8c8837f44e"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.402192 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.402536 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" 
event={"ID":"0e8507b4-c6ad-476a-b2a6-71c17ad0014c","Type":"ContainerStarted","Data":"e0a7466ecba1fa7a2b54cade275881b826d53ab702be8441855af11d25437d78"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.402568 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" event={"ID":"0e8507b4-c6ad-476a-b2a6-71c17ad0014c","Type":"ContainerStarted","Data":"703ff6ce9bf383008f06db6bf9f394177afa2f72672ea26b7c2bfe199226ecb2"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.405551 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.411247 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2da09d43f341fda6117b35af9902a0dadd423ab611361a601e53c33f7c424abf"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.412384 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.412628 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22eaaf47-a99c-484f-9c2f-1aa78da1111c-serving-cert\") pod \"service-ca-operator-777779d784-wmk55\" (UID: \"22eaaf47-a99c-484f-9c2f-1aa78da1111c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.414165 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" 
event={"ID":"2371a2b2-ba78-48f3-839d-cd97b08f3243","Type":"ContainerStarted","Data":"fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.414194 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" event={"ID":"2371a2b2-ba78-48f3-839d-cd97b08f3243","Type":"ContainerStarted","Data":"ac27a332db38a10a59f56600b53805bfbde2300e428c82ab52ecd00966976c41"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.414613 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.416601 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6" event={"ID":"ecb847f9-d250-4d91-b74c-104acafc26fd","Type":"ContainerStarted","Data":"7db65176fa8f84a95d600908ce3a8e03bd6ccce2db11eadeb6a30674492b6831"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.416647 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6" event={"ID":"ecb847f9-d250-4d91-b74c-104acafc26fd","Type":"ContainerStarted","Data":"4b99475b71d92425bd1b0fe4cfa852f42ec41ff9c1e3da9dbafa016edc9b6c07"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.416663 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6" event={"ID":"ecb847f9-d250-4d91-b74c-104acafc26fd","Type":"ContainerStarted","Data":"79850ad3dfb0ef9c771b22e0f062c2c1ba44988068572a99ea2ac93cc952e4c3"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.420316 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" 
event={"ID":"09c96f57-a56c-4519-8101-b3aa96065974","Type":"ContainerStarted","Data":"82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.420457 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" event={"ID":"09c96f57-a56c-4519-8101-b3aa96065974","Type":"ContainerStarted","Data":"05ad84ce3c72b8bf9a02e0caed86a6405657fb180f477c83ed8dd15b897f2565"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.421076 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.422379 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.424851 5033 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jdtrx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.424899 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" podUID="09c96f57-a56c-4519-8101-b3aa96065974" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.425032 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5ef69ab11bc1ff9a96854cbc88742d0ebd90721af734eef06cc8c67737a91281"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 
15:43:22.425078 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5d85bb31365ab4113f08034d4b4b95d0067f4abb63542c4642fb74e137c3239c"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.426068 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.427834 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnh26\" (UniqueName: \"kubernetes.io/projected/b7c878e1-790a-487b-967e-bf476f6f22a9-kube-api-access-xnh26\") pod \"package-server-manager-789f6589d5-fghqv\" (UID: \"b7c878e1-790a-487b-967e-bf476f6f22a9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.428620 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.429314 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" event={"ID":"4aaf15fb-2839-42d5-8628-341eed11eee8","Type":"ContainerStarted","Data":"06f0e2b62520a0eb33c4ca89b66e90afe3103d54d556023498516a0d65413c31"} Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.429352 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" 
event={"ID":"4aaf15fb-2839-42d5-8628-341eed11eee8","Type":"ContainerStarted","Data":"042e9f3a634de55928c68fe5c1f5da57b6aa7bd2cb78b68607cf82c0895adb37"} Feb 26 15:43:22 crc kubenswrapper[5033]: E0226 15:43:22.430059 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:22.930041546 +0000 UTC m=+130.142640276 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.434951 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4vbq\" (UniqueName: \"kubernetes.io/projected/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-kube-api-access-v4vbq\") pod \"collect-profiles-29535330-57d7w\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.438670 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86sc6\" (UniqueName: \"kubernetes.io/projected/db1da5ad-1d5b-4c87-9d5d-340cf0536968-kube-api-access-86sc6\") pod \"catalog-operator-68c6474976-h294z\" (UID: \"db1da5ad-1d5b-4c87-9d5d-340cf0536968\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.439127 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql8t7\" 
(UniqueName: \"kubernetes.io/projected/df4d19a5-387f-464f-ad1f-373b31520bcf-kube-api-access-ql8t7\") pod \"olm-operator-6b444d44fb-f79xz\" (UID: \"df4d19a5-387f-464f-ad1f-373b31520bcf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.441359 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.441619 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q442s\" (UniqueName: \"kubernetes.io/projected/f545eb5d-2778-4ad4-84b0-7ca6b46e64e0-kube-api-access-q442s\") pod \"packageserver-d55dfcdfc-fnszf\" (UID: \"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.455974 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28t87\" (UniqueName: \"kubernetes.io/projected/4fdf7757-4d06-43cc-be27-0be0a8155b11-kube-api-access-28t87\") pod \"marketplace-operator-79b997595-8mgvk\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.463233 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.472469 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x44jf\" (UniqueName: \"kubernetes.io/projected/ef433248-3b40-4b06-86ff-f9fd9377c779-kube-api-access-x44jf\") pod \"console-operator-58897d9998-gs8vj\" (UID: \"ef433248-3b40-4b06-86ff-f9fd9377c779\") " pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.482769 5033 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.491656 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpm5v\" (UniqueName: \"kubernetes.io/projected/4547700c-d872-4fa8-bd19-d5d442dc38f7-kube-api-access-zpm5v\") pod \"authentication-operator-69f744f599-b6z8t\" (UID: \"4547700c-d872-4fa8-bd19-d5d442dc38f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.502061 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.509413 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt8w6\" (UniqueName: \"kubernetes.io/projected/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-kube-api-access-wt8w6\") pod \"route-controller-manager-6576b87f9c-nggcp\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.521805 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.530910 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.531070 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srqlr\" (UniqueName: 
\"kubernetes.io/projected/c63de0b0-6c5f-4ae1-960e-86a483b1ac68-kube-api-access-srqlr\") pod \"openshift-config-operator-7777fb866f-bxjw5\" (UID: \"c63de0b0-6c5f-4ae1-960e-86a483b1ac68\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:22 crc kubenswrapper[5033]: E0226 15:43:22.531178 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.031160818 +0000 UTC m=+130.243759468 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.542494 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.550820 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnz8p\" (UniqueName: \"kubernetes.io/projected/0b3f9223-93d9-4270-bbc2-c3235636f93f-kube-api-access-bnz8p\") pod \"apiserver-7bbb656c7d-kgzj7\" (UID: \"0b3f9223-93d9-4270-bbc2-c3235636f93f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.562587 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.568099 5033 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.582115 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.587746 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.603426 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.610488 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.622568 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.623488 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.633521 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:22 crc kubenswrapper[5033]: E0226 15:43:22.633998 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.133975721 +0000 UTC m=+130.346574451 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.642248 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.666815 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.677573 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.681321 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.684753 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.701762 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.711206 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rx9pn" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.722382 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.736628 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:22 crc kubenswrapper[5033]: E0226 15:43:22.737134 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.237115511 +0000 UTC m=+130.449714161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.742365 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.750829 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.762099 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.764071 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8l772" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.783377 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.791634 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.802238 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.822573 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.826856 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.837792 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:22 crc kubenswrapper[5033]: E0226 15:43:22.838137 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.338124071 +0000 UTC m=+130.550722721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.843648 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.844124 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.863483 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.870292 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.872878 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb42c6ae-cf0c-43e8-95e8-ab56d339ba31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fmqqx\" (UID: \"bb42c6ae-cf0c-43e8-95e8-ab56d339ba31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" Feb 26 15:43:22 crc kubenswrapper[5033]: W0226 15:43:22.879276 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c2a8653_71cf_4142_9fe7_9294ac24e231.slice/crio-ce1ce5181fd9134da774052cbc508fcc248a7a6c44eb90cc249f9a7721e5bcb6 WatchSource:0}: Error finding container ce1ce5181fd9134da774052cbc508fcc248a7a6c44eb90cc249f9a7721e5bcb6: Status 404 returned error can't find the container with id ce1ce5181fd9134da774052cbc508fcc248a7a6c44eb90cc249f9a7721e5bcb6 Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.885377 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.906223 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.906501 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.922765 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.924789 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.939918 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:22 crc kubenswrapper[5033]: E0226 15:43:22.940340 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.440312868 +0000 UTC m=+130.652911528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.952905 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.958949 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:22 crc kubenswrapper[5033]: I0226 15:43:22.965702 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.006833 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.010117 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.029964 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.030214 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.030873 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.041604 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.042211 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.043196 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.044082 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.544038844 +0000 UTC m=+130.756637494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.044782 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.053666 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.063655 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.068893 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.101627 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.103042 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.106318 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.114502 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8"] Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.114692 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.118974 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4"] Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.135480 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.138927 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.141979 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.142912 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.145964 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.146102 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.646086367 +0000 UTC m=+130.858685017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.146319 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.146624 5033 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.64661157 +0000 UTC m=+130.859210220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.154201 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g7xhm"] Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.165093 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.174036 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.186036 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.193352 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxmcp\" (UniqueName: \"kubernetes.io/projected/de9d193e-d622-4bc0-8827-3d55d3ecf448-kube-api-access-xxmcp\") pod \"csi-hostpathplugin-fkt29\" (UID: \"de9d193e-d622-4bc0-8827-3d55d3ecf448\") " pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.203505 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.203655 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.229114 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.241989 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.247997 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.248358 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.748343376 +0000 UTC m=+130.960942026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.248642 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gdld\" (UniqueName: \"kubernetes.io/projected/c42a886d-9392-4ba9-92d4-49763cfe4d10-kube-api-access-7gdld\") pod \"ingress-canary-8bjnz\" (UID: \"c42a886d-9392-4ba9-92d4-49763cfe4d10\") " pod="openshift-ingress-canary/ingress-canary-8bjnz" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.257113 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfbhz\" (UniqueName: \"kubernetes.io/projected/ea04ee0d-86a7-4603-9060-8852dd019a22-kube-api-access-gfbhz\") pod \"service-ca-9c57cc56f-sql8k\" (UID: \"ea04ee0d-86a7-4603-9060-8852dd019a22\") " pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.262572 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.301917 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.306355 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.322756 5033 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.325782 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fkt29" Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.349627 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.350083 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.850065143 +0000 UTC m=+131.062663783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.383147    5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=43.383124466 podStartE2EDuration="43.383124466s" podCreationTimestamp="2026-02-26 15:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:23.381673571 +0000 UTC m=+130.594272221" watchObservedRunningTime="2026-02-26 15:43:23.383124466 +0000 UTC m=+130.595723116"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.418565    5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" podStartSLOduration=68.418546934 podStartE2EDuration="1m8.418546934s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:23.415304186 +0000 UTC m=+130.627902856" watchObservedRunningTime="2026-02-26 15:43:23.418546934 +0000 UTC m=+130.631145594"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.434201    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-l4bg7" event={"ID":"6c2a8653-71cf-4142-9fe7-9294ac24e231","Type":"ContainerStarted","Data":"7a9a4d9d0d2309bd9a8c8fc1ec2f0fdc55f4e4e2f09d753d6d2a536c2c111cce"}
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.434368    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-l4bg7" event={"ID":"6c2a8653-71cf-4142-9fe7-9294ac24e231","Type":"ContainerStarted","Data":"ce1ce5181fd9134da774052cbc508fcc248a7a6c44eb90cc249f9a7721e5bcb6"}
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.435912    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" event={"ID":"93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c","Type":"ContainerStarted","Data":"cf45c0078077cade411e4d8c8a01724d46a2b9b070199b8ccac8fe9ff32f372a"}
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.439419    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rx9pn" event={"ID":"bb3fd26d-006b-46ba-864b-b79db95a0bda","Type":"ContainerStarted","Data":"53fac7a96afb9c8155339e524eddb7535fe0bf55ad40892bbaded8a0c508c6d9"}
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.439453    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rx9pn" event={"ID":"bb3fd26d-006b-46ba-864b-b79db95a0bda","Type":"ContainerStarted","Data":"e88ddb46fd20c13f832604c607b477d51e96f57bc731943c384467df038b4d00"}
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.442014    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4bq8z" event={"ID":"857e0324-95f6-4524-b676-8919c08e97ca","Type":"ContainerStarted","Data":"252035cce4c99f5a76d15086e74b3cdcb6e1eaaafa799ca95af65532f2c06e34"}
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.442091    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4bq8z" event={"ID":"857e0324-95f6-4524-b676-8919c08e97ca","Type":"ContainerStarted","Data":"af69c5f19fb41b1bd2a022ba887629725ceb9ab18df23007f4712eda64f2b1b0"}
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.450352    5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.450565    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.95053985 +0000 UTC m=+131.163138510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.450714    5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.451045    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:23.951030272 +0000 UTC m=+131.163628922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.451462    5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zndq6" podStartSLOduration=68.451441522 podStartE2EDuration="1m8.451441522s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:23.449534466 +0000 UTC m=+130.662133116" watchObservedRunningTime="2026-02-26 15:43:23.451441522 +0000 UTC m=+130.664040172"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.475189    5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlgxn\" (UniqueName: \"kubernetes.io/projected/22eaaf47-a99c-484f-9c2f-1aa78da1111c-kube-api-access-dlgxn\") pod \"service-ca-operator-777779d784-wmk55\" (UID: \"22eaaf47-a99c-484f-9c2f-1aa78da1111c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.492686    5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.502944    5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.512236    5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sql8k"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.526485    5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.529450    5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8bjnz"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.535697    5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" podStartSLOduration=68.53567442 podStartE2EDuration="1m8.53567442s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:23.505418434 +0000 UTC m=+130.718017104" watchObservedRunningTime="2026-02-26 15:43:23.53567442 +0000 UTC m=+130.748273090"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.554826    5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.554982    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:24.054952251 +0000 UTC m=+131.267550911 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.555741    5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.557015    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:24.05699886 +0000 UTC m=+131.269597600 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.584643    5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.655294    5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" podStartSLOduration=21.655225013 podStartE2EDuration="21.655225013s" podCreationTimestamp="2026-02-26 15:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:23.649297611 +0000 UTC m=+130.861896281" watchObservedRunningTime="2026-02-26 15:43:23.655225013 +0000 UTC m=+130.867823663"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.656431    5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.656598    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:24.156578315 +0000 UTC m=+131.369176985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.708717    5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7lvqc" podStartSLOduration=68.708658853 podStartE2EDuration="1m8.708658853s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:23.704680947 +0000 UTC m=+130.917279597" watchObservedRunningTime="2026-02-26 15:43:23.708658853 +0000 UTC m=+130.921257503"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.757531    5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.757930    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:24.257914592 +0000 UTC m=+131.470513242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.783427    5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-l4bg7" podStartSLOduration=68.783410444 podStartE2EDuration="1m8.783410444s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:23.783007363 +0000 UTC m=+130.995606013" watchObservedRunningTime="2026-02-26 15:43:23.783410444 +0000 UTC m=+130.996009094"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.784454    5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.794350    5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.810810    5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.811932    5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.845882    5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-l4bg7"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.856472    5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 15:43:23 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld
Feb 26 15:43:23 crc kubenswrapper[5033]: [+]process-running ok
Feb 26 15:43:23 crc kubenswrapper[5033]: healthz check failed
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.856536    5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.858479    5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.858847    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:24.358833049 +0000 UTC m=+131.571431699 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.874208    5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-rx9pn" podStartSLOduration=21.874193651 podStartE2EDuration="21.874193651s" podCreationTimestamp="2026-02-26 15:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:23.873155982 +0000 UTC m=+131.085754632" watchObservedRunningTime="2026-02-26 15:43:23.874193651 +0000 UTC m=+131.086792301"
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.933772    5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7dpkv"]
Feb 26 15:43:23 crc kubenswrapper[5033]: I0226 15:43:23.959497    5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:23 crc kubenswrapper[5033]: E0226 15:43:23.959807    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:24.459796249 +0000 UTC m=+131.672394899 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.001843    5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4bq8z" podStartSLOduration=69.001823271 podStartE2EDuration="1m9.001823271s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:23.995997605 +0000 UTC m=+131.208596335" watchObservedRunningTime="2026-02-26 15:43:24.001823271 +0000 UTC m=+131.214421911"
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.060427    5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.060686    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:24.56067222 +0000 UTC m=+131.773270870 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.162089    5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.162549    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:24.662529298 +0000 UTC m=+131.875127948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.262891    5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.263235    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:24.763218583 +0000 UTC m=+131.975817233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.297634    5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-tz87k"]
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.326009    5033 patch_prober.go:28] interesting pod/apiserver-76f77b778f-rgjzx container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]log ok
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]etcd ok
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]poststarthook/max-in-flight-filter ok
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Feb 26 15:43:24 crc kubenswrapper[5033]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Feb 26 15:43:24 crc kubenswrapper[5033]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]poststarthook/project.openshift.io-projectcache ok
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Feb 26 15:43:24 crc kubenswrapper[5033]: [-]poststarthook/openshift.io-startinformers failed: reason withheld
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]poststarthook/openshift.io-restmapperupdater ok
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 26 15:43:24 crc kubenswrapper[5033]: livez check failed
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.326071    5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx" podUID="4aaf15fb-2839-42d5-8628-341eed11eee8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.363996    5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.364344    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:24.864331471 +0000 UTC m=+132.076930121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.464824    5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.465246    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:24.965216462 +0000 UTC m=+132.177815112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.528438    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" event={"ID":"93272b5a-1ba9-4f23-b5ad-e51e6f6ad30c","Type":"ContainerStarted","Data":"2d2caf548097cb3116c86e7f47d487d8eeb785f80cf144962d4cedf24c8a7976"}
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.532839    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" event={"ID":"d80de8ac-f823-4cb8-93e7-f3a672b83729","Type":"ContainerStarted","Data":"9a212108062b6173a84cd1d217986646cdc96ecc2e57751131fd5d36a1c70323"}
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.532889    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" event={"ID":"d80de8ac-f823-4cb8-93e7-f3a672b83729","Type":"ContainerStarted","Data":"b6351de7bfff2d48d3093743d29ffa43c82ad0cb280121276c2e0c080469064f"}
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.532901    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" event={"ID":"d80de8ac-f823-4cb8-93e7-f3a672b83729","Type":"ContainerStarted","Data":"177eb61ca38ffa89b8dbac0f08ac6c476e04a79c6ada617822cc6329f145be5f"}
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.542458    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" event={"ID":"3bdab87a-cacc-4b04-87d0-1439b3db3efb","Type":"ContainerStarted","Data":"6be1c2efc5ec297a3386de412de2d6ed25a97e604e161151de3da5c742a8a12e"}
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.542499    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" event={"ID":"3bdab87a-cacc-4b04-87d0-1439b3db3efb","Type":"ContainerStarted","Data":"ce4bc74eebf5e3f73746c7ea3bfdfa905df3ee9ce36bca524eaefe95c3cb9650"}
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.550027    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm" event={"ID":"c80d44f2-ac5d-41e2-8abb-773410cde2a6","Type":"ContainerStarted","Data":"4d3f28b70f1a391a4d7170afeeaea637682852d44d4b0a32c19bc82269f8859f"}
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.550068    5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm" event={"ID":"c80d44f2-ac5d-41e2-8abb-773410cde2a6","Type":"ContainerStarted","Data":"6c4b8fb8f1b5c6c094597ddd15880560548029db3af02135154500a174a089aa"}
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.550897    5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbng4" podStartSLOduration=69.550884701 podStartE2EDuration="1m9.550884701s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:24.548238126 +0000 UTC m=+131.760836776" watchObservedRunningTime="2026-02-26 15:43:24.550884701 +0000 UTC m=+131.763483351"
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.568007    5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.568364    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:25.068350116 +0000 UTC m=+132.280948766 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.573652    5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zfxb8" podStartSLOduration=69.573616475 podStartE2EDuration="1m9.573616475s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:24.573189014 +0000 UTC m=+131.785787664" watchObservedRunningTime="2026-02-26 15:43:24.573616475 +0000 UTC m=+131.786215125"
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.593594    5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bdlcv"]
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.608564    5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nq49v"]
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.627222    5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd"]
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.632557    5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r"]
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.635620    5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg"]
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.651970    5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j"]
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.657759    5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx"]
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.665294    5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cnhpr"]
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.669574    5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.670593    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:25.170574585 +0000 UTC m=+132.383173235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.779849    5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.780243    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:25.280226425 +0000 UTC m=+132.492825075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.856658    5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 15:43:24 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld
Feb 26 15:43:24 crc kubenswrapper[5033]: [+]process-running ok
Feb 26 15:43:24 crc kubenswrapper[5033]: healthz check failed
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.856700    5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.881298    5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.881428    5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed.
No retries permitted until 2026-02-26 15:43:25.381399523 +0000 UTC m=+132.593998173 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.881546 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.881868 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:25.381859266 +0000 UTC m=+132.594457916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.966100 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf"] Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.982764 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.982897 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:25.482881181 +0000 UTC m=+132.695479831 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:24 crc kubenswrapper[5033]: I0226 15:43:24.983458 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:24 crc kubenswrapper[5033]: E0226 15:43:24.983843 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:25.483830458 +0000 UTC m=+132.696429108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.004390 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fkt29"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.012044 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fb82g"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.049171 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gs8vj"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.064844 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-b6z8t"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.068925 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.076983 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.085425 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:25 crc kubenswrapper[5033]: 
E0226 15:43:25.085667 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:25.585651746 +0000 UTC m=+132.798250396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.085958 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:25 crc kubenswrapper[5033]: E0226 15:43:25.086236 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:25.586228822 +0000 UTC m=+132.798827462 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.089184 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8mgvk"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.090543 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.095156 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.095212 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8bjnz"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.097589 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.099814 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sql8k"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.101898 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.105746 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv"] Feb 26 15:43:25 crc 
kubenswrapper[5033]: I0226 15:43:25.107322 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz"] Feb 26 15:43:25 crc kubenswrapper[5033]: W0226 15:43:25.107462 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b423453_fa78_454a_83c7_934c0fce8992.slice/crio-f88b857cdc78d4a61f090b7651f950bff9e48a60655cfea5f32d3eb9368a3dc6 WatchSource:0}: Error finding container f88b857cdc78d4a61f090b7651f950bff9e48a60655cfea5f32d3eb9368a3dc6: Status 404 returned error can't find the container with id f88b857cdc78d4a61f090b7651f950bff9e48a60655cfea5f32d3eb9368a3dc6 Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.108923 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8l772"] Feb 26 15:43:25 crc kubenswrapper[5033]: W0226 15:43:25.112352 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde9d193e_d622_4bc0_8827_3d55d3ecf448.slice/crio-8489b9b6552eef1b2d73c8964b016401b90f987958db3eae20148b7440d81ffb WatchSource:0}: Error finding container 8489b9b6552eef1b2d73c8964b016401b90f987958db3eae20148b7440d81ffb: Status 404 returned error can't find the container with id 8489b9b6552eef1b2d73c8964b016401b90f987958db3eae20148b7440d81ffb Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.112424 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.112460 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d"] Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.114007 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wmk55"] Feb 26 15:43:25 crc kubenswrapper[5033]: W0226 15:43:25.128796 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef433248_3b40_4b06_86ff_f9fd9377c779.slice/crio-a080652be467ba9b109e3ca8de026b6dc2f193fb166a480e1685ef3441173232 WatchSource:0}: Error finding container a080652be467ba9b109e3ca8de026b6dc2f193fb166a480e1685ef3441173232: Status 404 returned error can't find the container with id a080652be467ba9b109e3ca8de026b6dc2f193fb166a480e1685ef3441173232 Feb 26 15:43:25 crc kubenswrapper[5033]: W0226 15:43:25.130986 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4547700c_d872_4fa8_bd19_d5d442dc38f7.slice/crio-5771e469cf31cb75e7ed994f113051f5325428889243b310e547f8529e57472a WatchSource:0}: Error finding container 5771e469cf31cb75e7ed994f113051f5325428889243b310e547f8529e57472a: Status 404 returned error can't find the container with id 5771e469cf31cb75e7ed994f113051f5325428889243b310e547f8529e57472a Feb 26 15:43:25 crc kubenswrapper[5033]: W0226 15:43:25.153851 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb42c6ae_cf0c_43e8_95e8_ab56d339ba31.slice/crio-4020db7817a6fe6fca82f29bf52e379d0f890f68259b81129f8e303f985ebb2e WatchSource:0}: Error finding container 4020db7817a6fe6fca82f29bf52e379d0f890f68259b81129f8e303f985ebb2e: Status 404 returned error can't find the container with id 4020db7817a6fe6fca82f29bf52e379d0f890f68259b81129f8e303f985ebb2e Feb 26 15:43:25 crc kubenswrapper[5033]: W0226 15:43:25.171892 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea04ee0d_86a7_4603_9060_8852dd019a22.slice/crio-048da203519f769799ff2dbc0ee265574f3c98d04909a59195151185fdc402bd WatchSource:0}: Error finding container 048da203519f769799ff2dbc0ee265574f3c98d04909a59195151185fdc402bd: Status 404 returned error can't find the container with id 048da203519f769799ff2dbc0ee265574f3c98d04909a59195151185fdc402bd Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.186715 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:25 crc kubenswrapper[5033]: E0226 15:43:25.187371 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:25.687338349 +0000 UTC m=+132.899936999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.289066 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:25 crc kubenswrapper[5033]: E0226 15:43:25.289450 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:25.789437114 +0000 UTC m=+133.002035754 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.390553 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:25 crc kubenswrapper[5033]: E0226 15:43:25.391040 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:25.891010305 +0000 UTC m=+133.103608945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.500896 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:25 crc kubenswrapper[5033]: E0226 15:43:25.501984 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.001966631 +0000 UTC m=+133.214565281 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.525085 5033 ???:1] "http: TLS handshake error from 192.168.126.11:38414: no serving certificate available for the kubelet" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.557269 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" event={"ID":"b7c878e1-790a-487b-967e-bf476f6f22a9","Type":"ContainerStarted","Data":"cd878b80d279c75f19cea39cb04b58dcf2e57148e01a81f8ef8fdba9dd87ed13"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.558175 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" event={"ID":"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8","Type":"ContainerStarted","Data":"59651b49c8850a79b5ef516926552b0874ee164434c1ce2d8bb25fef6bd1f76c"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.558198 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" event={"ID":"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8","Type":"ContainerStarted","Data":"1336f7e96f042a33ef9fac9427f93166ec9d80e2ce17535cef9cbcaea0a70d61"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.558871 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" 
event={"ID":"4547700c-d872-4fa8-bd19-d5d442dc38f7","Type":"ContainerStarted","Data":"5771e469cf31cb75e7ed994f113051f5325428889243b310e547f8529e57472a"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.567726 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" event={"ID":"22eaaf47-a99c-484f-9c2f-1aa78da1111c","Type":"ContainerStarted","Data":"479c16a46581a73f4ec497d82c4c0d726ef1ce6589e8e701f5dd1448e3861c03"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.580937 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" event={"ID":"4fdf7757-4d06-43cc-be27-0be0a8155b11","Type":"ContainerStarted","Data":"e2ce26c3cb0b01b8fd00c000e9750607af4d5b091ef2066cfd45cda39500fe43"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.589485 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" event={"ID":"db1da5ad-1d5b-4c87-9d5d-340cf0536968","Type":"ContainerStarted","Data":"f2cd5b8e0a52c546e1fb71c3dcbcbac1bbaa9dee5688ec42e03572bd7c7a8be1"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.590570 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" event={"ID":"ea04ee0d-86a7-4603-9060-8852dd019a22","Type":"ContainerStarted","Data":"048da203519f769799ff2dbc0ee265574f3c98d04909a59195151185fdc402bd"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.602012 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:25 crc kubenswrapper[5033]: E0226 15:43:25.602347 5033 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.102333057 +0000 UTC m=+133.314931697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.608053 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" event={"ID":"c263b9f1-4544-4400-a1d0-484b593ceab3","Type":"ContainerStarted","Data":"e6f3de0b3e8862b24a293dfa339d459e43b2e8d95b769c1988e059a71fffd79c"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.608099 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" event={"ID":"c263b9f1-4544-4400-a1d0-484b593ceab3","Type":"ContainerStarted","Data":"c8569104448b62fac270ee5743d05e320ecba3debd4673b7d97d7d49ac8dd02b"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.618714 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" event={"ID":"7875ee6c-fa47-4620-b515-aad264da7c5a","Type":"ContainerStarted","Data":"3fcb5a265dac35aa055d6deacfe668c077c250af32fc5a013a14126d2228b042"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.618763 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" event={"ID":"7875ee6c-fa47-4620-b515-aad264da7c5a","Type":"ContainerStarted","Data":"fbcbe226368f672d3ba5222bc564f0188c9e40a02e6daaf68279225c481220f5"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.622847 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8bjnz" event={"ID":"c42a886d-9392-4ba9-92d4-49763cfe4d10","Type":"ContainerStarted","Data":"f25d507a303d939b0f471420a0e920f875ca4ecafadb71cad159d265a9d974e7"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.631295 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hzp8j" podStartSLOduration=70.631271537 podStartE2EDuration="1m10.631271537s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:25.626465661 +0000 UTC m=+132.839064331" watchObservedRunningTime="2026-02-26 15:43:25.631271537 +0000 UTC m=+132.843870197" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.669094 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2tf4r" podStartSLOduration=70.66906611900001 podStartE2EDuration="1m10.669066119s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:25.661639649 +0000 UTC m=+132.874238319" watchObservedRunningTime="2026-02-26 15:43:25.669066119 +0000 UTC m=+132.881664769" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.683689 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" 
event={"ID":"3bdab87a-cacc-4b04-87d0-1439b3db3efb","Type":"ContainerStarted","Data":"4ed65d86d46d49aa507bad1156a8c64517989288f8b9fe1079b45030079d3949"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.683829 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" event={"ID":"0b3f9223-93d9-4270-bbc2-c3235636f93f","Type":"ContainerStarted","Data":"fdce6c171d26924450d3518c2e4a927934925f355db7b9eea7a6738fa01dc7ff"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.683901 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5" event={"ID":"4b423453-fa78-454a-83c7-934c0fce8992","Type":"ContainerStarted","Data":"6a54afb263ea8c342d9fa2eda0ef3a383fa0ab62ef44631e5682bf56bbf71f51"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.683973 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5" event={"ID":"4b423453-fa78-454a-83c7-934c0fce8992","Type":"ContainerStarted","Data":"f88b857cdc78d4a61f090b7651f950bff9e48a60655cfea5f32d3eb9368a3dc6"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.687214 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" event={"ID":"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28","Type":"ContainerStarted","Data":"6785d2ffa802839a2f7e54b98b5c7ffe0c41eaf265c5f9b7ae68162e01f398fe"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.687268 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" event={"ID":"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28","Type":"ContainerStarted","Data":"16d37fff6ffeb502522ddfb1a11c3a23c63d63c6402baa510bd3f57f9a48c352"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.687279 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" event={"ID":"e773b05c-cf02-4b49-8a2d-6e9a5e8bea28","Type":"ContainerStarted","Data":"debc9893d5076b024086cf131452dcc86a2c4ba2078793be2afb97ed61e15dc8"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.689144 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" event={"ID":"df4d19a5-387f-464f-ad1f-373b31520bcf","Type":"ContainerStarted","Data":"ba845460fd850b219651c662fe1cf1e2bfbbec5369b50cac5c96d293f355bb01"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.701603 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nq49v" event={"ID":"d6d762dc-b9e3-44fd-a835-78df146dc35a","Type":"ContainerStarted","Data":"db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.701660 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nq49v" event={"ID":"d6d762dc-b9e3-44fd-a835-78df146dc35a","Type":"ContainerStarted","Data":"6503095f95f4ec7e815ae94bdd698f6c1fc9811a91dcd2ab3b9587b8edf5a28f"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.703798 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:25 crc kubenswrapper[5033]: E0226 15:43:25.704197 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-26 15:43:26.204182735 +0000 UTC m=+133.416781385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.724555 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm" event={"ID":"c80d44f2-ac5d-41e2-8abb-773410cde2a6","Type":"ContainerStarted","Data":"77b48f775f4325a6fee730a5aa0649f8d883ffe5463dcb8ed7004bf35a4ba396"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.726700 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7dpkv" podStartSLOduration=70.726685034 podStartE2EDuration="1m10.726685034s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:25.725830639 +0000 UTC m=+132.938429299" watchObservedRunningTime="2026-02-26 15:43:25.726685034 +0000 UTC m=+132.939283684" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.728223 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fkt29" event={"ID":"de9d193e-d622-4bc0-8827-3d55d3ecf448","Type":"ContainerStarted","Data":"8489b9b6552eef1b2d73c8964b016401b90f987958db3eae20148b7440d81ffb"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.748878 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" event={"ID":"65c7bb02-dc66-4754-8623-8d1be2932557","Type":"ContainerStarted","Data":"a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.748928 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" event={"ID":"65c7bb02-dc66-4754-8623-8d1be2932557","Type":"ContainerStarted","Data":"da51268416a6118244b5cbcea2c67909a863c1ef3a36f82df7440ec24fb560b7"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.749556 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.752872 5033 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bdlcv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.752935 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" podUID="65c7bb02-dc66-4754-8623-8d1be2932557" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.754747 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" event={"ID":"f3517c8c-cdca-439b-84ee-57d1b41ebcd2","Type":"ContainerStarted","Data":"08b4dc787bf70fdd6599983e77ec466d4a93b91bad632dbeae8b1831bc672cf9"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.754786 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" event={"ID":"f3517c8c-cdca-439b-84ee-57d1b41ebcd2","Type":"ContainerStarted","Data":"133f68863b249e6a41e146af708af056ba5b44f454210b1fe81fac256030a154"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.759318 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" event={"ID":"bb42c6ae-cf0c-43e8-95e8-ab56d339ba31","Type":"ContainerStarted","Data":"4020db7817a6fe6fca82f29bf52e379d0f890f68259b81129f8e303f985ebb2e"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.762630 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" event={"ID":"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74","Type":"ContainerStarted","Data":"7bfd31ae5d49a3dab79e982baaa5055b867806d6588c1d185055d7c56c2030cb"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.767227 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nq49v" podStartSLOduration=70.767206522 podStartE2EDuration="1m10.767206522s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:25.753454503 +0000 UTC m=+132.966053163" watchObservedRunningTime="2026-02-26 15:43:25.767206522 +0000 UTC m=+132.979805162" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.793455 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" event={"ID":"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38","Type":"ContainerStarted","Data":"a2e2f87ef37d1e4eced88681c061c4355c00ab710ee49a6344e2b05c2529b096"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.801948 5033 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-g7xhm" podStartSLOduration=70.801932547 podStartE2EDuration="1m10.801932547s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:25.777477564 +0000 UTC m=+132.990076214" watchObservedRunningTime="2026-02-26 15:43:25.801932547 +0000 UTC m=+133.014531197" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.802622 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xtmb5" podStartSLOduration=70.802617487 podStartE2EDuration="1m10.802617487s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:25.800987341 +0000 UTC m=+133.013585991" watchObservedRunningTime="2026-02-26 15:43:25.802617487 +0000 UTC m=+133.015216137" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.811088 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" event={"ID":"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71","Type":"ContainerStarted","Data":"9a95206f6a5368c455833da7c4314a30c836c481d14d84b35e9b9f9421d5362d"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.813461 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:25 crc kubenswrapper[5033]: E0226 15:43:25.813647 5033 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.313620889 +0000 UTC m=+133.526219539 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.814137 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:25 crc kubenswrapper[5033]: E0226 15:43:25.817783 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.317772517 +0000 UTC m=+133.530371167 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.821442 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8l772" event={"ID":"c852046c-87e9-42a6-9082-2d7390d6f71e","Type":"ContainerStarted","Data":"dfa23a5c0072a74c4361540f9bb4a9b9e9dc40b3baff85f7bb015cb7bdc5a8e0"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.828997 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6ggkg" podStartSLOduration=70.828982895 podStartE2EDuration="1m10.828982895s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:25.827417439 +0000 UTC m=+133.040016109" watchObservedRunningTime="2026-02-26 15:43:25.828982895 +0000 UTC m=+133.041581545" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.837695 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" event={"ID":"e612fd1f-3961-484f-a94c-18a54b4bf1ab","Type":"ContainerStarted","Data":"1c70480b8a5ba5d590578e13f33ae177363e8b80e56ae439f45837aa22cd93c6"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.837753 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" 
event={"ID":"e612fd1f-3961-484f-a94c-18a54b4bf1ab","Type":"ContainerStarted","Data":"32262493bd3ba8c259a68a24d06235b42e74d3076220c1e40b71a8ab99e1b1b5"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.839100 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" event={"ID":"c63de0b0-6c5f-4ae1-960e-86a483b1ac68","Type":"ContainerStarted","Data":"eb2fd4b27ac26af18e02bdc812fb4da0744cefc41fbcd127ad19f84fc829ef62"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.848971 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" podStartSLOduration=70.848952461 podStartE2EDuration="1m10.848952461s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:25.848131027 +0000 UTC m=+133.060729687" watchObservedRunningTime="2026-02-26 15:43:25.848952461 +0000 UTC m=+133.061551121" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.850582 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fb82g" event={"ID":"40828720-7437-44c4-993e-a757c56e571b","Type":"ContainerStarted","Data":"d4be066bb3daac8c808492a57fde20e336dda6e75cc9c0c028d7af9b422bf058"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.857951 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:25 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:25 crc kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:25 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.858008 5033 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.858531 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" event={"ID":"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0","Type":"ContainerStarted","Data":"794594778bd202efad6d009b5e629a8a8befedca184bebf262d1c24217e4cd38"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.858558 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" event={"ID":"f545eb5d-2778-4ad4-84b0-7ca6b46e64e0","Type":"ContainerStarted","Data":"3ceccbb3a61c4b202735ccd2f7e67d20bf80cdb29d015311e3e3049fdd2d8f80"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.859844 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.862152 5033 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fnszf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.862207 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" podUID="f545eb5d-2778-4ad4-84b0-7ca6b46e64e0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.876935 5033 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" podUID="2371a2b2-ba78-48f3-839d-cd97b08f3243" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" gracePeriod=30 Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.877063 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gs8vj" event={"ID":"ef433248-3b40-4b06-86ff-f9fd9377c779","Type":"ContainerStarted","Data":"a080652be467ba9b109e3ca8de026b6dc2f193fb166a480e1685ef3441173232"} Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.889466 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nmrx" podStartSLOduration=70.889447859 podStartE2EDuration="1m10.889447859s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:25.888957025 +0000 UTC m=+133.101555675" watchObservedRunningTime="2026-02-26 15:43:25.889447859 +0000 UTC m=+133.102046509" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.889829 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" podStartSLOduration=70.889820849 podStartE2EDuration="1m10.889820849s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:25.867761054 +0000 UTC m=+133.080359694" watchObservedRunningTime="2026-02-26 15:43:25.889820849 +0000 UTC m=+133.102419499" Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.919035 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:25 crc kubenswrapper[5033]: E0226 15:43:25.919892 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.419862201 +0000 UTC m=+133.632460851 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:25 crc kubenswrapper[5033]: I0226 15:43:25.920410 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:25 crc kubenswrapper[5033]: E0226 15:43:25.926637 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.426621643 +0000 UTC m=+133.639220293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.021547 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.021634 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.521598606 +0000 UTC m=+133.734197256 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.022105 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.022480 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.522467261 +0000 UTC m=+133.735065911 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.123209 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.123615 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.623581298 +0000 UTC m=+133.836179948 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.223866 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.224156 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.72414095 +0000 UTC m=+133.936739600 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.325396 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.325605 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.825577066 +0000 UTC m=+134.038175716 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.326040 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.326446 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.826436661 +0000 UTC m=+134.039035311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.427816 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.428041 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.928009781 +0000 UTC m=+134.140608431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.428367 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.428691 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:26.92868293 +0000 UTC m=+134.141281580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.528863 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.529152 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.029124159 +0000 UTC m=+134.241722819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.630789 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.631290 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.131272066 +0000 UTC m=+134.343870716 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.732370 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.732598 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.232563458 +0000 UTC m=+134.445162108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.733064 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.733436 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.233425963 +0000 UTC m=+134.446024613 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.834457 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.834675 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.334660433 +0000 UTC m=+134.547259073 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.848150 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:26 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:26 crc kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:26 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.848195 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.885041 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" event={"ID":"ea04ee0d-86a7-4603-9060-8852dd019a22","Type":"ContainerStarted","Data":"112263c23348851f49cd6b1afa96336bf6a36ab272b1f3fab20912b2d6bf28ec"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.886853 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" event={"ID":"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74","Type":"ContainerStarted","Data":"321ba68642c60ac3c778428e35e4e851d36924d7675f9116bfc94d29f1047d9e"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 
15:43:26.888174 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" event={"ID":"4fdf7757-4d06-43cc-be27-0be0a8155b11","Type":"ContainerStarted","Data":"5d8830706bf9e10a080568e69feea28da653a9a09f7e195348be9abcb238cdff"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.888539 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.889784 5033 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8mgvk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.889867 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" podUID="4fdf7757-4d06-43cc-be27-0be0a8155b11" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.890155 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fkt29" event={"ID":"de9d193e-d622-4bc0-8827-3d55d3ecf448","Type":"ContainerStarted","Data":"323864cd04b26621172ce793840e0fdbc37172669f7d586410be02a8f77e33f3"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.895924 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" event={"ID":"e612fd1f-3961-484f-a94c-18a54b4bf1ab","Type":"ContainerStarted","Data":"94ccdc5734998be70922102c745fa9a0bf60bb7aea805d6d295b2fe86cc00b7c"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.918931 5033 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" event={"ID":"9807d6e2-9450-4da2-a3f5-aea1a3b70cb8","Type":"ContainerStarted","Data":"74a71ef9b5c0d5f08ed9e38d95efc817b718ad6fb4427b7e5990dd704b7a25ce"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.924248 5033 generic.go:334] "Generic (PLEG): container finished" podID="0b3f9223-93d9-4270-bbc2-c3235636f93f" containerID="014f07c4e1fffa4365d5034e646e9a9fb149cf931a906ad1fd9b265643165705" exitCode=0 Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.924640 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" event={"ID":"0b3f9223-93d9-4270-bbc2-c3235636f93f","Type":"ContainerDied","Data":"014f07c4e1fffa4365d5034e646e9a9fb149cf931a906ad1fd9b265643165705"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.937369 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" podStartSLOduration=71.937345545 podStartE2EDuration="1m11.937345545s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:25.917892305 +0000 UTC m=+133.130490955" watchObservedRunningTime="2026-02-26 15:43:26.937345545 +0000 UTC m=+134.149944195" Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.938365 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-sql8k" podStartSLOduration=71.938358034 podStartE2EDuration="1m11.938358034s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:26.923004638 +0000 UTC m=+134.135603288" watchObservedRunningTime="2026-02-26 15:43:26.938358034 +0000 UTC 
m=+134.150956684" Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.939217 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:26 crc kubenswrapper[5033]: E0226 15:43:26.940726 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.440707581 +0000 UTC m=+134.653306231 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.941985 5033 generic.go:334] "Generic (PLEG): container finished" podID="c63de0b0-6c5f-4ae1-960e-86a483b1ac68" containerID="0f42b320d324d982d301ce02400d19e687ede9fcae515a43a1e91b98d3d5157e" exitCode=0 Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.942154 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" event={"ID":"c63de0b0-6c5f-4ae1-960e-86a483b1ac68","Type":"ContainerDied","Data":"0f42b320d324d982d301ce02400d19e687ede9fcae515a43a1e91b98d3d5157e"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.951594 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" event={"ID":"db1da5ad-1d5b-4c87-9d5d-340cf0536968","Type":"ContainerStarted","Data":"15b200e3f49f54814c7df5d307a998b5db1e6079d2c484bcac9580c39fa792b2"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.952616 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.956424 5033 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-h294z container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.956491 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" podUID="db1da5ad-1d5b-4c87-9d5d-340cf0536968" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.964869 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gs8vj" event={"ID":"ef433248-3b40-4b06-86ff-f9fd9377c779","Type":"ContainerStarted","Data":"393ff065c2726794699522e8d306db0039bf482872aa9dbf47587dec7989a7ff"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.964882 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-cnhpr" podStartSLOduration=71.964866195 podStartE2EDuration="1m11.964866195s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-26 15:43:26.964301289 +0000 UTC m=+134.176899959" watchObservedRunningTime="2026-02-26 15:43:26.964866195 +0000 UTC m=+134.177464845" Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.965909 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gs8vj" Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.970477 5033 patch_prober.go:28] interesting pod/console-operator-58897d9998-gs8vj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.973146 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gs8vj" podUID="ef433248-3b40-4b06-86ff-f9fd9377c779" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.993590 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" event={"ID":"b7c878e1-790a-487b-967e-bf476f6f22a9","Type":"ContainerStarted","Data":"5b5258a4e09a296eb9061d7f72c7a18ed7cfe256acb4dfe985950820715ca592"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.993686 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" event={"ID":"b7c878e1-790a-487b-967e-bf476f6f22a9","Type":"ContainerStarted","Data":"33400f7eec62c4079197edab8b7859ae6041ae9f7db8f034cc24da9e421bb913"} Feb 26 15:43:26 crc kubenswrapper[5033]: I0226 15:43:26.994487 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:26.998784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" event={"ID":"4547700c-d872-4fa8-bd19-d5d442dc38f7","Type":"ContainerStarted","Data":"5138f88021856e238e141c32c06a6102f3f5dca46d71a23bc4c10de968265dc8"} Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.006299 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" podStartSLOduration=72.006272819 podStartE2EDuration="1m12.006272819s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.004811658 +0000 UTC m=+134.217410308" watchObservedRunningTime="2026-02-26 15:43:27.006272819 +0000 UTC m=+134.218871469" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.007297 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" event={"ID":"df4d19a5-387f-464f-ad1f-373b31520bcf","Type":"ContainerStarted","Data":"f0e91c31a64e3526e425f6389f80853d75929f408e6a58f7e8c220779036e18e"} Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.007910 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.031136 5033 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-f79xz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.031211 5033 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" podUID="df4d19a5-387f-464f-ad1f-373b31520bcf" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.031977 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" event={"ID":"22eaaf47-a99c-484f-9c2f-1aa78da1111c","Type":"ContainerStarted","Data":"df1230f2eb1dcac3327ab9cd9fa80d3b79be89df5ad37ba4c4d79c84ceb8c6bc"} Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.042955 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.043856 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.543835424 +0000 UTC m=+134.756434074 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.053069 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fmqqx" event={"ID":"bb42c6ae-cf0c-43e8-95e8-ab56d339ba31","Type":"ContainerStarted","Data":"d0b6f7cf4fe7c9719ab725b053484e3fcdf3e196d573311d735e02e0a2373194"} Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.068713 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" event={"ID":"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38","Type":"ContainerStarted","Data":"24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4"} Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.068786 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.077710 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8l772" event={"ID":"c852046c-87e9-42a6-9082-2d7390d6f71e","Type":"ContainerStarted","Data":"de3487d052993c7403d1eddabbf6b7b89ca9b3c0884e2807349702c2da9cab81"} Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.078008 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8l772" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.079631 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-fb82g" event={"ID":"40828720-7437-44c4-993e-a757c56e571b","Type":"ContainerStarted","Data":"47c215a11513eb0dacde320e167f899bcdff212118d18485028b0cd6b089fcbb"} Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.079665 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fb82g" event={"ID":"40828720-7437-44c4-993e-a757c56e571b","Type":"ContainerStarted","Data":"bbba52d32adc6ef6ddf1e4ab6c33ae5747b354d2afb0ed14b77901a2f3bb7b8a"} Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.079878 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.080799 5033 patch_prober.go:28] interesting pod/downloads-7954f5f757-8l772 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.080850 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8l772" podUID="c852046c-87e9-42a6-9082-2d7390d6f71e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.083224 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" event={"ID":"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71","Type":"ContainerStarted","Data":"b08ff198c297cc053689002a76f9f743707da7e0740899a130e9976e6c2ec3cc"} Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.083306 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" 
event={"ID":"73c0a4a0-c4d4-4167-ae1a-05ffd7291c71","Type":"ContainerStarted","Data":"69c78ab21aac452e824a9ed45a40c7619b5e086644700c2aee73bc85962f393d"} Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.088569 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5jgbd" podStartSLOduration=72.088552682 podStartE2EDuration="1m12.088552682s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.087659778 +0000 UTC m=+134.300258438" watchObservedRunningTime="2026-02-26 15:43:27.088552682 +0000 UTC m=+134.301151332" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.093971 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" podStartSLOduration=73.093945096 podStartE2EDuration="1m13.093945096s" podCreationTimestamp="2026-02-26 15:42:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.0542418 +0000 UTC m=+134.266840450" watchObservedRunningTime="2026-02-26 15:43:27.093945096 +0000 UTC m=+134.306543746" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.099528 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8bjnz" event={"ID":"c42a886d-9392-4ba9-92d4-49763cfe4d10","Type":"ContainerStarted","Data":"29752608d1eb8e8178214fac1fea9033ff9005a42812adf256dd22f2cbcec7ca"} Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.144574 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.145652 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.148800 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.648787651 +0000 UTC m=+134.861386291 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.153841 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" podStartSLOduration=72.153823854 podStartE2EDuration="1m12.153823854s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.123301008 +0000 UTC m=+134.335899658" watchObservedRunningTime="2026-02-26 15:43:27.153823854 +0000 UTC m=+134.366422504" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.155468 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-b6z8t" 
podStartSLOduration=72.15546107 podStartE2EDuration="1m12.15546107s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.153121883 +0000 UTC m=+134.365720533" watchObservedRunningTime="2026-02-26 15:43:27.15546107 +0000 UTC m=+134.368059720" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.213476 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8l772" podStartSLOduration=72.213459324 podStartE2EDuration="1m12.213459324s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.210851721 +0000 UTC m=+134.423450371" watchObservedRunningTime="2026-02-26 15:43:27.213459324 +0000 UTC m=+134.426057974" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.248460 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.248651 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.748623771 +0000 UTC m=+134.961222421 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.249178 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.249439 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.749427985 +0000 UTC m=+134.962026635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.329991 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" podStartSLOduration=72.329970769 podStartE2EDuration="1m12.329970769s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.288687068 +0000 UTC m=+134.501285718" watchObservedRunningTime="2026-02-26 15:43:27.329970769 +0000 UTC m=+134.542569419" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.330389 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fnszf" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.330616 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv" podStartSLOduration=72.330608927 podStartE2EDuration="1m12.330608927s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.327306793 +0000 UTC m=+134.539905433" watchObservedRunningTime="2026-02-26 15:43:27.330608927 +0000 UTC m=+134.543207567" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.351596 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.352277 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.852221239 +0000 UTC m=+135.064819889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.361888 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fb82g" podStartSLOduration=25.361870603 podStartE2EDuration="25.361870603s" podCreationTimestamp="2026-02-26 15:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.358428696 +0000 UTC m=+134.571027346" watchObservedRunningTime="2026-02-26 15:43:27.361870603 +0000 UTC m=+134.574469253" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.393827 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wmk55" podStartSLOduration=72.393808549 podStartE2EDuration="1m12.393808549s" 
podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.389770804 +0000 UTC m=+134.602369454" watchObservedRunningTime="2026-02-26 15:43:27.393808549 +0000 UTC m=+134.606407199" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.453547 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.454017 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:27.954000946 +0000 UTC m=+135.166599596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.484409 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-gs8vj" podStartSLOduration=72.484388658 podStartE2EDuration="1m12.484388658s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.482085732 +0000 UTC m=+134.694684382" watchObservedRunningTime="2026-02-26 15:43:27.484388658 +0000 UTC m=+134.696987308" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.484587 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vrz4d" podStartSLOduration=72.484580703 podStartE2EDuration="1m12.484580703s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.457639029 +0000 UTC m=+134.670237679" watchObservedRunningTime="2026-02-26 15:43:27.484580703 +0000 UTC m=+134.697179343" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.537659 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" podStartSLOduration=72.537631048 podStartE2EDuration="1m12.537631048s" podCreationTimestamp="2026-02-26 15:42:15 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.524790004 +0000 UTC m=+134.737388654" watchObservedRunningTime="2026-02-26 15:43:27.537631048 +0000 UTC m=+134.750229698" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.554723 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.554932 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.054893487 +0000 UTC m=+135.267492137 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.555032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.555497 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.055481514 +0000 UTC m=+135.268080164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.656778 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.657089 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.157064704 +0000 UTC m=+135.369663354 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.657473 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.657771 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.157758804 +0000 UTC m=+135.370357454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.665590 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.709740 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8bjnz" podStartSLOduration=25.709710577 podStartE2EDuration="25.709710577s" podCreationTimestamp="2026-02-26 15:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:27.632324563 +0000 UTC m=+134.844923213" watchObservedRunningTime="2026-02-26 15:43:27.709710577 +0000 UTC m=+134.922309227" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.757885 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.758163 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 15:43:28.25811843 +0000 UTC m=+135.470717080 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.850918 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:27 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:27 crc kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:27 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.851016 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.859052 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.859452 5033 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.359434483 +0000 UTC m=+135.572033133 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.960478 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.960766 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.460724955 +0000 UTC m=+135.673323605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:27 crc kubenswrapper[5033]: I0226 15:43:27.960826 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:27 crc kubenswrapper[5033]: E0226 15:43:27.961329 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.461315653 +0000 UTC m=+135.673914303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.061650 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.061858 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.561829362 +0000 UTC m=+135.774428002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.061902 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.062521 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.562490231 +0000 UTC m=+135.775088881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.104599 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fkt29" event={"ID":"de9d193e-d622-4bc0-8827-3d55d3ecf448","Type":"ContainerStarted","Data":"ad062cdd5ff071ba30545b89fd6af5fc972f4ae18ed1edd819bae4cfe5c9d974"} Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.106868 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" event={"ID":"0b3f9223-93d9-4270-bbc2-c3235636f93f","Type":"ContainerStarted","Data":"898ea9df4e32044eb29b7e9c7169d1cafc0c9ba276077954cd1e7b3ab7750c31"} Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.109683 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" event={"ID":"c63de0b0-6c5f-4ae1-960e-86a483b1ac68","Type":"ContainerStarted","Data":"2ef7e8a25f485ab6fab7018d2b859bd46af78f403a7227fb5dc8a8e3adc489a5"} Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.109711 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.111892 5033 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8mgvk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" 
start-of-body= Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.111926 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" podUID="4fdf7757-4d06-43cc-be27-0be0a8155b11" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.113585 5033 patch_prober.go:28] interesting pod/downloads-7954f5f757-8l772 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.113609 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8l772" podUID="c852046c-87e9-42a6-9082-2d7390d6f71e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.121021 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-f79xz" Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.123810 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h294z" Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.140130 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.140168 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.142120 5033 
patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-kgzj7 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.142187 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" podUID="0b3f9223-93d9-4270-bbc2-c3235636f93f" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.162127 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" podStartSLOduration=73.162108966 podStartE2EDuration="1m13.162108966s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:28.1428607 +0000 UTC m=+135.355459350" watchObservedRunningTime="2026-02-26 15:43:28.162108966 +0000 UTC m=+135.374707616" Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.162978 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.163114 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 15:43:28.663078504 +0000 UTC m=+135.875677154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.164333 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.170212 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.670199396 +0000 UTC m=+135.882798046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.190444 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" podStartSLOduration=73.190423409 podStartE2EDuration="1m13.190423409s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:28.19008984 +0000 UTC m=+135.402688490" watchObservedRunningTime="2026-02-26 15:43:28.190423409 +0000 UTC m=+135.403022059" Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.270917 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.272030 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.772015132 +0000 UTC m=+135.984613782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.373306 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.373709 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.873691536 +0000 UTC m=+136.086290186 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.429032 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bdlcv"] Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.474158 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.474335 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.97431243 +0000 UTC m=+136.186911080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.474574 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.474895 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:28.974887596 +0000 UTC m=+136.187486246 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.503982 5033 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.548414 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp"]
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.575414 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.575604 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:29.075575661 +0000 UTC m=+136.288174311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.575658 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.576000 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:29.075993343 +0000 UTC m=+136.288591993 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.676800 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.676985 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:29.176932186 +0000 UTC m=+136.389530836 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.677218 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.678913 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:29.178899641 +0000 UTC m=+136.391498301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.705694 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gs8vj"
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.778591 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.778890 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:29.278875956 +0000 UTC m=+136.491474606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.817921 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx"
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.827519 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-rgjzx"
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.848654 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 15:43:28 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld
Feb 26 15:43:28 crc kubenswrapper[5033]: [+]process-running ok
Feb 26 15:43:28 crc kubenswrapper[5033]: healthz check failed
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.848736 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.880381 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.880705 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:43:29.380666913 +0000 UTC m=+136.593265563 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p4fzm" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:28 crc kubenswrapper[5033]: I0226 15:43:28.981415 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:28 crc kubenswrapper[5033]: E0226 15:43:28.993052 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:43:29.493028809 +0000 UTC m=+136.705627459 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.000383 5033 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-26T15:43:28.504014242Z","Handler":null,"Name":""}
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.043011 5033 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.043079 5033 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.083105 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.118956 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fkt29" event={"ID":"de9d193e-d622-4bc0-8827-3d55d3ecf448","Type":"ContainerStarted","Data":"0a04d0760b15693fe316987ddd5442cfd2241b3c8b4872c67e2775046e60e057"}
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.119006 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fkt29" event={"ID":"de9d193e-d622-4bc0-8827-3d55d3ecf448","Type":"ContainerStarted","Data":"cb4cdff3a797057f70be5d567cabc7c63359139ff61cf7886957ca3e9383df63"}
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.122768 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.122806 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.220234 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-fkt29" podStartSLOduration=27.220201681 podStartE2EDuration="27.220201681s" podCreationTimestamp="2026-02-26 15:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:29.192153936 +0000 UTC m=+136.404752586" watchObservedRunningTime="2026-02-26 15:43:29.220201681 +0000 UTC m=+136.432800331"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.221718 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4br75"]
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.222866 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4br75"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.235945 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.257641 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4br75"]
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.375353 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p4fzm\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.389002 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.389378 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-catalog-content\") pod \"community-operators-4br75\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " pod="openshift-marketplace/community-operators-4br75"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.389433 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5n9w\" (UniqueName: \"kubernetes.io/projected/5049f268-de58-4e79-9deb-d08c2f33397c-kube-api-access-j5n9w\") pod \"community-operators-4br75\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " pod="openshift-marketplace/community-operators-4br75"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.389474 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-utilities\") pod \"community-operators-4br75\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " pod="openshift-marketplace/community-operators-4br75"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.399691 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.406225 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kjzmq"]
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.407486 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kjzmq"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.418152 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.435777 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kjzmq"]
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.491028 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5n9w\" (UniqueName: \"kubernetes.io/projected/5049f268-de58-4e79-9deb-d08c2f33397c-kube-api-access-j5n9w\") pod \"community-operators-4br75\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " pod="openshift-marketplace/community-operators-4br75"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.491118 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-utilities\") pod \"community-operators-4br75\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " pod="openshift-marketplace/community-operators-4br75"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.491175 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-catalog-content\") pod \"community-operators-4br75\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " pod="openshift-marketplace/community-operators-4br75"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.491710 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-catalog-content\") pod \"community-operators-4br75\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " pod="openshift-marketplace/community-operators-4br75"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.492018 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-utilities\") pod \"community-operators-4br75\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " pod="openshift-marketplace/community-operators-4br75"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.537774 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5n9w\" (UniqueName: \"kubernetes.io/projected/5049f268-de58-4e79-9deb-d08c2f33397c-kube-api-access-j5n9w\") pod \"community-operators-4br75\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " pod="openshift-marketplace/community-operators-4br75"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.582057 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.588144 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7p958"]
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.589364 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7p958"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.591020 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.592006 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-utilities\") pod \"certified-operators-kjzmq\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " pod="openshift-marketplace/certified-operators-kjzmq"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.592061 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p5jc\" (UniqueName: \"kubernetes.io/projected/7a38e065-f09c-4182-ba61-d4db657510de-kube-api-access-6p5jc\") pod \"certified-operators-kjzmq\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " pod="openshift-marketplace/certified-operators-kjzmq"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.592095 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-catalog-content\") pod \"certified-operators-kjzmq\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " pod="openshift-marketplace/certified-operators-kjzmq"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.623851 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7p958"]
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.663134 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.689090 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.689973 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.693090 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-utilities\") pod \"certified-operators-kjzmq\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " pod="openshift-marketplace/certified-operators-kjzmq"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.693145 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p5jc\" (UniqueName: \"kubernetes.io/projected/7a38e065-f09c-4182-ba61-d4db657510de-kube-api-access-6p5jc\") pod \"certified-operators-kjzmq\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " pod="openshift-marketplace/certified-operators-kjzmq"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.693175 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-utilities\") pod \"community-operators-7p958\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " pod="openshift-marketplace/community-operators-7p958"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.693205 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-catalog-content\") pod \"certified-operators-kjzmq\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " pod="openshift-marketplace/certified-operators-kjzmq"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.693223 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7w4f\" (UniqueName: \"kubernetes.io/projected/a7b8c773-dedb-4ed9-818d-cd878376c734-kube-api-access-p7w4f\") pod \"community-operators-7p958\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " pod="openshift-marketplace/community-operators-7p958"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.693250 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-catalog-content\") pod \"community-operators-7p958\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " pod="openshift-marketplace/community-operators-7p958"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.693792 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-utilities\") pod \"certified-operators-kjzmq\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " pod="openshift-marketplace/certified-operators-kjzmq"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.694244 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-catalog-content\") pod \"certified-operators-kjzmq\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " pod="openshift-marketplace/certified-operators-kjzmq"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.697972 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.698436 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.703730 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.714619 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p5jc\" (UniqueName: \"kubernetes.io/projected/7a38e065-f09c-4182-ba61-d4db657510de-kube-api-access-6p5jc\") pod \"certified-operators-kjzmq\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " pod="openshift-marketplace/certified-operators-kjzmq"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.721492 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kjzmq"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.794141 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bbdd1c0a-8942-4ea8-b35a-32248767ebf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.794200 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-utilities\") pod \"community-operators-7p958\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " pod="openshift-marketplace/community-operators-7p958"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.794231 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7w4f\" (UniqueName: \"kubernetes.io/projected/a7b8c773-dedb-4ed9-818d-cd878376c734-kube-api-access-p7w4f\") pod \"community-operators-7p958\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " pod="openshift-marketplace/community-operators-7p958"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.794328 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-catalog-content\") pod \"community-operators-7p958\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " pod="openshift-marketplace/community-operators-7p958"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.794371 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bbdd1c0a-8942-4ea8-b35a-32248767ebf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.794832 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-utilities\") pod \"community-operators-7p958\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " pod="openshift-marketplace/community-operators-7p958"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.795746 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-catalog-content\") pod \"community-operators-7p958\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " pod="openshift-marketplace/community-operators-7p958"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.812989 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cp66d"]
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.814811 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.825146 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cp66d"]
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.833112 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7w4f\" (UniqueName: \"kubernetes.io/projected/a7b8c773-dedb-4ed9-818d-cd878376c734-kube-api-access-p7w4f\") pod \"community-operators-7p958\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " pod="openshift-marketplace/community-operators-7p958"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.841815 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4br75"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.867434 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 15:43:29 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld
Feb 26 15:43:29 crc kubenswrapper[5033]: [+]process-running ok
Feb 26 15:43:29 crc kubenswrapper[5033]: healthz check failed
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.867508 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.898448 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bbdd1c0a-8942-4ea8-b35a-32248767ebf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.898556 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bbdd1c0a-8942-4ea8-b35a-32248767ebf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.898651 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bbdd1c0a-8942-4ea8-b35a-32248767ebf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.916542 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bbdd1c0a-8942-4ea8-b35a-32248767ebf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 15:43:29 crc kubenswrapper[5033]: I0226 15:43:29.942476 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7p958"
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.000773 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-utilities\") pod \"certified-operators-cp66d\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.000849 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-catalog-content\") pod \"certified-operators-cp66d\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.000877 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc8p4\" (UniqueName: \"kubernetes.io/projected/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-kube-api-access-jc8p4\") pod \"certified-operators-cp66d\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.063184 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.106397 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc8p4\" (UniqueName: \"kubernetes.io/projected/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-kube-api-access-jc8p4\") pod \"certified-operators-cp66d\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.106482 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-utilities\") pod \"certified-operators-cp66d\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.107729 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-catalog-content\") pod \"certified-operators-cp66d\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.109029 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-catalog-content\") pod \"certified-operators-cp66d\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.109715 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-utilities\") pod \"certified-operators-cp66d\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.133624 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc8p4\" (UniqueName: \"kubernetes.io/projected/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-kube-api-access-jc8p4\") pod \"certified-operators-cp66d\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.153468 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" podUID="14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38" containerName="route-controller-manager" containerID="cri-o://24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4" gracePeriod=30
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.154135 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" podUID="65c7bb02-dc66-4754-8623-8d1be2932557" containerName="controller-manager" containerID="cri-o://a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693" gracePeriod=30
Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.159645 5033 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-cp66d" Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.196275 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kjzmq"] Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.228307 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4br75"] Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.347907 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p4fzm"] Feb 26 15:43:30 crc kubenswrapper[5033]: W0226 15:43:30.389735 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc382cb_23f7_44b0_89a2_43c41292c21e.slice/crio-2bdc305196091bcfd73e4844ed04f6d1d4f2a1ea6d2328447eb3b35144d4743d WatchSource:0}: Error finding container 2bdc305196091bcfd73e4844ed04f6d1d4f2a1ea6d2328447eb3b35144d4743d: Status 404 returned error can't find the container with id 2bdc305196091bcfd73e4844ed04f6d1d4f2a1ea6d2328447eb3b35144d4743d Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.413492 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.497734 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7p958"] Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.505474 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cp66d"] Feb 26 15:43:30 crc kubenswrapper[5033]: W0226 15:43:30.583949 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7b8c773_dedb_4ed9_818d_cd878376c734.slice/crio-b8fa44e5462868cb2c053391e527a773473170791d1128c1ba65def860ca54ce WatchSource:0}: 
Error finding container b8fa44e5462868cb2c053391e527a773473170791d1128c1ba65def860ca54ce: Status 404 returned error can't find the container with id b8fa44e5462868cb2c053391e527a773473170791d1128c1ba65def860ca54ce Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.817014 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.839221 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j"] Feb 26 15:43:30 crc kubenswrapper[5033]: E0226 15:43:30.839518 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38" containerName="route-controller-manager" Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.839539 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38" containerName="route-controller-manager" Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.839694 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38" containerName="route-controller-manager" Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.840198 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.850437 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:30 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:30 crc kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:30 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.850486 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.852297 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.854130 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j"] Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.925745 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-serving-cert\") pod \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.926063 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt8w6\" (UniqueName: \"kubernetes.io/projected/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-kube-api-access-wt8w6\") pod \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.926108 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-client-ca\") pod \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.926126 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-config\") pod \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\" (UID: \"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38\") " Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.926880 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-client-ca" (OuterVolumeSpecName: "client-ca") pod "14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38" (UID: 
"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.926928 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-config" (OuterVolumeSpecName: "config") pod "14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38" (UID: "14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.932704 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-kube-api-access-wt8w6" (OuterVolumeSpecName: "kube-api-access-wt8w6") pod "14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38" (UID: "14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38"). InnerVolumeSpecName "kube-api-access-wt8w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:43:30 crc kubenswrapper[5033]: I0226 15:43:30.934838 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38" (UID: "14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.027012 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-client-ca\") pod \"65c7bb02-dc66-4754-8623-8d1be2932557\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.027074 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-config\") pod \"65c7bb02-dc66-4754-8623-8d1be2932557\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.027127 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l52nw\" (UniqueName: \"kubernetes.io/projected/65c7bb02-dc66-4754-8623-8d1be2932557-kube-api-access-l52nw\") pod \"65c7bb02-dc66-4754-8623-8d1be2932557\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.027168 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65c7bb02-dc66-4754-8623-8d1be2932557-serving-cert\") pod \"65c7bb02-dc66-4754-8623-8d1be2932557\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.027241 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-proxy-ca-bundles\") pod \"65c7bb02-dc66-4754-8623-8d1be2932557\" (UID: \"65c7bb02-dc66-4754-8623-8d1be2932557\") " Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.027437 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-client-ca\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.028431 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fb4af4-aa1e-45d4-adb9-7264603f510a-serving-cert\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.028502 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-config\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.028534 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m28vj\" (UniqueName: \"kubernetes.io/projected/c2fb4af4-aa1e-45d4-adb9-7264603f510a-kube-api-access-m28vj\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.028667 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.028690 5033 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.028704 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.028717 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt8w6\" (UniqueName: \"kubernetes.io/projected/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38-kube-api-access-wt8w6\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.028703 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "65c7bb02-dc66-4754-8623-8d1be2932557" (UID: "65c7bb02-dc66-4754-8623-8d1be2932557"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.028789 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-client-ca" (OuterVolumeSpecName: "client-ca") pod "65c7bb02-dc66-4754-8623-8d1be2932557" (UID: "65c7bb02-dc66-4754-8623-8d1be2932557"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.028993 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-config" (OuterVolumeSpecName: "config") pod "65c7bb02-dc66-4754-8623-8d1be2932557" (UID: "65c7bb02-dc66-4754-8623-8d1be2932557"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.030865 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c7bb02-dc66-4754-8623-8d1be2932557-kube-api-access-l52nw" (OuterVolumeSpecName: "kube-api-access-l52nw") pod "65c7bb02-dc66-4754-8623-8d1be2932557" (UID: "65c7bb02-dc66-4754-8623-8d1be2932557"). InnerVolumeSpecName "kube-api-access-l52nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.032174 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c7bb02-dc66-4754-8623-8d1be2932557-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "65c7bb02-dc66-4754-8623-8d1be2932557" (UID: "65c7bb02-dc66-4754-8623-8d1be2932557"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.129788 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-config\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.129859 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m28vj\" (UniqueName: \"kubernetes.io/projected/c2fb4af4-aa1e-45d4-adb9-7264603f510a-kube-api-access-m28vj\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.129905 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-client-ca\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.129938 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fb4af4-aa1e-45d4-adb9-7264603f510a-serving-cert\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.129985 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.129997 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.130006 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l52nw\" (UniqueName: \"kubernetes.io/projected/65c7bb02-dc66-4754-8623-8d1be2932557-kube-api-access-l52nw\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.130021 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65c7bb02-dc66-4754-8623-8d1be2932557-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.130032 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/65c7bb02-dc66-4754-8623-8d1be2932557-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.132432 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-config\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.133662 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fb4af4-aa1e-45d4-adb9-7264603f510a-serving-cert\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.137092 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-client-ca\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.147567 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m28vj\" (UniqueName: \"kubernetes.io/projected/c2fb4af4-aa1e-45d4-adb9-7264603f510a-kube-api-access-m28vj\") pod \"route-controller-manager-7dd8cd45b-hlz9j\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") " pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.167018 5033 generic.go:334] "Generic (PLEG): container finished" podID="65c7bb02-dc66-4754-8623-8d1be2932557" containerID="a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693" exitCode=0 Feb 26 15:43:31 crc 
kubenswrapper[5033]: I0226 15:43:31.167069 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.167134 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" event={"ID":"65c7bb02-dc66-4754-8623-8d1be2932557","Type":"ContainerDied","Data":"a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.167178 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bdlcv" event={"ID":"65c7bb02-dc66-4754-8623-8d1be2932557","Type":"ContainerDied","Data":"da51268416a6118244b5cbcea2c67909a863c1ef3a36f82df7440ec24fb560b7"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.167207 5033 scope.go:117] "RemoveContainer" containerID="a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.173173 5033 generic.go:334] "Generic (PLEG): container finished" podID="7a38e065-f09c-4182-ba61-d4db657510de" containerID="ff3a18365a2a548b36c9cc426fcae81f56aaebb13ce4c5a4cfc2f8c1a6be2085" exitCode=0 Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.173548 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjzmq" event={"ID":"7a38e065-f09c-4182-ba61-d4db657510de","Type":"ContainerDied","Data":"ff3a18365a2a548b36c9cc426fcae81f56aaebb13ce4c5a4cfc2f8c1a6be2085"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.173594 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjzmq" event={"ID":"7a38e065-f09c-4182-ba61-d4db657510de","Type":"ContainerStarted","Data":"a4d808434f981e953d255e24eef4ebf50c38dc65b6779a4fa2d03e39f5e535c8"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 
15:43:31.176330 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.179732 5033 generic.go:334] "Generic (PLEG): container finished" podID="a7b8c773-dedb-4ed9-818d-cd878376c734" containerID="d4f390be869abdfbcd429d577bb4e1c6a57f8e247d32f54c032f3b55a51a30fa" exitCode=0 Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.179784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p958" event={"ID":"a7b8c773-dedb-4ed9-818d-cd878376c734","Type":"ContainerDied","Data":"d4f390be869abdfbcd429d577bb4e1c6a57f8e247d32f54c032f3b55a51a30fa"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.179809 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p958" event={"ID":"a7b8c773-dedb-4ed9-818d-cd878376c734","Type":"ContainerStarted","Data":"b8fa44e5462868cb2c053391e527a773473170791d1128c1ba65def860ca54ce"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.181563 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.182308 5033 generic.go:334] "Generic (PLEG): container finished" podID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerID="c0a12bd63788324cf314d0fca8f27b08ee5c8fc61b89015981f50061d8dc9016" exitCode=0 Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.182407 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp66d" event={"ID":"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166","Type":"ContainerDied","Data":"c0a12bd63788324cf314d0fca8f27b08ee5c8fc61b89015981f50061d8dc9016"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.182429 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp66d" event={"ID":"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166","Type":"ContainerStarted","Data":"a98a1c18ea3392dd3f553d16ec80a37ff4c72804394ce8976c69bdbe8ac8d8a6"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.184137 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bbdd1c0a-8942-4ea8-b35a-32248767ebf9","Type":"ContainerStarted","Data":"1e0ea97dd2f4bdaacd84477493e27b91866f3e39b90ec4bcd1147a58571120d6"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.184176 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bbdd1c0a-8942-4ea8-b35a-32248767ebf9","Type":"ContainerStarted","Data":"d6fe453444243b961369a3333197948b7af877e56f62caecdf2d6810a9de29e7"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.191608 5033 scope.go:117] "RemoveContainer" containerID="a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693" Feb 26 15:43:31 crc kubenswrapper[5033]: E0226 15:43:31.195624 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693\": container with ID starting with a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693 not found: ID does not exist" containerID="a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.195668 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693"} err="failed to get container status \"a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693\": rpc error: code = NotFound desc = could not find container \"a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693\": container with ID starting with a80cdb1216d24e26cb08335b3ea7246ae7a99013d36e91b654c1bcf8cb976693 not found: ID does not exist" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.198961 5033 generic.go:334] "Generic (PLEG): container finished" podID="344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74" containerID="321ba68642c60ac3c778428e35e4e851d36924d7675f9116bfc94d29f1047d9e" exitCode=0 Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.199011 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" event={"ID":"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74","Type":"ContainerDied","Data":"321ba68642c60ac3c778428e35e4e851d36924d7675f9116bfc94d29f1047d9e"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.201110 5033 generic.go:334] "Generic (PLEG): container finished" podID="5049f268-de58-4e79-9deb-d08c2f33397c" containerID="c0c2b9132dc3f945fe67add326a93134658a66191bb4a2f25fd720d50bd9ce68" exitCode=0 Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.201143 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4br75" 
event={"ID":"5049f268-de58-4e79-9deb-d08c2f33397c","Type":"ContainerDied","Data":"c0c2b9132dc3f945fe67add326a93134658a66191bb4a2f25fd720d50bd9ce68"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.201157 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4br75" event={"ID":"5049f268-de58-4e79-9deb-d08c2f33397c","Type":"ContainerStarted","Data":"267ba42384a51bc68db366f899a6a99a71d428d8a12931122b26125f017049ef"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.202767 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" event={"ID":"dcc382cb-23f7-44b0-89a2-43c41292c21e","Type":"ContainerStarted","Data":"8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.202788 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" event={"ID":"dcc382cb-23f7-44b0-89a2-43c41292c21e","Type":"ContainerStarted","Data":"2bdc305196091bcfd73e4844ed04f6d1d4f2a1ea6d2328447eb3b35144d4743d"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.203145 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.205597 5033 generic.go:334] "Generic (PLEG): container finished" podID="14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38" containerID="24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4" exitCode=0 Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.205625 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" event={"ID":"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38","Type":"ContainerDied","Data":"24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 
15:43:31.205643 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" event={"ID":"14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38","Type":"ContainerDied","Data":"a2e2f87ef37d1e4eced88681c061c4355c00ab710ee49a6344e2b05c2529b096"} Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.205660 5033 scope.go:117] "RemoveContainer" containerID="24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.205732 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.221708 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.221682038 podStartE2EDuration="2.221682038s" podCreationTimestamp="2026-02-26 15:43:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:31.210870441 +0000 UTC m=+138.423469121" watchObservedRunningTime="2026-02-26 15:43:31.221682038 +0000 UTC m=+138.434280698" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.235805 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bdlcv"] Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.247809 5033 scope.go:117] "RemoveContainer" containerID="24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.247835 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bdlcv"] Feb 26 15:43:31 crc kubenswrapper[5033]: E0226 15:43:31.248395 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4\": container with ID starting with 24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4 not found: ID does not exist" containerID="24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.248438 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4"} err="failed to get container status \"24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4\": rpc error: code = NotFound desc = could not find container \"24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4\": container with ID starting with 24bf8292f280223355b932dd91c3419a1a01b4c1ed9fa85d777ae27fc07001b4 not found: ID does not exist" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.290993 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" podStartSLOduration=76.290960213 podStartE2EDuration="1m16.290960213s" podCreationTimestamp="2026-02-26 15:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:31.287313889 +0000 UTC m=+138.499912539" watchObservedRunningTime="2026-02-26 15:43:31.290960213 +0000 UTC m=+138.503558863" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.344870 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp"] Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.349068 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nggcp"] Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.400853 5033 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wctnw"] Feb 26 15:43:31 crc kubenswrapper[5033]: E0226 15:43:31.401107 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c7bb02-dc66-4754-8623-8d1be2932557" containerName="controller-manager" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.401119 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c7bb02-dc66-4754-8623-8d1be2932557" containerName="controller-manager" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.401229 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c7bb02-dc66-4754-8623-8d1be2932557" containerName="controller-manager" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.402014 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.405020 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.407297 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wctnw"] Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.508464 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j"] Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.544810 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-catalog-content\") pod \"redhat-marketplace-wctnw\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.544874 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-utilities\") pod \"redhat-marketplace-wctnw\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.545211 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6grln\" (UniqueName: \"kubernetes.io/projected/596c5e6d-16b2-47b7-a052-a164aa944a7f-kube-api-access-6grln\") pod \"redhat-marketplace-wctnw\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.636179 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38" path="/var/lib/kubelet/pods/14f7f8c4-c5b9-4db0-8f77-e12a6a09ea38/volumes" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.637183 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65c7bb02-dc66-4754-8623-8d1be2932557" path="/var/lib/kubelet/pods/65c7bb02-dc66-4754-8623-8d1be2932557/volumes" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.647326 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-utilities\") pod \"redhat-marketplace-wctnw\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.647420 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6grln\" (UniqueName: \"kubernetes.io/projected/596c5e6d-16b2-47b7-a052-a164aa944a7f-kube-api-access-6grln\") pod \"redhat-marketplace-wctnw\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:43:31 crc 
kubenswrapper[5033]: I0226 15:43:31.647479 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-catalog-content\") pod \"redhat-marketplace-wctnw\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.648541 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-catalog-content\") pod \"redhat-marketplace-wctnw\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.648569 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-utilities\") pod \"redhat-marketplace-wctnw\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.669305 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6grln\" (UniqueName: \"kubernetes.io/projected/596c5e6d-16b2-47b7-a052-a164aa944a7f-kube-api-access-6grln\") pod \"redhat-marketplace-wctnw\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.732656 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.784136 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5hpx7"] Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.785206 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.797118 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hpx7"] Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.853349 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:31 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:31 crc kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:31 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.853417 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.955388 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b42nk\" (UniqueName: \"kubernetes.io/projected/922ca3ba-ef67-4e47-b111-e40f8733202b-kube-api-access-b42nk\") pod \"redhat-marketplace-5hpx7\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.955441 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-catalog-content\") pod \"redhat-marketplace-5hpx7\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:43:31 crc kubenswrapper[5033]: I0226 15:43:31.955464 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-utilities\") pod \"redhat-marketplace-5hpx7\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:43:31 crc kubenswrapper[5033]: E0226 15:43:31.982911 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 26 15:43:32 crc kubenswrapper[5033]: E0226 15:43:32.002957 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 26 15:43:32 crc kubenswrapper[5033]: E0226 15:43:32.010621 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 26 15:43:32 crc kubenswrapper[5033]: E0226 15:43:32.010709 5033 prober.go:104] "Probe errored" err="rpc error: code 
= Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" podUID="2371a2b2-ba78-48f3-839d-cd97b08f3243" containerName="kube-multus-additional-cni-plugins" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.040191 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wctnw"] Feb 26 15:43:32 crc kubenswrapper[5033]: W0226 15:43:32.056538 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod596c5e6d_16b2_47b7_a052_a164aa944a7f.slice/crio-cbb767710e944035572e13b4752206f8d56ee6617d089809d6dbc6958588ed48 WatchSource:0}: Error finding container cbb767710e944035572e13b4752206f8d56ee6617d089809d6dbc6958588ed48: Status 404 returned error can't find the container with id cbb767710e944035572e13b4752206f8d56ee6617d089809d6dbc6958588ed48 Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.056710 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-catalog-content\") pod \"redhat-marketplace-5hpx7\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.056779 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-utilities\") pod \"redhat-marketplace-5hpx7\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.057069 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b42nk\" (UniqueName: 
\"kubernetes.io/projected/922ca3ba-ef67-4e47-b111-e40f8733202b-kube-api-access-b42nk\") pod \"redhat-marketplace-5hpx7\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.057705 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-utilities\") pod \"redhat-marketplace-5hpx7\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.057810 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-catalog-content\") pod \"redhat-marketplace-5hpx7\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.074643 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b42nk\" (UniqueName: \"kubernetes.io/projected/922ca3ba-ef67-4e47-b111-e40f8733202b-kube-api-access-b42nk\") pod \"redhat-marketplace-5hpx7\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.128680 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.218384 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bxjw5" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.222479 5033 generic.go:334] "Generic (PLEG): container finished" podID="bbdd1c0a-8942-4ea8-b35a-32248767ebf9" containerID="1e0ea97dd2f4bdaacd84477493e27b91866f3e39b90ec4bcd1147a58571120d6" exitCode=0 Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.222547 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bbdd1c0a-8942-4ea8-b35a-32248767ebf9","Type":"ContainerDied","Data":"1e0ea97dd2f4bdaacd84477493e27b91866f3e39b90ec4bcd1147a58571120d6"} Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.320550 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" event={"ID":"c2fb4af4-aa1e-45d4-adb9-7264603f510a","Type":"ContainerStarted","Data":"5aef2fa334514789e5d82ab26a6f2a14411af9053d2d79df36b5efb76955997b"} Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.320846 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" event={"ID":"c2fb4af4-aa1e-45d4-adb9-7264603f510a","Type":"ContainerStarted","Data":"46739baabfec752b09d3e077f14b410073ba0c434916cf005f46f9a413df4d55"} Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.321914 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.353766 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" podStartSLOduration=4.3537484410000005 podStartE2EDuration="4.353748441s" podCreationTimestamp="2026-02-26 15:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:32.352404733 +0000 UTC m=+139.565003383" watchObservedRunningTime="2026-02-26 15:43:32.353748441 +0000 UTC m=+139.566347091" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.355113 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.359165 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wctnw" event={"ID":"596c5e6d-16b2-47b7-a052-a164aa944a7f","Type":"ContainerStarted","Data":"cbb767710e944035572e13b4752206f8d56ee6617d089809d6dbc6958588ed48"} Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.399526 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-npgt5"] Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.400437 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.404984 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.442729 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npgt5"] Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.502531 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hpx7"] Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.564140 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9fh5\" (UniqueName: \"kubernetes.io/projected/65f83e7b-3a72-450b-a54f-899aba4acd8a-kube-api-access-x9fh5\") pod \"redhat-operators-npgt5\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.564205 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-catalog-content\") pod \"redhat-operators-npgt5\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.564232 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-utilities\") pod \"redhat-operators-npgt5\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.641973 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.642916 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.647757 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.649241 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.650214 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.665128 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-catalog-content\") pod \"redhat-operators-npgt5\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.665168 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-utilities\") pod \"redhat-operators-npgt5\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.665242 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9fh5\" (UniqueName: \"kubernetes.io/projected/65f83e7b-3a72-450b-a54f-899aba4acd8a-kube-api-access-x9fh5\") pod \"redhat-operators-npgt5\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:43:32 crc 
kubenswrapper[5033]: I0226 15:43:32.665756 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-catalog-content\") pod \"redhat-operators-npgt5\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.665973 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-utilities\") pod \"redhat-operators-npgt5\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.696171 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9fh5\" (UniqueName: \"kubernetes.io/projected/65f83e7b-3a72-450b-a54f-899aba4acd8a-kube-api-access-x9fh5\") pod \"redhat-operators-npgt5\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.731496 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.737955 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.751216 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.751267 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.752782 5033 patch_prober.go:28] interesting pod/console-f9d7485db-nq49v container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.752840 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nq49v" podUID="d6d762dc-b9e3-44fd-a835-78df146dc35a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.764908 5033 patch_prober.go:28] interesting pod/downloads-7954f5f757-8l772 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.764966 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8l772" podUID="c852046c-87e9-42a6-9082-2d7390d6f71e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.765181 5033 patch_prober.go:28] interesting pod/downloads-7954f5f757-8l772 container/download-server 
namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.765235 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8l772" podUID="c852046c-87e9-42a6-9082-2d7390d6f71e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.30:8080/\": dial tcp 10.217.0.30:8080: connect: connection refused" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.767167 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ada183b1-a576-4a90-a94f-0eb8bf7990cb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.767213 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ada183b1-a576-4a90-a94f-0eb8bf7990cb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.783612 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rqwqh"] Feb 26 15:43:32 crc kubenswrapper[5033]: E0226 15:43:32.784495 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74" containerName="collect-profiles" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.784574 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74" containerName="collect-profiles" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.785572 5033 
memory_manager.go:354] "RemoveStaleState removing state" podUID="344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74" containerName="collect-profiles" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.786623 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.804736 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rqwqh"] Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.846146 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.849498 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:32 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:32 crc kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:32 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.849562 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.868850 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-secret-volume\") pod \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.868961 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-config-volume\") pod \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.869009 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4vbq\" (UniqueName: \"kubernetes.io/projected/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-kube-api-access-v4vbq\") pod \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\" (UID: \"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74\") " Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.869263 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ada183b1-a576-4a90-a94f-0eb8bf7990cb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.869342 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ada183b1-a576-4a90-a94f-0eb8bf7990cb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.870469 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ada183b1-a576-4a90-a94f-0eb8bf7990cb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.871610 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74" (UID: "344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.874549 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74" (UID: "344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.876562 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-kube-api-access-v4vbq" (OuterVolumeSpecName: "kube-api-access-v4vbq") pod "344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74" (UID: "344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74"). InnerVolumeSpecName "kube-api-access-v4vbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.885799 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ada183b1-a576-4a90-a94f-0eb8bf7990cb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.970374 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbk2p\" (UniqueName: \"kubernetes.io/projected/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-kube-api-access-jbk2p\") pod \"redhat-operators-rqwqh\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.970544 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-catalog-content\") pod \"redhat-operators-rqwqh\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.970649 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-utilities\") pod \"redhat-operators-rqwqh\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.970735 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4vbq\" (UniqueName: \"kubernetes.io/projected/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-kube-api-access-v4vbq\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 
15:43:32.970752 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:32 crc kubenswrapper[5033]: I0226 15:43:32.970763 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.019688 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.072432 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbk2p\" (UniqueName: \"kubernetes.io/projected/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-kube-api-access-jbk2p\") pod \"redhat-operators-rqwqh\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.072809 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-catalog-content\") pod \"redhat-operators-rqwqh\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.072867 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-utilities\") pod \"redhat-operators-rqwqh\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.072643 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.073573 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-catalog-content\") pod \"redhat-operators-rqwqh\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.073757 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-utilities\") pod \"redhat-operators-rqwqh\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.093937 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbk2p\" (UniqueName: \"kubernetes.io/projected/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-kube-api-access-jbk2p\") pod \"redhat-operators-rqwqh\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.119287 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.148139 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.154049 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kgzj7" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.269746 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npgt5"] Feb 26 15:43:33 crc kubenswrapper[5033]: W0226 15:43:33.304692 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65f83e7b_3a72_450b_a54f_899aba4acd8a.slice/crio-432f497f1bed32f4d0107d79edd8ef44146810ef5c18734e11319965a2a2f651 WatchSource:0}: Error finding container 432f497f1bed32f4d0107d79edd8ef44146810ef5c18734e11319965a2a2f651: Status 404 returned error can't find the container with id 432f497f1bed32f4d0107d79edd8ef44146810ef5c18734e11319965a2a2f651 Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.325004 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 15:43:33 crc kubenswrapper[5033]: W0226 15:43:33.356901 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podada183b1_a576_4a90_a94f_0eb8bf7990cb.slice/crio-2e57b9e78ee89789597cc238deba5357d5b6604f6db581c640fa7a943198e39a WatchSource:0}: Error finding container 2e57b9e78ee89789597cc238deba5357d5b6604f6db581c640fa7a943198e39a: Status 404 returned error can't find the container with id 2e57b9e78ee89789597cc238deba5357d5b6604f6db581c640fa7a943198e39a Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.382622 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="596c5e6d-16b2-47b7-a052-a164aa944a7f" containerID="4b5d0b4caa3a96117deadec9b600360ca80a8b5d6be802c3feb1259491733efa" exitCode=0 Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.382695 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wctnw" event={"ID":"596c5e6d-16b2-47b7-a052-a164aa944a7f","Type":"ContainerDied","Data":"4b5d0b4caa3a96117deadec9b600360ca80a8b5d6be802c3feb1259491733efa"} Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.384237 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" event={"ID":"344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74","Type":"ContainerDied","Data":"7bfd31ae5d49a3dab79e982baaa5055b867806d6588c1d185055d7c56c2030cb"} Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.384315 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bfd31ae5d49a3dab79e982baaa5055b867806d6588c1d185055d7c56c2030cb" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.384375 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.403305 5033 generic.go:334] "Generic (PLEG): container finished" podID="922ca3ba-ef67-4e47-b111-e40f8733202b" containerID="c6d3e399170cd23aac45baed035c2b59829baadec3740b7a5a55036d834cd0a0" exitCode=0 Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.403390 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hpx7" event={"ID":"922ca3ba-ef67-4e47-b111-e40f8733202b","Type":"ContainerDied","Data":"c6d3e399170cd23aac45baed035c2b59829baadec3740b7a5a55036d834cd0a0"} Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.403418 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hpx7" event={"ID":"922ca3ba-ef67-4e47-b111-e40f8733202b","Type":"ContainerStarted","Data":"ef38582c410a86fbf1913a14330c72ccb701f142f45c3398a97ca7c56ae7d94e"} Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.409406 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgt5" event={"ID":"65f83e7b-3a72-450b-a54f-899aba4acd8a","Type":"ContainerStarted","Data":"432f497f1bed32f4d0107d79edd8ef44146810ef5c18734e11319965a2a2f651"} Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.412286 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ada183b1-a576-4a90-a94f-0eb8bf7990cb","Type":"ContainerStarted","Data":"2e57b9e78ee89789597cc238deba5357d5b6604f6db581c640fa7a943198e39a"} Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.533058 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rqwqh"] Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.768462 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.819579 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bd8bf6879-s44sm"] Feb 26 15:43:33 crc kubenswrapper[5033]: E0226 15:43:33.819788 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbdd1c0a-8942-4ea8-b35a-32248767ebf9" containerName="pruner" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.819803 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbdd1c0a-8942-4ea8-b35a-32248767ebf9" containerName="pruner" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.819900 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbdd1c0a-8942-4ea8-b35a-32248767ebf9" containerName="pruner" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.820279 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.824967 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.825512 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.825529 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.826648 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.826989 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 15:43:33 
crc kubenswrapper[5033]: I0226 15:43:33.827046 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.830400 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.836554 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bd8bf6879-s44sm"] Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.850216 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:33 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:33 crc kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:33 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.850310 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.889051 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kube-api-access\") pod \"bbdd1c0a-8942-4ea8-b35a-32248767ebf9\" (UID: \"bbdd1c0a-8942-4ea8-b35a-32248767ebf9\") " Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.889239 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kubelet-dir\") pod 
\"bbdd1c0a-8942-4ea8-b35a-32248767ebf9\" (UID: \"bbdd1c0a-8942-4ea8-b35a-32248767ebf9\") " Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.889654 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-proxy-ca-bundles\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.889718 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99wwt\" (UniqueName: \"kubernetes.io/projected/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-kube-api-access-99wwt\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.889836 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-serving-cert\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.889899 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-config\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.889929 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-client-ca\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.890098 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bbdd1c0a-8942-4ea8-b35a-32248767ebf9" (UID: "bbdd1c0a-8942-4ea8-b35a-32248767ebf9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.901961 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bbdd1c0a-8942-4ea8-b35a-32248767ebf9" (UID: "bbdd1c0a-8942-4ea8-b35a-32248767ebf9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.991397 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-proxy-ca-bundles\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.991434 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99wwt\" (UniqueName: \"kubernetes.io/projected/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-kube-api-access-99wwt\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.991479 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-serving-cert\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.991511 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-config\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.991532 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-client-ca\") pod 
\"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.991589 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.991757 5033 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bbdd1c0a-8942-4ea8-b35a-32248767ebf9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.992763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-proxy-ca-bundles\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.992966 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-client-ca\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:33 crc kubenswrapper[5033]: I0226 15:43:33.993356 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-config\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.007399 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-serving-cert\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.009245 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99wwt\" (UniqueName: \"kubernetes.io/projected/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-kube-api-access-99wwt\") pod \"controller-manager-5bd8bf6879-s44sm\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.211749 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.425495 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ada183b1-a576-4a90-a94f-0eb8bf7990cb","Type":"ContainerStarted","Data":"eca71c5a8bf127a6e5629d538fab5b2a31da1f710528180a6ae7c2acb2eec285"} Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.435446 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.435444 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bbdd1c0a-8942-4ea8-b35a-32248767ebf9","Type":"ContainerDied","Data":"d6fe453444243b961369a3333197948b7af877e56f62caecdf2d6810a9de29e7"} Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.435552 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6fe453444243b961369a3333197948b7af877e56f62caecdf2d6810a9de29e7" Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.441411 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.441393832 podStartE2EDuration="2.441393832s" podCreationTimestamp="2026-02-26 15:43:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:34.438398027 +0000 UTC m=+141.650996677" watchObservedRunningTime="2026-02-26 15:43:34.441393832 +0000 UTC m=+141.653992482" Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.442176 5033 generic.go:334] "Generic (PLEG): container finished" podID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" containerID="0b7c0031c47e8e8f211a23fac776681038962d0f7297bec415d73db918442bf9" exitCode=0 Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.442546 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqwqh" event={"ID":"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08","Type":"ContainerDied","Data":"0b7c0031c47e8e8f211a23fac776681038962d0f7297bec415d73db918442bf9"} Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.442605 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqwqh" 
event={"ID":"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08","Type":"ContainerStarted","Data":"dc36ff8f8f50443547b4ee6b70b5a6f2058e72f564301b5bba43703f0e6693be"} Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.465597 5033 generic.go:334] "Generic (PLEG): container finished" podID="65f83e7b-3a72-450b-a54f-899aba4acd8a" containerID="9e2324306761b1d114a6b1eac3c1903531e40aabdb7f08e3001b2abce0b28140" exitCode=0 Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.466998 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgt5" event={"ID":"65f83e7b-3a72-450b-a54f-899aba4acd8a","Type":"ContainerDied","Data":"9e2324306761b1d114a6b1eac3c1903531e40aabdb7f08e3001b2abce0b28140"} Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.738250 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bd8bf6879-s44sm"] Feb 26 15:43:34 crc kubenswrapper[5033]: W0226 15:43:34.774974 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4244b6d_2d76_48dd_bbc6_ee1b82fb8533.slice/crio-165cb88f7e7554cfbe46d1b813aac2577f38e40014bfa93cdb7feb0406f874a2 WatchSource:0}: Error finding container 165cb88f7e7554cfbe46d1b813aac2577f38e40014bfa93cdb7feb0406f874a2: Status 404 returned error can't find the container with id 165cb88f7e7554cfbe46d1b813aac2577f38e40014bfa93cdb7feb0406f874a2 Feb 26 15:43:34 crc kubenswrapper[5033]: I0226 15:43:34.850004 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:34 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:34 crc kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:34 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:34 crc 
kubenswrapper[5033]: I0226 15:43:34.850369 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:35 crc kubenswrapper[5033]: I0226 15:43:35.501202 5033 generic.go:334] "Generic (PLEG): container finished" podID="ada183b1-a576-4a90-a94f-0eb8bf7990cb" containerID="eca71c5a8bf127a6e5629d538fab5b2a31da1f710528180a6ae7c2acb2eec285" exitCode=0 Feb 26 15:43:35 crc kubenswrapper[5033]: I0226 15:43:35.501308 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ada183b1-a576-4a90-a94f-0eb8bf7990cb","Type":"ContainerDied","Data":"eca71c5a8bf127a6e5629d538fab5b2a31da1f710528180a6ae7c2acb2eec285"} Feb 26 15:43:35 crc kubenswrapper[5033]: I0226 15:43:35.505779 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" event={"ID":"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533","Type":"ContainerStarted","Data":"04cc2459740a905cbf062150e07508032b277f3daf5279e175afbcfe6b27ae08"} Feb 26 15:43:35 crc kubenswrapper[5033]: I0226 15:43:35.505800 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" event={"ID":"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533","Type":"ContainerStarted","Data":"165cb88f7e7554cfbe46d1b813aac2577f38e40014bfa93cdb7feb0406f874a2"} Feb 26 15:43:35 crc kubenswrapper[5033]: I0226 15:43:35.506728 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:35 crc kubenswrapper[5033]: I0226 15:43:35.511746 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:35 crc 
kubenswrapper[5033]: I0226 15:43:35.547835 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" podStartSLOduration=7.547813647 podStartE2EDuration="7.547813647s" podCreationTimestamp="2026-02-26 15:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:35.542905667 +0000 UTC m=+142.755504337" watchObservedRunningTime="2026-02-26 15:43:35.547813647 +0000 UTC m=+142.760412287" Feb 26 15:43:35 crc kubenswrapper[5033]: I0226 15:43:35.848664 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:35 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:35 crc kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:35 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:35 crc kubenswrapper[5033]: I0226 15:43:35.848768 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:36 crc kubenswrapper[5033]: I0226 15:43:36.439421 5033 ???:1] "http: TLS handshake error from 192.168.126.11:46656: no serving certificate available for the kubelet" Feb 26 15:43:36 crc kubenswrapper[5033]: I0226 15:43:36.859119 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:36 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:36 crc 
kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:36 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:36 crc kubenswrapper[5033]: I0226 15:43:36.859176 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:36 crc kubenswrapper[5033]: I0226 15:43:36.998465 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.173743 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kube-api-access\") pod \"ada183b1-a576-4a90-a94f-0eb8bf7990cb\" (UID: \"ada183b1-a576-4a90-a94f-0eb8bf7990cb\") " Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.173846 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kubelet-dir\") pod \"ada183b1-a576-4a90-a94f-0eb8bf7990cb\" (UID: \"ada183b1-a576-4a90-a94f-0eb8bf7990cb\") " Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.174107 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ada183b1-a576-4a90-a94f-0eb8bf7990cb" (UID: "ada183b1-a576-4a90-a94f-0eb8bf7990cb"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.191746 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ada183b1-a576-4a90-a94f-0eb8bf7990cb" (UID: "ada183b1-a576-4a90-a94f-0eb8bf7990cb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.275792 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.275825 5033 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ada183b1-a576-4a90-a94f-0eb8bf7990cb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.649398 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.683463 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.683508 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ada183b1-a576-4a90-a94f-0eb8bf7990cb","Type":"ContainerDied","Data":"2e57b9e78ee89789597cc238deba5357d5b6604f6db581c640fa7a943198e39a"} Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.683535 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e57b9e78ee89789597cc238deba5357d5b6604f6db581c640fa7a943198e39a" Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.847054 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:37 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:37 crc kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:37 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.847113 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.963758 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fb82g" Feb 26 15:43:37 crc kubenswrapper[5033]: I0226 15:43:37.998074 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.99805763 podStartE2EDuration="998.05763ms" podCreationTimestamp="2026-02-26 15:43:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:37.995170278 +0000 UTC m=+145.207768928" watchObservedRunningTime="2026-02-26 15:43:37.99805763 +0000 UTC m=+145.210656280" Feb 26 15:43:38 crc kubenswrapper[5033]: I0226 15:43:38.847147 5033 patch_prober.go:28] interesting pod/router-default-5444994796-l4bg7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:43:38 crc kubenswrapper[5033]: [-]has-synced failed: reason withheld Feb 26 15:43:38 crc kubenswrapper[5033]: [+]process-running ok Feb 26 15:43:38 crc kubenswrapper[5033]: healthz check failed Feb 26 15:43:38 crc kubenswrapper[5033]: I0226 15:43:38.847211 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-l4bg7" podUID="6c2a8653-71cf-4142-9fe7-9294ac24e231" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:43:38 crc kubenswrapper[5033]: I0226 15:43:38.923008 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:43:39 crc kubenswrapper[5033]: I0226 15:43:39.848720 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:39 crc kubenswrapper[5033]: I0226 15:43:39.855299 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-l4bg7" Feb 26 15:43:41 crc kubenswrapper[5033]: E0226 15:43:41.981683 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 26 15:43:42 crc kubenswrapper[5033]: E0226 15:43:42.005456 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 26 15:43:42 crc kubenswrapper[5033]: E0226 15:43:42.016719 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 26 15:43:42 crc kubenswrapper[5033]: E0226 15:43:42.016802 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" podUID="2371a2b2-ba78-48f3-839d-cd97b08f3243" containerName="kube-multus-additional-cni-plugins" Feb 26 15:43:42 crc kubenswrapper[5033]: I0226 15:43:42.771469 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8l772" Feb 26 15:43:42 crc kubenswrapper[5033]: I0226 15:43:42.811456 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:42 crc kubenswrapper[5033]: I0226 15:43:42.816683 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:43:46 
crc kubenswrapper[5033]: I0226 15:43:46.036733 5033 ???:1] "http: TLS handshake error from 192.168.126.11:37850: no serving certificate available for the kubelet" Feb 26 15:43:47 crc kubenswrapper[5033]: I0226 15:43:47.772546 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bd8bf6879-s44sm"] Feb 26 15:43:47 crc kubenswrapper[5033]: I0226 15:43:47.773198 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" podUID="a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" containerName="controller-manager" containerID="cri-o://04cc2459740a905cbf062150e07508032b277f3daf5279e175afbcfe6b27ae08" gracePeriod=30 Feb 26 15:43:47 crc kubenswrapper[5033]: I0226 15:43:47.815217 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j"] Feb 26 15:43:47 crc kubenswrapper[5033]: I0226 15:43:47.815659 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" podUID="c2fb4af4-aa1e-45d4-adb9-7264603f510a" containerName="route-controller-manager" containerID="cri-o://5aef2fa334514789e5d82ab26a6f2a14411af9053d2d79df36b5efb76955997b" gracePeriod=30 Feb 26 15:43:48 crc kubenswrapper[5033]: I0226 15:43:48.748732 5033 generic.go:334] "Generic (PLEG): container finished" podID="c2fb4af4-aa1e-45d4-adb9-7264603f510a" containerID="5aef2fa334514789e5d82ab26a6f2a14411af9053d2d79df36b5efb76955997b" exitCode=0 Feb 26 15:43:48 crc kubenswrapper[5033]: I0226 15:43:48.748846 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" event={"ID":"c2fb4af4-aa1e-45d4-adb9-7264603f510a","Type":"ContainerDied","Data":"5aef2fa334514789e5d82ab26a6f2a14411af9053d2d79df36b5efb76955997b"} Feb 26 15:43:48 crc 
kubenswrapper[5033]: I0226 15:43:48.751198 5033 generic.go:334] "Generic (PLEG): container finished" podID="a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" containerID="04cc2459740a905cbf062150e07508032b277f3daf5279e175afbcfe6b27ae08" exitCode=0 Feb 26 15:43:48 crc kubenswrapper[5033]: I0226 15:43:48.751239 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" event={"ID":"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533","Type":"ContainerDied","Data":"04cc2459740a905cbf062150e07508032b277f3daf5279e175afbcfe6b27ae08"} Feb 26 15:43:49 crc kubenswrapper[5033]: I0226 15:43:49.609595 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:43:49 crc kubenswrapper[5033]: I0226 15:43:49.654838 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 26 15:43:51 crc kubenswrapper[5033]: E0226 15:43:51.982163 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 26 15:43:51 crc kubenswrapper[5033]: E0226 15:43:51.986464 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 26 15:43:51 crc kubenswrapper[5033]: E0226 15:43:51.988400 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 26 15:43:51 crc kubenswrapper[5033]: E0226 15:43:51.988477 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" podUID="2371a2b2-ba78-48f3-839d-cd97b08f3243" containerName="kube-multus-additional-cni-plugins" Feb 26 15:43:52 crc kubenswrapper[5033]: I0226 15:43:52.182563 5033 patch_prober.go:28] interesting pod/route-controller-manager-7dd8cd45b-hlz9j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 15:43:52 crc kubenswrapper[5033]: I0226 15:43:52.182645 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" podUID="c2fb4af4-aa1e-45d4-adb9-7264603f510a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 15:43:53 crc kubenswrapper[5033]: I0226 15:43:53.647692 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.647667415 podStartE2EDuration="4.647667415s" podCreationTimestamp="2026-02-26 15:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:43:53.644837605 +0000 UTC m=+160.857436255" watchObservedRunningTime="2026-02-26 15:43:53.647667415 +0000 UTC 
m=+160.860266065" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.284731 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.320337 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl"] Feb 26 15:43:54 crc kubenswrapper[5033]: E0226 15:43:54.320930 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" containerName="controller-manager" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.320962 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" containerName="controller-manager" Feb 26 15:43:54 crc kubenswrapper[5033]: E0226 15:43:54.320996 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada183b1-a576-4a90-a94f-0eb8bf7990cb" containerName="pruner" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.321009 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada183b1-a576-4a90-a94f-0eb8bf7990cb" containerName="pruner" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.321150 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" containerName="controller-manager" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.321176 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada183b1-a576-4a90-a94f-0eb8bf7990cb" containerName="pruner" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.321654 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.346248 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl"] Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.376318 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-proxy-ca-bundles\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.376434 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbjvd\" (UniqueName: \"kubernetes.io/projected/9a6ded02-b66e-4dd1-9da6-688d62267de1-kube-api-access-dbjvd\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.376501 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-config\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.377019 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6ded02-b66e-4dd1-9da6-688d62267de1-serving-cert\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " 
pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.377148 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-client-ca\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.478132 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-serving-cert\") pod \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.478430 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-proxy-ca-bundles\") pod \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.478545 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-config\") pod \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.478748 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99wwt\" (UniqueName: \"kubernetes.io/projected/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-kube-api-access-99wwt\") pod \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.478861 5033 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-client-ca\") pod \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\" (UID: \"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533\") " Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.479073 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6ded02-b66e-4dd1-9da6-688d62267de1-serving-cert\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.479194 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-client-ca\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.479335 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-proxy-ca-bundles\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.479460 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbjvd\" (UniqueName: \"kubernetes.io/projected/9a6ded02-b66e-4dd1-9da6-688d62267de1-kube-api-access-dbjvd\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: 
I0226 15:43:54.479573 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-config\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.479745 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" (UID: "a4244b6d-2d76-48dd-bbc6-ee1b82fb8533"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.479824 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-client-ca" (OuterVolumeSpecName: "client-ca") pod "a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" (UID: "a4244b6d-2d76-48dd-bbc6-ee1b82fb8533"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.479858 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-config" (OuterVolumeSpecName: "config") pod "a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" (UID: "a4244b6d-2d76-48dd-bbc6-ee1b82fb8533"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.480610 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-client-ca\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.481114 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-proxy-ca-bundles\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.481520 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-config\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.486604 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" (UID: "a4244b6d-2d76-48dd-bbc6-ee1b82fb8533"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.488018 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6ded02-b66e-4dd1-9da6-688d62267de1-serving-cert\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.490522 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-kube-api-access-99wwt" (OuterVolumeSpecName: "kube-api-access-99wwt") pod "a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" (UID: "a4244b6d-2d76-48dd-bbc6-ee1b82fb8533"). InnerVolumeSpecName "kube-api-access-99wwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.497239 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbjvd\" (UniqueName: \"kubernetes.io/projected/9a6ded02-b66e-4dd1-9da6-688d62267de1-kube-api-access-dbjvd\") pod \"controller-manager-cd6ffb8db-5pzwl\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.580518 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.580553 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.580568 5033 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.580578 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99wwt\" (UniqueName: \"kubernetes.io/projected/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-kube-api-access-99wwt\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.580588 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.638422 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.791106 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" event={"ID":"a4244b6d-2d76-48dd-bbc6-ee1b82fb8533","Type":"ContainerDied","Data":"165cb88f7e7554cfbe46d1b813aac2577f38e40014bfa93cdb7feb0406f874a2"} Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.791181 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.791191 5033 scope.go:117] "RemoveContainer" containerID="04cc2459740a905cbf062150e07508032b277f3daf5279e175afbcfe6b27ae08" Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.827248 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bd8bf6879-s44sm"] Feb 26 15:43:54 crc kubenswrapper[5033]: I0226 15:43:54.831648 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5bd8bf6879-s44sm"] Feb 26 15:43:55 crc kubenswrapper[5033]: I0226 15:43:55.223651 5033 patch_prober.go:28] interesting pod/controller-manager-5bd8bf6879-s44sm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 15:43:55 crc kubenswrapper[5033]: I0226 15:43:55.223749 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bd8bf6879-s44sm" podUID="a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 15:43:55 crc kubenswrapper[5033]: I0226 15:43:55.645737 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4244b6d-2d76-48dd-bbc6-ee1b82fb8533" path="/var/lib/kubelet/pods/a4244b6d-2d76-48dd-bbc6-ee1b82fb8533/volumes" Feb 26 15:44:00 crc kubenswrapper[5033]: I0226 15:44:00.146158 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535344-glfw9"] Feb 26 15:44:00 crc kubenswrapper[5033]: I0226 15:44:00.147792 
5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535344-glfw9"
Feb 26 15:44:00 crc kubenswrapper[5033]: I0226 15:44:00.150138 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 15:44:00 crc kubenswrapper[5033]: I0226 15:44:00.150365 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 15:44:00 crc kubenswrapper[5033]: I0226 15:44:00.150569 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 15:44:00 crc kubenswrapper[5033]: I0226 15:44:00.160505 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535344-glfw9"]
Feb 26 15:44:00 crc kubenswrapper[5033]: I0226 15:44:00.177342 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8dsj\" (UniqueName: \"kubernetes.io/projected/ea999e72-9356-4693-a1f4-ced6617d4b65-kube-api-access-v8dsj\") pod \"auto-csr-approver-29535344-glfw9\" (UID: \"ea999e72-9356-4693-a1f4-ced6617d4b65\") " pod="openshift-infra/auto-csr-approver-29535344-glfw9"
Feb 26 15:44:00 crc kubenswrapper[5033]: I0226 15:44:00.278555 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8dsj\" (UniqueName: \"kubernetes.io/projected/ea999e72-9356-4693-a1f4-ced6617d4b65-kube-api-access-v8dsj\") pod \"auto-csr-approver-29535344-glfw9\" (UID: \"ea999e72-9356-4693-a1f4-ced6617d4b65\") " pod="openshift-infra/auto-csr-approver-29535344-glfw9"
Feb 26 15:44:00 crc kubenswrapper[5033]: I0226 15:44:00.299970 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8dsj\" (UniqueName: \"kubernetes.io/projected/ea999e72-9356-4693-a1f4-ced6617d4b65-kube-api-access-v8dsj\") pod \"auto-csr-approver-29535344-glfw9\" (UID: \"ea999e72-9356-4693-a1f4-ced6617d4b65\") " pod="openshift-infra/auto-csr-approver-29535344-glfw9"
Feb 26 15:44:00 crc kubenswrapper[5033]: I0226 15:44:00.468303 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535344-glfw9"
Feb 26 15:44:01 crc kubenswrapper[5033]: I0226 15:44:01.774111 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 15:44:01 crc kubenswrapper[5033]: E0226 15:44:01.980474 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5 is running failed: container process not found" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 26 15:44:01 crc kubenswrapper[5033]: E0226 15:44:01.981092 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5 is running failed: container process not found" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 26 15:44:01 crc kubenswrapper[5033]: E0226 15:44:01.981603 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5 is running failed: container process not found" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 26 15:44:01 crc kubenswrapper[5033]: E0226 15:44:01.981642 5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" podUID="2371a2b2-ba78-48f3-839d-cd97b08f3243" containerName="kube-multus-additional-cni-plugins"
Feb 26 15:44:02 crc kubenswrapper[5033]: I0226 15:44:02.183324 5033 patch_prober.go:28] interesting pod/route-controller-manager-7dd8cd45b-hlz9j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 15:44:02 crc kubenswrapper[5033]: I0226 15:44:02.183404 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" podUID="c2fb4af4-aa1e-45d4-adb9-7264603f510a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 15:44:03 crc kubenswrapper[5033]: I0226 15:44:03.047323 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fghqv"
Feb 26 15:44:03 crc kubenswrapper[5033]: I0226 15:44:03.854079 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-tz87k_2371a2b2-ba78-48f3-839d-cd97b08f3243/kube-multus-additional-cni-plugins/0.log"
Feb 26 15:44:03 crc kubenswrapper[5033]: I0226 15:44:03.854158 5033 generic.go:334] "Generic (PLEG): container finished" podID="2371a2b2-ba78-48f3-839d-cd97b08f3243" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" exitCode=137
Feb 26 15:44:03 crc kubenswrapper[5033]: I0226 15:44:03.854202 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" event={"ID":"2371a2b2-ba78-48f3-839d-cd97b08f3243","Type":"ContainerDied","Data":"fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5"}
Feb 26 15:44:05 crc kubenswrapper[5033]: E0226 15:44:05.320522 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 26 15:44:05 crc kubenswrapper[5033]: E0226 15:44:05.321311 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x9fh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-npgt5_openshift-marketplace(65f83e7b-3a72-450b-a54f-899aba4acd8a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 26 15:44:05 crc kubenswrapper[5033]: E0226 15:44:05.322510 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-npgt5" podUID="65f83e7b-3a72-450b-a54f-899aba4acd8a"
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.642431 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.645546 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.651633 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.653222 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.656750 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.662544 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bf09cfb1-f105-4307-b424-ac1fb3ef72a6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.662707 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bf09cfb1-f105-4307-b424-ac1fb3ef72a6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.763481 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bf09cfb1-f105-4307-b424-ac1fb3ef72a6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.763597 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bf09cfb1-f105-4307-b424-ac1fb3ef72a6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.763677 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bf09cfb1-f105-4307-b424-ac1fb3ef72a6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.792783 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bf09cfb1-f105-4307-b424-ac1fb3ef72a6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 26 15:44:05 crc kubenswrapper[5033]: I0226 15:44:05.975818 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 26 15:44:06 crc kubenswrapper[5033]: E0226 15:44:06.652339 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-npgt5" podUID="65f83e7b-3a72-450b-a54f-899aba4acd8a"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.691074 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.729516 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"]
Feb 26 15:44:06 crc kubenswrapper[5033]: E0226 15:44:06.729870 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2fb4af4-aa1e-45d4-adb9-7264603f510a" containerName="route-controller-manager"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.729896 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fb4af4-aa1e-45d4-adb9-7264603f510a" containerName="route-controller-manager"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.730225 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2fb4af4-aa1e-45d4-adb9-7264603f510a" containerName="route-controller-manager"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.731170 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.733082 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"]
Feb 26 15:44:06 crc kubenswrapper[5033]: E0226 15:44:06.746504 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 26 15:44:06 crc kubenswrapper[5033]: E0226 15:44:06.746645 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j5n9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4br75_openshift-marketplace(5049f268-de58-4e79-9deb-d08c2f33397c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 26 15:44:06 crc kubenswrapper[5033]: E0226 15:44:06.748566 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4br75" podUID="5049f268-de58-4e79-9deb-d08c2f33397c"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.774689 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-client-ca\") pod \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") "
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.774743 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-config\") pod \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") "
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.774801 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m28vj\" (UniqueName: \"kubernetes.io/projected/c2fb4af4-aa1e-45d4-adb9-7264603f510a-kube-api-access-m28vj\") pod \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") "
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.774854 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fb4af4-aa1e-45d4-adb9-7264603f510a-serving-cert\") pod \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\" (UID: \"c2fb4af4-aa1e-45d4-adb9-7264603f510a\") "
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.775548 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-client-ca" (OuterVolumeSpecName: "client-ca") pod "c2fb4af4-aa1e-45d4-adb9-7264603f510a" (UID: "c2fb4af4-aa1e-45d4-adb9-7264603f510a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.775802 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-config" (OuterVolumeSpecName: "config") pod "c2fb4af4-aa1e-45d4-adb9-7264603f510a" (UID: "c2fb4af4-aa1e-45d4-adb9-7264603f510a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.775805 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-client-ca\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.775859 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgpth\" (UniqueName: \"kubernetes.io/projected/f1850eea-7cc4-419b-8bd7-d6156cee9abc-kube-api-access-wgpth\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.775922 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-config\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.775971 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1850eea-7cc4-419b-8bd7-d6156cee9abc-serving-cert\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.776075 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.776092 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb4af4-aa1e-45d4-adb9-7264603f510a-config\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.778413 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fb4af4-aa1e-45d4-adb9-7264603f510a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c2fb4af4-aa1e-45d4-adb9-7264603f510a" (UID: "c2fb4af4-aa1e-45d4-adb9-7264603f510a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.778944 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2fb4af4-aa1e-45d4-adb9-7264603f510a-kube-api-access-m28vj" (OuterVolumeSpecName: "kube-api-access-m28vj") pod "c2fb4af4-aa1e-45d4-adb9-7264603f510a" (UID: "c2fb4af4-aa1e-45d4-adb9-7264603f510a"). InnerVolumeSpecName "kube-api-access-m28vj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.872832 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.872858 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j" event={"ID":"c2fb4af4-aa1e-45d4-adb9-7264603f510a","Type":"ContainerDied","Data":"46739baabfec752b09d3e077f14b410073ba0c434916cf005f46f9a413df4d55"}
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.876588 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1850eea-7cc4-419b-8bd7-d6156cee9abc-serving-cert\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.876707 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-client-ca\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.876742 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgpth\" (UniqueName: \"kubernetes.io/projected/f1850eea-7cc4-419b-8bd7-d6156cee9abc-kube-api-access-wgpth\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.876806 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-config\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.876865 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m28vj\" (UniqueName: \"kubernetes.io/projected/c2fb4af4-aa1e-45d4-adb9-7264603f510a-kube-api-access-m28vj\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.876884 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fb4af4-aa1e-45d4-adb9-7264603f510a-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.880171 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-client-ca\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.880679 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-config\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.884586 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1850eea-7cc4-419b-8bd7-d6156cee9abc-serving-cert\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.900496 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgpth\" (UniqueName: \"kubernetes.io/projected/f1850eea-7cc4-419b-8bd7-d6156cee9abc-kube-api-access-wgpth\") pod \"route-controller-manager-854c6474-7d72w\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.943678 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j"]
Feb 26 15:44:06 crc kubenswrapper[5033]: I0226 15:44:06.948749 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd8cd45b-hlz9j"]
Feb 26 15:44:07 crc kubenswrapper[5033]: I0226 15:44:07.048675 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:07 crc kubenswrapper[5033]: I0226 15:44:07.636072 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2fb4af4-aa1e-45d4-adb9-7264603f510a" path="/var/lib/kubelet/pods/c2fb4af4-aa1e-45d4-adb9-7264603f510a/volumes"
Feb 26 15:44:07 crc kubenswrapper[5033]: I0226 15:44:07.707363 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl"]
Feb 26 15:44:07 crc kubenswrapper[5033]: I0226 15:44:07.807599 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"]
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.313776 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4br75" podUID="5049f268-de58-4e79-9deb-d08c2f33397c"
Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.328368 5033 scope.go:117] "RemoveContainer" containerID="5aef2fa334514789e5d82ab26a6f2a14411af9053d2d79df36b5efb76955997b"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.376102 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.376534 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b42nk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5hpx7_openshift-marketplace(922ca3ba-ef67-4e47-b111-e40f8733202b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.378047 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5hpx7" podUID="922ca3ba-ef67-4e47-b111-e40f8733202b"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.420912 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.421247 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbk2p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rqwqh_openshift-marketplace(2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.422933 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rqwqh" podUID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08"
Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.427799 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-tz87k_2371a2b2-ba78-48f3-839d-cd97b08f3243/kube-multus-additional-cni-plugins/0.log"
Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.428394 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.447411 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.447554 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p7w4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7p958_openshift-marketplace(a7b8c773-dedb-4ed9-818d-cd878376c734): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.448824 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7p958" podUID="a7b8c773-dedb-4ed9-818d-cd878376c734"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.460997 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.461145 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6grln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wctnw_openshift-marketplace(596c5e6d-16b2-47b7-a052-a164aa944a7f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.462291 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wctnw" podUID="596c5e6d-16b2-47b7-a052-a164aa944a7f"
Feb 26 15:44:08 crc
kubenswrapper[5033]: E0226 15:44:08.476418 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.476549 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6p5jc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-kjzmq_openshift-marketplace(7a38e065-f09c-4182-ba61-d4db657510de): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.477734 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kjzmq" podUID="7a38e065-f09c-4182-ba61-d4db657510de" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.496539 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2371a2b2-ba78-48f3-839d-cd97b08f3243-ready\") pod \"2371a2b2-ba78-48f3-839d-cd97b08f3243\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.496647 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2371a2b2-ba78-48f3-839d-cd97b08f3243-cni-sysctl-allowlist\") pod \"2371a2b2-ba78-48f3-839d-cd97b08f3243\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.496724 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bx2x\" (UniqueName: \"kubernetes.io/projected/2371a2b2-ba78-48f3-839d-cd97b08f3243-kube-api-access-6bx2x\") pod \"2371a2b2-ba78-48f3-839d-cd97b08f3243\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.496756 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2371a2b2-ba78-48f3-839d-cd97b08f3243-tuning-conf-dir\") pod 
\"2371a2b2-ba78-48f3-839d-cd97b08f3243\" (UID: \"2371a2b2-ba78-48f3-839d-cd97b08f3243\") " Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.497005 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2371a2b2-ba78-48f3-839d-cd97b08f3243-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "2371a2b2-ba78-48f3-839d-cd97b08f3243" (UID: "2371a2b2-ba78-48f3-839d-cd97b08f3243"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.497406 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2371a2b2-ba78-48f3-839d-cd97b08f3243-ready" (OuterVolumeSpecName: "ready") pod "2371a2b2-ba78-48f3-839d-cd97b08f3243" (UID: "2371a2b2-ba78-48f3-839d-cd97b08f3243"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.497565 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2371a2b2-ba78-48f3-839d-cd97b08f3243-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "2371a2b2-ba78-48f3-839d-cd97b08f3243" (UID: "2371a2b2-ba78-48f3-839d-cd97b08f3243"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.502204 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2371a2b2-ba78-48f3-839d-cd97b08f3243-kube-api-access-6bx2x" (OuterVolumeSpecName: "kube-api-access-6bx2x") pod "2371a2b2-ba78-48f3-839d-cd97b08f3243" (UID: "2371a2b2-ba78-48f3-839d-cd97b08f3243"). InnerVolumeSpecName "kube-api-access-6bx2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.598577 5033 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2371a2b2-ba78-48f3-839d-cd97b08f3243-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.598613 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bx2x\" (UniqueName: \"kubernetes.io/projected/2371a2b2-ba78-48f3-839d-cd97b08f3243-kube-api-access-6bx2x\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.598627 5033 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2371a2b2-ba78-48f3-839d-cd97b08f3243-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.598640 5033 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2371a2b2-ba78-48f3-839d-cd97b08f3243-ready\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.640846 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl"] Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.675972 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535344-glfw9"] Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.708139 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"] Feb 26 15:44:08 crc kubenswrapper[5033]: W0226 15:44:08.719031 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1850eea_7cc4_419b_8bd7_d6156cee9abc.slice/crio-6d7b3a8a29a26f15a42dad1f2843e36dd2e347cc095a7fab6c9ee81196b432ac WatchSource:0}: Error finding container 6d7b3a8a29a26f15a42dad1f2843e36dd2e347cc095a7fab6c9ee81196b432ac: Status 404 returned error can't find the container with id 6d7b3a8a29a26f15a42dad1f2843e36dd2e347cc095a7fab6c9ee81196b432ac Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.796038 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 15:44:08 crc kubenswrapper[5033]: W0226 15:44:08.806031 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbf09cfb1_f105_4307_b424_ac1fb3ef72a6.slice/crio-832335de5f67c33f4148dd6ea707842a14d676ac76db86db8b14225d2690a5ea WatchSource:0}: Error finding container 832335de5f67c33f4148dd6ea707842a14d676ac76db86db8b14225d2690a5ea: Status 404 returned error can't find the container with id 832335de5f67c33f4148dd6ea707842a14d676ac76db86db8b14225d2690a5ea Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.884860 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-tz87k_2371a2b2-ba78-48f3-839d-cd97b08f3243/kube-multus-additional-cni-plugins/0.log" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.885232 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" event={"ID":"2371a2b2-ba78-48f3-839d-cd97b08f3243","Type":"ContainerDied","Data":"ac27a332db38a10a59f56600b53805bfbde2300e428c82ab52ecd00966976c41"} Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.885247 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-tz87k" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.885296 5033 scope.go:117] "RemoveContainer" containerID="fc907ea3cd8c1ab826f89570095c166c7a024893581e4efcd785086370deb3b5" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.888534 5033 generic.go:334] "Generic (PLEG): container finished" podID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerID="98278f6e982844b836fe8dc1f2618b6aaf397a6f2cc94aee8ddb82e86aae1e12" exitCode=0 Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.888661 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp66d" event={"ID":"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166","Type":"ContainerDied","Data":"98278f6e982844b836fe8dc1f2618b6aaf397a6f2cc94aee8ddb82e86aae1e12"} Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.891377 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bf09cfb1-f105-4307-b424-ac1fb3ef72a6","Type":"ContainerStarted","Data":"832335de5f67c33f4148dd6ea707842a14d676ac76db86db8b14225d2690a5ea"} Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.898504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w" event={"ID":"f1850eea-7cc4-419b-8bd7-d6156cee9abc","Type":"ContainerStarted","Data":"61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de"} Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.898538 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w" event={"ID":"f1850eea-7cc4-419b-8bd7-d6156cee9abc","Type":"ContainerStarted","Data":"6d7b3a8a29a26f15a42dad1f2843e36dd2e347cc095a7fab6c9ee81196b432ac"} Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.898643 5033 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w" podUID="f1850eea-7cc4-419b-8bd7-d6156cee9abc" containerName="route-controller-manager" containerID="cri-o://61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de" gracePeriod=30 Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.898801 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.899961 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535344-glfw9" event={"ID":"ea999e72-9356-4693-a1f4-ced6617d4b65","Type":"ContainerStarted","Data":"86cd39975bb818d264ee7b4a40ea69b1b93fba444eb65f12d3dd5c81e5924252"} Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.901717 5033 patch_prober.go:28] interesting pod/route-controller-manager-854c6474-7d72w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.901765 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w" podUID="f1850eea-7cc4-419b-8bd7-d6156cee9abc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.903023 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" event={"ID":"9a6ded02-b66e-4dd1-9da6-688d62267de1","Type":"ContainerStarted","Data":"6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72"} Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 
15:44:08.903062 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" event={"ID":"9a6ded02-b66e-4dd1-9da6-688d62267de1","Type":"ContainerStarted","Data":"2dfadd2917116e2dfc1c6fab56a5ea73215aafd9df0bbcbcacf682ccd24bccf5"} Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.903203 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" podUID="9a6ded02-b66e-4dd1-9da6-688d62267de1" containerName="controller-manager" containerID="cri-o://6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72" gracePeriod=30 Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.904088 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.908079 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.909697 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rqwqh" podUID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.910759 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7p958" podUID="a7b8c773-dedb-4ed9-818d-cd878376c734" Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.911085 5033 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5hpx7" podUID="922ca3ba-ef67-4e47-b111-e40f8733202b" Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.918598 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kjzmq" podUID="7a38e065-f09c-4182-ba61-d4db657510de" Feb 26 15:44:08 crc kubenswrapper[5033]: E0226 15:44:08.925651 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wctnw" podUID="596c5e6d-16b2-47b7-a052-a164aa944a7f" Feb 26 15:44:08 crc kubenswrapper[5033]: I0226 15:44:08.940772 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w" podStartSLOduration=21.940756849 podStartE2EDuration="21.940756849s" podCreationTimestamp="2026-02-26 15:43:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:44:08.938207797 +0000 UTC m=+176.150806447" watchObservedRunningTime="2026-02-26 15:44:08.940756849 +0000 UTC m=+176.153355499" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.010811 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" podStartSLOduration=22.010795396 podStartE2EDuration="22.010795396s" podCreationTimestamp="2026-02-26 15:43:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:44:09.007907263 +0000 UTC m=+176.220505923" watchObservedRunningTime="2026-02-26 15:44:09.010795396 +0000 UTC m=+176.223394046" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.079994 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-tz87k"] Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.088371 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-tz87k"] Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.242137 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.407422 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6ded02-b66e-4dd1-9da6-688d62267de1-serving-cert\") pod \"9a6ded02-b66e-4dd1-9da6-688d62267de1\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.407700 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-config\") pod \"9a6ded02-b66e-4dd1-9da6-688d62267de1\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.407745 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbjvd\" (UniqueName: \"kubernetes.io/projected/9a6ded02-b66e-4dd1-9da6-688d62267de1-kube-api-access-dbjvd\") pod \"9a6ded02-b66e-4dd1-9da6-688d62267de1\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.407778 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-proxy-ca-bundles\") pod \"9a6ded02-b66e-4dd1-9da6-688d62267de1\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.407814 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-client-ca\") pod \"9a6ded02-b66e-4dd1-9da6-688d62267de1\" (UID: \"9a6ded02-b66e-4dd1-9da6-688d62267de1\") " Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.408517 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-client-ca" (OuterVolumeSpecName: "client-ca") pod "9a6ded02-b66e-4dd1-9da6-688d62267de1" (UID: "9a6ded02-b66e-4dd1-9da6-688d62267de1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.408582 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9a6ded02-b66e-4dd1-9da6-688d62267de1" (UID: "9a6ded02-b66e-4dd1-9da6-688d62267de1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.408590 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-config" (OuterVolumeSpecName: "config") pod "9a6ded02-b66e-4dd1-9da6-688d62267de1" (UID: "9a6ded02-b66e-4dd1-9da6-688d62267de1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.417992 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a6ded02-b66e-4dd1-9da6-688d62267de1-kube-api-access-dbjvd" (OuterVolumeSpecName: "kube-api-access-dbjvd") pod "9a6ded02-b66e-4dd1-9da6-688d62267de1" (UID: "9a6ded02-b66e-4dd1-9da6-688d62267de1"). InnerVolumeSpecName "kube-api-access-dbjvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.418381 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a6ded02-b66e-4dd1-9da6-688d62267de1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9a6ded02-b66e-4dd1-9da6-688d62267de1" (UID: "9a6ded02-b66e-4dd1-9da6-688d62267de1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.509233 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.509596 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbjvd\" (UniqueName: \"kubernetes.io/projected/9a6ded02-b66e-4dd1-9da6-688d62267de1-kube-api-access-dbjvd\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.509613 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.509626 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a6ded02-b66e-4dd1-9da6-688d62267de1-client-ca\") on node \"crc\" 
DevicePath \"\"" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.509639 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a6ded02-b66e-4dd1-9da6-688d62267de1-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.568891 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-854c6474-7d72w_f1850eea-7cc4-419b-8bd7-d6156cee9abc/route-controller-manager/0.log" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.568971 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w" Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.609831 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1850eea-7cc4-419b-8bd7-d6156cee9abc-serving-cert\") pod \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.609890 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-client-ca\") pod \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.609943 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgpth\" (UniqueName: \"kubernetes.io/projected/f1850eea-7cc4-419b-8bd7-d6156cee9abc-kube-api-access-wgpth\") pod \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") " Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.609964 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-config\") pod \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\" (UID: \"f1850eea-7cc4-419b-8bd7-d6156cee9abc\") "
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.610846 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-client-ca" (OuterVolumeSpecName: "client-ca") pod "f1850eea-7cc4-419b-8bd7-d6156cee9abc" (UID: "f1850eea-7cc4-419b-8bd7-d6156cee9abc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.610860 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-config" (OuterVolumeSpecName: "config") pod "f1850eea-7cc4-419b-8bd7-d6156cee9abc" (UID: "f1850eea-7cc4-419b-8bd7-d6156cee9abc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.614310 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1850eea-7cc4-419b-8bd7-d6156cee9abc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f1850eea-7cc4-419b-8bd7-d6156cee9abc" (UID: "f1850eea-7cc4-419b-8bd7-d6156cee9abc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.614596 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1850eea-7cc4-419b-8bd7-d6156cee9abc-kube-api-access-wgpth" (OuterVolumeSpecName: "kube-api-access-wgpth") pod "f1850eea-7cc4-419b-8bd7-d6156cee9abc" (UID: "f1850eea-7cc4-419b-8bd7-d6156cee9abc"). InnerVolumeSpecName "kube-api-access-wgpth". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.643009 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2371a2b2-ba78-48f3-839d-cd97b08f3243" path="/var/lib/kubelet/pods/2371a2b2-ba78-48f3-839d-cd97b08f3243/volumes"
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.716701 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-config\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.716768 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1850eea-7cc4-419b-8bd7-d6156cee9abc-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.716783 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1850eea-7cc4-419b-8bd7-d6156cee9abc-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.716795 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgpth\" (UniqueName: \"kubernetes.io/projected/f1850eea-7cc4-419b-8bd7-d6156cee9abc-kube-api-access-wgpth\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.920218 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp66d" event={"ID":"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166","Type":"ContainerStarted","Data":"9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65"}
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.922374 5033 generic.go:334] "Generic (PLEG): container finished" podID="bf09cfb1-f105-4307-b424-ac1fb3ef72a6" containerID="c82f357fd5fa991020b45f75e1c2c927002316ab794206b16d64fad7345c729a" exitCode=0
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.922429 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bf09cfb1-f105-4307-b424-ac1fb3ef72a6","Type":"ContainerDied","Data":"c82f357fd5fa991020b45f75e1c2c927002316ab794206b16d64fad7345c729a"}
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.925283 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-854c6474-7d72w_f1850eea-7cc4-419b-8bd7-d6156cee9abc/route-controller-manager/0.log"
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.925313 5033 generic.go:334] "Generic (PLEG): container finished" podID="f1850eea-7cc4-419b-8bd7-d6156cee9abc" containerID="61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de" exitCode=255
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.925361 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w" event={"ID":"f1850eea-7cc4-419b-8bd7-d6156cee9abc","Type":"ContainerDied","Data":"61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de"}
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.925381 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w" event={"ID":"f1850eea-7cc4-419b-8bd7-d6156cee9abc","Type":"ContainerDied","Data":"6d7b3a8a29a26f15a42dad1f2843e36dd2e347cc095a7fab6c9ee81196b432ac"}
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.925593 5033 scope.go:117] "RemoveContainer" containerID="61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de"
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.925670 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.933947 5033 generic.go:334] "Generic (PLEG): container finished" podID="9a6ded02-b66e-4dd1-9da6-688d62267de1" containerID="6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72" exitCode=0
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.933987 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" event={"ID":"9a6ded02-b66e-4dd1-9da6-688d62267de1","Type":"ContainerDied","Data":"6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72"}
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.934056 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl" event={"ID":"9a6ded02-b66e-4dd1-9da6-688d62267de1","Type":"ContainerDied","Data":"2dfadd2917116e2dfc1c6fab56a5ea73215aafd9df0bbcbcacf682ccd24bccf5"}
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.934139 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl"
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.944886 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cp66d" podStartSLOduration=2.815374534 podStartE2EDuration="40.944868694s" podCreationTimestamp="2026-02-26 15:43:29 +0000 UTC" firstStartedPulling="2026-02-26 15:43:31.193841778 +0000 UTC m=+138.406440418" lastFinishedPulling="2026-02-26 15:44:09.323335888 +0000 UTC m=+176.535934578" observedRunningTime="2026-02-26 15:44:09.940340504 +0000 UTC m=+177.152939164" watchObservedRunningTime="2026-02-26 15:44:09.944868694 +0000 UTC m=+177.157467344"
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.956820 5033 scope.go:117] "RemoveContainer" containerID="61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de"
Feb 26 15:44:09 crc kubenswrapper[5033]: E0226 15:44:09.957208 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de\": container with ID starting with 61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de not found: ID does not exist" containerID="61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de"
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.957244 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de"} err="failed to get container status \"61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de\": rpc error: code = NotFound desc = could not find container \"61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de\": container with ID starting with 61492b09f28cbd7717620024a1ce5d60046a6154a410441eaad5e3b9156860de not found: ID does not exist"
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.957279 5033 scope.go:117] "RemoveContainer" containerID="6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72"
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.969207 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"]
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.971604 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854c6474-7d72w"]
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.981690 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl"]
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.986809 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-cd6ffb8db-5pzwl"]
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.995513 5033 scope.go:117] "RemoveContainer" containerID="6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72"
Feb 26 15:44:09 crc kubenswrapper[5033]: E0226 15:44:09.995960 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72\": container with ID starting with 6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72 not found: ID does not exist" containerID="6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72"
Feb 26 15:44:09 crc kubenswrapper[5033]: I0226 15:44:09.995996 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72"} err="failed to get container status \"6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72\": rpc error: code = NotFound desc = could not find container \"6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72\": container with ID starting with 6284d2f6131a7ed22c0670c177c79abe5c4e2df3b19930239804ab9a1704ce72 not found: ID does not exist"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.161008 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.161045 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cp66d"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.857215 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"]
Feb 26 15:44:10 crc kubenswrapper[5033]: E0226 15:44:10.858164 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6ded02-b66e-4dd1-9da6-688d62267de1" containerName="controller-manager"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.858178 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6ded02-b66e-4dd1-9da6-688d62267de1" containerName="controller-manager"
Feb 26 15:44:10 crc kubenswrapper[5033]: E0226 15:44:10.858186 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1850eea-7cc4-419b-8bd7-d6156cee9abc" containerName="route-controller-manager"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.858192 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1850eea-7cc4-419b-8bd7-d6156cee9abc" containerName="route-controller-manager"
Feb 26 15:44:10 crc kubenswrapper[5033]: E0226 15:44:10.858205 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2371a2b2-ba78-48f3-839d-cd97b08f3243" containerName="kube-multus-additional-cni-plugins"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.858211 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2371a2b2-ba78-48f3-839d-cd97b08f3243" containerName="kube-multus-additional-cni-plugins"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.858358 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1850eea-7cc4-419b-8bd7-d6156cee9abc" containerName="route-controller-manager"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.858370 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2371a2b2-ba78-48f3-839d-cd97b08f3243" containerName="kube-multus-additional-cni-plugins"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.858383 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a6ded02-b66e-4dd1-9da6-688d62267de1" containerName="controller-manager"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.858783 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.864250 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.864478 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.864608 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.864727 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.864916 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.865940 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"]
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.866484 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.868711 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.868795 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.869384 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.870242 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.870523 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.870812 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.871047 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.874795 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.890466 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"]
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.894411 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"]
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.927055 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-config\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.927121 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-proxy-ca-bundles\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.927145 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmkgq\" (UniqueName: \"kubernetes.io/projected/bbccdb14-2871-4ce9-ab03-db26ecb3a160-kube-api-access-wmkgq\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.927174 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvcnc\" (UniqueName: \"kubernetes.io/projected/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-kube-api-access-jvcnc\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.927235 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-client-ca\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.927278 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbccdb14-2871-4ce9-ab03-db26ecb3a160-serving-cert\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.927317 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-client-ca\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.927332 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-serving-cert\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:10 crc kubenswrapper[5033]: I0226 15:44:10.927398 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-config\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.028329 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-config\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.028399 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-proxy-ca-bundles\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.028430 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmkgq\" (UniqueName: \"kubernetes.io/projected/bbccdb14-2871-4ce9-ab03-db26ecb3a160-kube-api-access-wmkgq\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.030146 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvcnc\" (UniqueName: \"kubernetes.io/projected/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-kube-api-access-jvcnc\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.030177 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-client-ca\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.030204 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbccdb14-2871-4ce9-ab03-db26ecb3a160-serving-cert\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.030224 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-client-ca\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.030241 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-serving-cert\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.030376 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-config\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.030456 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-config\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.030687 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-proxy-ca-bundles\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.031536 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-client-ca\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.031995 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-client-ca\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.032364 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-config\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.048828 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-serving-cert\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.051405 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvcnc\" (UniqueName: \"kubernetes.io/projected/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-kube-api-access-jvcnc\") pod \"controller-manager-6cc8594b46-nt6fd\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.052597 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmkgq\" (UniqueName: \"kubernetes.io/projected/bbccdb14-2871-4ce9-ab03-db26ecb3a160-kube-api-access-wmkgq\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.054654 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbccdb14-2871-4ce9-ab03-db26ecb3a160-serving-cert\") pod \"route-controller-manager-5b8b54bdbf-gkbz4\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.158309 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.189591 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.208451 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.338275 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kubelet-dir\") pod \"bf09cfb1-f105-4307-b424-ac1fb3ef72a6\" (UID: \"bf09cfb1-f105-4307-b424-ac1fb3ef72a6\") "
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.338686 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kube-api-access\") pod \"bf09cfb1-f105-4307-b424-ac1fb3ef72a6\" (UID: \"bf09cfb1-f105-4307-b424-ac1fb3ef72a6\") "
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.338322 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bf09cfb1-f105-4307-b424-ac1fb3ef72a6" (UID: "bf09cfb1-f105-4307-b424-ac1fb3ef72a6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.357278 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bf09cfb1-f105-4307-b424-ac1fb3ef72a6" (UID: "bf09cfb1-f105-4307-b424-ac1fb3ef72a6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.364943 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-cp66d" podUID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerName="registry-server" probeResult="failure" output=<
Feb 26 15:44:11 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s
Feb 26 15:44:11 crc kubenswrapper[5033]: >
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.439801 5033 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.439848 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf09cfb1-f105-4307-b424-ac1fb3ef72a6-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.474318 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"]
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.609246 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"]
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.636217 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a6ded02-b66e-4dd1-9da6-688d62267de1" path="/var/lib/kubelet/pods/9a6ded02-b66e-4dd1-9da6-688d62267de1/volumes"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.636902 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1850eea-7cc4-419b-8bd7-d6156cee9abc" path="/var/lib/kubelet/pods/f1850eea-7cc4-419b-8bd7-d6156cee9abc/volumes"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.956610 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bf09cfb1-f105-4307-b424-ac1fb3ef72a6","Type":"ContainerDied","Data":"832335de5f67c33f4148dd6ea707842a14d676ac76db86db8b14225d2690a5ea"}
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.956919 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="832335de5f67c33f4148dd6ea707842a14d676ac76db86db8b14225d2690a5ea"
Feb 26 15:44:11 crc kubenswrapper[5033]: I0226 15:44:11.956643 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: W0226 15:44:14.142897 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97cb2dfd_bbbd_4e5c_a8b4_1d83d81618cc.slice/crio-be5f2cd055f866c922477507c2e40654d7be5ed9a3a502136068c92ba318d77b WatchSource:0}: Error finding container be5f2cd055f866c922477507c2e40654d7be5ed9a3a502136068c92ba318d77b: Status 404 returned error can't find the container with id be5f2cd055f866c922477507c2e40654d7be5ed9a3a502136068c92ba318d77b
Feb 26 15:44:14 crc kubenswrapper[5033]: W0226 15:44:14.145460 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbccdb14_2871_4ce9_ab03_db26ecb3a160.slice/crio-f13f69be9e23c75ce307ef298da04c74b121a60d8b73f1eedce3661e26d657fb WatchSource:0}: Error finding container f13f69be9e23c75ce307ef298da04c74b121a60d8b73f1eedce3661e26d657fb: Status 404 returned error can't find the container with id f13f69be9e23c75ce307ef298da04c74b121a60d8b73f1eedce3661e26d657fb
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.241730 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 26 15:44:14 crc kubenswrapper[5033]: E0226 15:44:14.242034 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf09cfb1-f105-4307-b424-ac1fb3ef72a6" containerName="pruner"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.242052 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf09cfb1-f105-4307-b424-ac1fb3ef72a6" containerName="pruner"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.242292 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf09cfb1-f105-4307-b424-ac1fb3ef72a6" containerName="pruner"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.242909 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.245985 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.247106 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.250988 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.373049 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.373616 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-var-lock\") pod \"installer-9-crc\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.373717 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0317728e-d238-4fbd-99e3-a75fceeda178-kube-api-access\") pod \"installer-9-crc\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.475333 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.475401 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-var-lock\") pod \"installer-9-crc\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.475422 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0317728e-d238-4fbd-99e3-a75fceeda178-kube-api-access\") pod \"installer-9-crc\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.475758 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.475786 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-var-lock\") pod \"installer-9-crc\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.491915 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0317728e-d238-4fbd-99e3-a75fceeda178-kube-api-access\") pod \"installer-9-crc\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.566310 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.971559 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4" event={"ID":"bbccdb14-2871-4ce9-ab03-db26ecb3a160","Type":"ContainerStarted","Data":"f13f69be9e23c75ce307ef298da04c74b121a60d8b73f1eedce3661e26d657fb"}
Feb 26 15:44:14 crc kubenswrapper[5033]: I0226 15:44:14.972520 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd" event={"ID":"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc","Type":"ContainerStarted","Data":"be5f2cd055f866c922477507c2e40654d7be5ed9a3a502136068c92ba318d77b"}
Feb 26 15:44:15 crc kubenswrapper[5033]: I0226 15:44:15.927353 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 26 15:44:15 crc kubenswrapper[5033]: W0226 15:44:15.953291 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0317728e_d238_4fbd_99e3_a75fceeda178.slice/crio-e2004237aa74f5fb92001c2ccdb47b911a0f743fc2b5c44c0bb62aeac82c30c2 WatchSource:0}: Error finding container e2004237aa74f5fb92001c2ccdb47b911a0f743fc2b5c44c0bb62aeac82c30c2: Status 404 returned error can't find the container with id e2004237aa74f5fb92001c2ccdb47b911a0f743fc2b5c44c0bb62aeac82c30c2
Feb 26 15:44:15 crc kubenswrapper[5033]: I0226 15:44:15.978707 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535344-glfw9" event={"ID":"ea999e72-9356-4693-a1f4-ced6617d4b65","Type":"ContainerStarted","Data":"05cd628789e65980a86fade8b907e76524f06a00fa636a0ef4705f36900fa4e1"}
Feb 26 15:44:15 crc kubenswrapper[5033]: I0226 15:44:15.979474 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0317728e-d238-4fbd-99e3-a75fceeda178","Type":"ContainerStarted","Data":"e2004237aa74f5fb92001c2ccdb47b911a0f743fc2b5c44c0bb62aeac82c30c2"} Feb 26 15:44:15 crc kubenswrapper[5033]: I0226 15:44:15.980526 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd" event={"ID":"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc","Type":"ContainerStarted","Data":"372ecfa3b740ecc306dbf7779cdb7b8ad3d19a40f4d3ace6dd738104191c97ca"} Feb 26 15:44:15 crc kubenswrapper[5033]: I0226 15:44:15.980847 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd" Feb 26 15:44:15 crc kubenswrapper[5033]: I0226 15:44:15.983521 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4" event={"ID":"bbccdb14-2871-4ce9-ab03-db26ecb3a160","Type":"ContainerStarted","Data":"48e04d557923ed884258f88133ca1a399352dd3cb22d15a13d8afa5f7566c5ee"} Feb 26 15:44:15 crc kubenswrapper[5033]: I0226 15:44:15.984363 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4" Feb 26 15:44:15 crc kubenswrapper[5033]: I0226 15:44:15.993524 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd" Feb 26 15:44:16 crc kubenswrapper[5033]: I0226 15:44:16.020222 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535344-glfw9" podStartSLOduration=9.104935285 podStartE2EDuration="16.020201865s" podCreationTimestamp="2026-02-26 15:44:00 +0000 UTC" firstStartedPulling="2026-02-26 15:44:08.695960208 +0000 UTC m=+175.908558858" lastFinishedPulling="2026-02-26 
15:44:15.611226788 +0000 UTC m=+182.823825438" observedRunningTime="2026-02-26 15:44:15.995578426 +0000 UTC m=+183.208177076" watchObservedRunningTime="2026-02-26 15:44:16.020201865 +0000 UTC m=+183.232800515" Feb 26 15:44:16 crc kubenswrapper[5033]: I0226 15:44:16.021670 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd" podStartSLOduration=9.021663277 podStartE2EDuration="9.021663277s" podCreationTimestamp="2026-02-26 15:44:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:44:16.018779065 +0000 UTC m=+183.231377715" watchObservedRunningTime="2026-02-26 15:44:16.021663277 +0000 UTC m=+183.234261927" Feb 26 15:44:16 crc kubenswrapper[5033]: I0226 15:44:16.054289 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4" podStartSLOduration=9.05424672 podStartE2EDuration="9.05424672s" podCreationTimestamp="2026-02-26 15:44:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:44:16.053007805 +0000 UTC m=+183.265606455" watchObservedRunningTime="2026-02-26 15:44:16.05424672 +0000 UTC m=+183.266845370" Feb 26 15:44:16 crc kubenswrapper[5033]: I0226 15:44:16.150865 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4" Feb 26 15:44:16 crc kubenswrapper[5033]: I0226 15:44:16.393311 5033 csr.go:261] certificate signing request csr-p7vvb is approved, waiting to be issued Feb 26 15:44:16 crc kubenswrapper[5033]: I0226 15:44:16.399477 5033 csr.go:257] certificate signing request csr-p7vvb is issued Feb 26 15:44:16 crc kubenswrapper[5033]: I0226 15:44:16.989673 5033 generic.go:334] "Generic 
(PLEG): container finished" podID="ea999e72-9356-4693-a1f4-ced6617d4b65" containerID="05cd628789e65980a86fade8b907e76524f06a00fa636a0ef4705f36900fa4e1" exitCode=0 Feb 26 15:44:16 crc kubenswrapper[5033]: I0226 15:44:16.989741 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535344-glfw9" event={"ID":"ea999e72-9356-4693-a1f4-ced6617d4b65","Type":"ContainerDied","Data":"05cd628789e65980a86fade8b907e76524f06a00fa636a0ef4705f36900fa4e1"} Feb 26 15:44:16 crc kubenswrapper[5033]: I0226 15:44:16.990960 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0317728e-d238-4fbd-99e3-a75fceeda178","Type":"ContainerStarted","Data":"2717ed8f1c68b7b302631d38799a5aafae8db3b4e39beda21cee8d0094938462"} Feb 26 15:44:17 crc kubenswrapper[5033]: I0226 15:44:17.016553 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.016530918 podStartE2EDuration="3.016530918s" podCreationTimestamp="2026-02-26 15:44:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:44:17.014326396 +0000 UTC m=+184.226925086" watchObservedRunningTime="2026-02-26 15:44:17.016530918 +0000 UTC m=+184.229129568" Feb 26 15:44:17 crc kubenswrapper[5033]: I0226 15:44:17.400463 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-13 11:06:51.29073455 +0000 UTC Feb 26 15:44:17 crc kubenswrapper[5033]: I0226 15:44:17.400589 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6955h22m33.89015084s for next certificate rotation Feb 26 15:44:18 crc kubenswrapper[5033]: I0226 15:44:18.288991 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535344-glfw9" Feb 26 15:44:18 crc kubenswrapper[5033]: I0226 15:44:18.401477 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 08:27:55.155390425 +0000 UTC Feb 26 15:44:18 crc kubenswrapper[5033]: I0226 15:44:18.401522 5033 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6712h43m36.753871402s for next certificate rotation Feb 26 15:44:18 crc kubenswrapper[5033]: I0226 15:44:18.434547 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8dsj\" (UniqueName: \"kubernetes.io/projected/ea999e72-9356-4693-a1f4-ced6617d4b65-kube-api-access-v8dsj\") pod \"ea999e72-9356-4693-a1f4-ced6617d4b65\" (UID: \"ea999e72-9356-4693-a1f4-ced6617d4b65\") " Feb 26 15:44:18 crc kubenswrapper[5033]: I0226 15:44:18.439956 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea999e72-9356-4693-a1f4-ced6617d4b65-kube-api-access-v8dsj" (OuterVolumeSpecName: "kube-api-access-v8dsj") pod "ea999e72-9356-4693-a1f4-ced6617d4b65" (UID: "ea999e72-9356-4693-a1f4-ced6617d4b65"). InnerVolumeSpecName "kube-api-access-v8dsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[5033]: I0226 15:44:18.536126 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8dsj\" (UniqueName: \"kubernetes.io/projected/ea999e72-9356-4693-a1f4-ced6617d4b65-kube-api-access-v8dsj\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:19 crc kubenswrapper[5033]: I0226 15:44:19.002206 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535344-glfw9" event={"ID":"ea999e72-9356-4693-a1f4-ced6617d4b65","Type":"ContainerDied","Data":"86cd39975bb818d264ee7b4a40ea69b1b93fba444eb65f12d3dd5c81e5924252"} Feb 26 15:44:19 crc kubenswrapper[5033]: I0226 15:44:19.002647 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86cd39975bb818d264ee7b4a40ea69b1b93fba444eb65f12d3dd5c81e5924252" Feb 26 15:44:19 crc kubenswrapper[5033]: I0226 15:44:19.002526 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535344-glfw9" Feb 26 15:44:20 crc kubenswrapper[5033]: I0226 15:44:20.296607 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cp66d" Feb 26 15:44:20 crc kubenswrapper[5033]: I0226 15:44:20.350236 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cp66d" Feb 26 15:44:20 crc kubenswrapper[5033]: I0226 15:44:20.859220 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cp66d"] Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.022754 5033 generic.go:334] "Generic (PLEG): container finished" podID="65f83e7b-3a72-450b-a54f-899aba4acd8a" containerID="927d38af4dbda54e8446d731cc611fdb6063d2f7a8b9ef4abcd392f1c5777f88" exitCode=0 Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.023115 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-npgt5" event={"ID":"65f83e7b-3a72-450b-a54f-899aba4acd8a","Type":"ContainerDied","Data":"927d38af4dbda54e8446d731cc611fdb6063d2f7a8b9ef4abcd392f1c5777f88"} Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.027386 5033 generic.go:334] "Generic (PLEG): container finished" podID="7a38e065-f09c-4182-ba61-d4db657510de" containerID="0ef3d0c9a98070414756a5c0c82627b297ff7c2ff2ef6620d8519a3376b0b4e5" exitCode=0 Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.027457 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjzmq" event={"ID":"7a38e065-f09c-4182-ba61-d4db657510de","Type":"ContainerDied","Data":"0ef3d0c9a98070414756a5c0c82627b297ff7c2ff2ef6620d8519a3376b0b4e5"} Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.031328 5033 generic.go:334] "Generic (PLEG): container finished" podID="596c5e6d-16b2-47b7-a052-a164aa944a7f" containerID="6b24334721a668295eb067b97461f033406e54e8fa89b758ad3b94664ff95f6e" exitCode=0 Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.031414 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wctnw" event={"ID":"596c5e6d-16b2-47b7-a052-a164aa944a7f","Type":"ContainerDied","Data":"6b24334721a668295eb067b97461f033406e54e8fa89b758ad3b94664ff95f6e"} Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.031534 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cp66d" podUID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerName="registry-server" containerID="cri-o://9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65" gracePeriod=2 Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.615184 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cp66d" Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.641796 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jdtrx"] Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.794837 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-utilities\") pod \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.794926 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc8p4\" (UniqueName: \"kubernetes.io/projected/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-kube-api-access-jc8p4\") pod \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.794945 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-catalog-content\") pod \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\" (UID: \"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166\") " Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.796402 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-utilities" (OuterVolumeSpecName: "utilities") pod "af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" (UID: "af8b3c9a-9d7f-47c8-9aff-4d0f66e56166"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.807227 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-kube-api-access-jc8p4" (OuterVolumeSpecName: "kube-api-access-jc8p4") pod "af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" (UID: "af8b3c9a-9d7f-47c8-9aff-4d0f66e56166"). InnerVolumeSpecName "kube-api-access-jc8p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.888479 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" (UID: "af8b3c9a-9d7f-47c8-9aff-4d0f66e56166"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.895993 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc8p4\" (UniqueName: \"kubernetes.io/projected/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-kube-api-access-jc8p4\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.896033 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:22 crc kubenswrapper[5033]: I0226 15:44:22.896044 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.039686 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjzmq" 
event={"ID":"7a38e065-f09c-4182-ba61-d4db657510de","Type":"ContainerStarted","Data":"cb437783cf4bd310d39365090e25aaf5d90f48190ebf201e29effe6ab29f8ed3"} Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.042771 5033 generic.go:334] "Generic (PLEG): container finished" podID="a7b8c773-dedb-4ed9-818d-cd878376c734" containerID="0282c8d1eab9ab761b02d190189ca612debc50faa1eb54ef278cc5ae90eaf895" exitCode=0 Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.042849 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p958" event={"ID":"a7b8c773-dedb-4ed9-818d-cd878376c734","Type":"ContainerDied","Data":"0282c8d1eab9ab761b02d190189ca612debc50faa1eb54ef278cc5ae90eaf895"} Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.046075 5033 generic.go:334] "Generic (PLEG): container finished" podID="5049f268-de58-4e79-9deb-d08c2f33397c" containerID="68c89a639e0a8745fbd5d4c8ce1781a2270809c01e72030a78e323e68208f099" exitCode=0 Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.046120 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4br75" event={"ID":"5049f268-de58-4e79-9deb-d08c2f33397c","Type":"ContainerDied","Data":"68c89a639e0a8745fbd5d4c8ce1781a2270809c01e72030a78e323e68208f099"} Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.049097 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wctnw" event={"ID":"596c5e6d-16b2-47b7-a052-a164aa944a7f","Type":"ContainerStarted","Data":"3bbd3ff437a4b73696e2ff50f30aff7e0e60498900c83b708912f10597e8ec23"} Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.051907 5033 generic.go:334] "Generic (PLEG): container finished" podID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerID="9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65" exitCode=0 Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.051979 5033 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-cp66d" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.051997 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp66d" event={"ID":"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166","Type":"ContainerDied","Data":"9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65"} Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.052033 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cp66d" event={"ID":"af8b3c9a-9d7f-47c8-9aff-4d0f66e56166","Type":"ContainerDied","Data":"a98a1c18ea3392dd3f553d16ec80a37ff4c72804394ce8976c69bdbe8ac8d8a6"} Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.052051 5033 scope.go:117] "RemoveContainer" containerID="9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.062610 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqwqh" event={"ID":"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08","Type":"ContainerStarted","Data":"fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a"} Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.066986 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kjzmq" podStartSLOduration=2.7236594629999997 podStartE2EDuration="54.066966954s" podCreationTimestamp="2026-02-26 15:43:29 +0000 UTC" firstStartedPulling="2026-02-26 15:43:31.175809237 +0000 UTC m=+138.388407927" lastFinishedPulling="2026-02-26 15:44:22.519116768 +0000 UTC m=+189.731715418" observedRunningTime="2026-02-26 15:44:23.063232888 +0000 UTC m=+190.275831538" watchObservedRunningTime="2026-02-26 15:44:23.066966954 +0000 UTC m=+190.279565604" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.074955 5033 scope.go:117] "RemoveContainer" 
containerID="98278f6e982844b836fe8dc1f2618b6aaf397a6f2cc94aee8ddb82e86aae1e12" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.075178 5033 generic.go:334] "Generic (PLEG): container finished" podID="922ca3ba-ef67-4e47-b111-e40f8733202b" containerID="80bcd727965a725d37b4034d3a3091fc645b04b826954a2f06c80b546fd7ed83" exitCode=0 Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.075240 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hpx7" event={"ID":"922ca3ba-ef67-4e47-b111-e40f8733202b","Type":"ContainerDied","Data":"80bcd727965a725d37b4034d3a3091fc645b04b826954a2f06c80b546fd7ed83"} Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.083900 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgt5" event={"ID":"65f83e7b-3a72-450b-a54f-899aba4acd8a","Type":"ContainerStarted","Data":"22d0f7f4ccb6f101d1b620fb32ce4e44cf84dddb9e732145ceafe18dcd48b49e"} Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.101426 5033 scope.go:117] "RemoveContainer" containerID="c0a12bd63788324cf314d0fca8f27b08ee5c8fc61b89015981f50061d8dc9016" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.116618 5033 scope.go:117] "RemoveContainer" containerID="9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65" Feb 26 15:44:23 crc kubenswrapper[5033]: E0226 15:44:23.119202 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65\": container with ID starting with 9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65 not found: ID does not exist" containerID="9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.119247 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65"} err="failed to get container status \"9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65\": rpc error: code = NotFound desc = could not find container \"9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65\": container with ID starting with 9102e179ac291ede47d6e46ebdac1a48f2aee26e72c6801fcc7f935172876d65 not found: ID does not exist" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.119291 5033 scope.go:117] "RemoveContainer" containerID="98278f6e982844b836fe8dc1f2618b6aaf397a6f2cc94aee8ddb82e86aae1e12" Feb 26 15:44:23 crc kubenswrapper[5033]: E0226 15:44:23.119733 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98278f6e982844b836fe8dc1f2618b6aaf397a6f2cc94aee8ddb82e86aae1e12\": container with ID starting with 98278f6e982844b836fe8dc1f2618b6aaf397a6f2cc94aee8ddb82e86aae1e12 not found: ID does not exist" containerID="98278f6e982844b836fe8dc1f2618b6aaf397a6f2cc94aee8ddb82e86aae1e12" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.119779 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98278f6e982844b836fe8dc1f2618b6aaf397a6f2cc94aee8ddb82e86aae1e12"} err="failed to get container status \"98278f6e982844b836fe8dc1f2618b6aaf397a6f2cc94aee8ddb82e86aae1e12\": rpc error: code = NotFound desc = could not find container \"98278f6e982844b836fe8dc1f2618b6aaf397a6f2cc94aee8ddb82e86aae1e12\": container with ID starting with 98278f6e982844b836fe8dc1f2618b6aaf397a6f2cc94aee8ddb82e86aae1e12 not found: ID does not exist" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.119799 5033 scope.go:117] "RemoveContainer" containerID="c0a12bd63788324cf314d0fca8f27b08ee5c8fc61b89015981f50061d8dc9016" Feb 26 15:44:23 crc kubenswrapper[5033]: E0226 15:44:23.120179 5033 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c0a12bd63788324cf314d0fca8f27b08ee5c8fc61b89015981f50061d8dc9016\": container with ID starting with c0a12bd63788324cf314d0fca8f27b08ee5c8fc61b89015981f50061d8dc9016 not found: ID does not exist" containerID="c0a12bd63788324cf314d0fca8f27b08ee5c8fc61b89015981f50061d8dc9016" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.120218 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a12bd63788324cf314d0fca8f27b08ee5c8fc61b89015981f50061d8dc9016"} err="failed to get container status \"c0a12bd63788324cf314d0fca8f27b08ee5c8fc61b89015981f50061d8dc9016\": rpc error: code = NotFound desc = could not find container \"c0a12bd63788324cf314d0fca8f27b08ee5c8fc61b89015981f50061d8dc9016\": container with ID starting with c0a12bd63788324cf314d0fca8f27b08ee5c8fc61b89015981f50061d8dc9016 not found: ID does not exist" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.129761 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wctnw" podStartSLOduration=1.924806974 podStartE2EDuration="52.129740004s" podCreationTimestamp="2026-02-26 15:43:31 +0000 UTC" firstStartedPulling="2026-02-26 15:43:32.364461444 +0000 UTC m=+139.577060094" lastFinishedPulling="2026-02-26 15:44:22.569394464 +0000 UTC m=+189.781993124" observedRunningTime="2026-02-26 15:44:23.110669273 +0000 UTC m=+190.323267923" watchObservedRunningTime="2026-02-26 15:44:23.129740004 +0000 UTC m=+190.342338654" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.196547 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cp66d"] Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.208717 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cp66d"] Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.225439 5033 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-npgt5" podStartSLOduration=3.143578933 podStartE2EDuration="51.225419287s" podCreationTimestamp="2026-02-26 15:43:32 +0000 UTC" firstStartedPulling="2026-02-26 15:43:34.473069029 +0000 UTC m=+141.685667679" lastFinishedPulling="2026-02-26 15:44:22.554909383 +0000 UTC m=+189.767508033" observedRunningTime="2026-02-26 15:44:23.216987918 +0000 UTC m=+190.429586568" watchObservedRunningTime="2026-02-26 15:44:23.225419287 +0000 UTC m=+190.438017937" Feb 26 15:44:23 crc kubenswrapper[5033]: I0226 15:44:23.634458 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" path="/var/lib/kubelet/pods/af8b3c9a-9d7f-47c8-9aff-4d0f66e56166/volumes" Feb 26 15:44:24 crc kubenswrapper[5033]: I0226 15:44:24.089852 5033 generic.go:334] "Generic (PLEG): container finished" podID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" containerID="fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a" exitCode=0 Feb 26 15:44:24 crc kubenswrapper[5033]: I0226 15:44:24.089912 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqwqh" event={"ID":"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08","Type":"ContainerDied","Data":"fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a"} Feb 26 15:44:25 crc kubenswrapper[5033]: I0226 15:44:25.105560 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p958" event={"ID":"a7b8c773-dedb-4ed9-818d-cd878376c734","Type":"ContainerStarted","Data":"4e7c3b34925bde1fc895c44cfdf8c6c0b09b4e88dee72ed1715cf52542edc31c"} Feb 26 15:44:25 crc kubenswrapper[5033]: I0226 15:44:25.128594 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7p958" podStartSLOduration=3.904066006 podStartE2EDuration="56.128577556s" podCreationTimestamp="2026-02-26 
15:43:29 +0000 UTC" firstStartedPulling="2026-02-26 15:43:31.193815548 +0000 UTC m=+138.406414198" lastFinishedPulling="2026-02-26 15:44:23.418327098 +0000 UTC m=+190.630925748" observedRunningTime="2026-02-26 15:44:25.124054518 +0000 UTC m=+192.336653168" watchObservedRunningTime="2026-02-26 15:44:25.128577556 +0000 UTC m=+192.341176206" Feb 26 15:44:26 crc kubenswrapper[5033]: I0226 15:44:26.117233 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4br75" event={"ID":"5049f268-de58-4e79-9deb-d08c2f33397c","Type":"ContainerStarted","Data":"8102b71cd1ea2248a8abbf1b2f431266fd8006c4e5e7ac694794ada6a7386d38"} Feb 26 15:44:26 crc kubenswrapper[5033]: I0226 15:44:26.119516 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqwqh" event={"ID":"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08","Type":"ContainerStarted","Data":"75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8"} Feb 26 15:44:26 crc kubenswrapper[5033]: I0226 15:44:26.121880 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hpx7" event={"ID":"922ca3ba-ef67-4e47-b111-e40f8733202b","Type":"ContainerStarted","Data":"b67bacad2af29529bfc219c625b1d6f93a65b7bc35f4913cd9cc423c3bb867a3"} Feb 26 15:44:26 crc kubenswrapper[5033]: I0226 15:44:26.142707 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4br75" podStartSLOduration=3.314187379 podStartE2EDuration="57.142688164s" podCreationTimestamp="2026-02-26 15:43:29 +0000 UTC" firstStartedPulling="2026-02-26 15:43:31.203250226 +0000 UTC m=+138.415848866" lastFinishedPulling="2026-02-26 15:44:25.031751001 +0000 UTC m=+192.244349651" observedRunningTime="2026-02-26 15:44:26.141546651 +0000 UTC m=+193.354145301" watchObservedRunningTime="2026-02-26 15:44:26.142688164 +0000 UTC m=+193.355286814" Feb 26 15:44:26 crc kubenswrapper[5033]: I0226 
15:44:26.190038 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5hpx7" podStartSLOduration=3.825004609 podStartE2EDuration="55.190018616s" podCreationTimestamp="2026-02-26 15:43:31 +0000 UTC" firstStartedPulling="2026-02-26 15:43:33.40601813 +0000 UTC m=+140.618616770" lastFinishedPulling="2026-02-26 15:44:24.771032127 +0000 UTC m=+191.983630777" observedRunningTime="2026-02-26 15:44:26.162122565 +0000 UTC m=+193.374721215" watchObservedRunningTime="2026-02-26 15:44:26.190018616 +0000 UTC m=+193.402617266" Feb 26 15:44:26 crc kubenswrapper[5033]: I0226 15:44:26.190690 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rqwqh" podStartSLOduration=3.442034128 podStartE2EDuration="54.190682595s" podCreationTimestamp="2026-02-26 15:43:32 +0000 UTC" firstStartedPulling="2026-02-26 15:43:34.444187641 +0000 UTC m=+141.656786291" lastFinishedPulling="2026-02-26 15:44:25.192836108 +0000 UTC m=+192.405434758" observedRunningTime="2026-02-26 15:44:26.188363509 +0000 UTC m=+193.400962169" watchObservedRunningTime="2026-02-26 15:44:26.190682595 +0000 UTC m=+193.403281245" Feb 26 15:44:27 crc kubenswrapper[5033]: I0226 15:44:27.734240 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"] Feb 26 15:44:27 crc kubenswrapper[5033]: I0226 15:44:27.734569 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd" podUID="97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc" containerName="controller-manager" containerID="cri-o://372ecfa3b740ecc306dbf7779cdb7b8ad3d19a40f4d3ace6dd738104191c97ca" gracePeriod=30 Feb 26 15:44:27 crc kubenswrapper[5033]: I0226 15:44:27.760016 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"] Feb 
26 15:44:27 crc kubenswrapper[5033]: I0226 15:44:27.760495 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4" podUID="bbccdb14-2871-4ce9-ab03-db26ecb3a160" containerName="route-controller-manager" containerID="cri-o://48e04d557923ed884258f88133ca1a399352dd3cb22d15a13d8afa5f7566c5ee" gracePeriod=30 Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.142867 5033 generic.go:334] "Generic (PLEG): container finished" podID="bbccdb14-2871-4ce9-ab03-db26ecb3a160" containerID="48e04d557923ed884258f88133ca1a399352dd3cb22d15a13d8afa5f7566c5ee" exitCode=0 Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.142950 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4" event={"ID":"bbccdb14-2871-4ce9-ab03-db26ecb3a160","Type":"ContainerDied","Data":"48e04d557923ed884258f88133ca1a399352dd3cb22d15a13d8afa5f7566c5ee"} Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.142977 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4" event={"ID":"bbccdb14-2871-4ce9-ab03-db26ecb3a160","Type":"ContainerDied","Data":"f13f69be9e23c75ce307ef298da04c74b121a60d8b73f1eedce3661e26d657fb"} Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.142988 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f13f69be9e23c75ce307ef298da04c74b121a60d8b73f1eedce3661e26d657fb" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.145344 5033 generic.go:334] "Generic (PLEG): container finished" podID="97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc" containerID="372ecfa3b740ecc306dbf7779cdb7b8ad3d19a40f4d3ace6dd738104191c97ca" exitCode=0 Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.145405 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd" event={"ID":"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc","Type":"ContainerDied","Data":"372ecfa3b740ecc306dbf7779cdb7b8ad3d19a40f4d3ace6dd738104191c97ca"} Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.168604 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.195065 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g"] Feb 26 15:44:29 crc kubenswrapper[5033]: E0226 15:44:29.195352 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbccdb14-2871-4ce9-ab03-db26ecb3a160" containerName="route-controller-manager" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.195383 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbccdb14-2871-4ce9-ab03-db26ecb3a160" containerName="route-controller-manager" Feb 26 15:44:29 crc kubenswrapper[5033]: E0226 15:44:29.195394 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea999e72-9356-4693-a1f4-ced6617d4b65" containerName="oc" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.195402 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea999e72-9356-4693-a1f4-ced6617d4b65" containerName="oc" Feb 26 15:44:29 crc kubenswrapper[5033]: E0226 15:44:29.195420 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerName="extract-utilities" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.195429 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerName="extract-utilities" Feb 26 15:44:29 crc kubenswrapper[5033]: E0226 15:44:29.195440 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerName="extract-content" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.195447 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerName="extract-content" Feb 26 15:44:29 crc kubenswrapper[5033]: E0226 15:44:29.195461 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerName="registry-server" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.195468 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerName="registry-server" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.195582 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbccdb14-2871-4ce9-ab03-db26ecb3a160" containerName="route-controller-manager" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.195597 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8b3c9a-9d7f-47c8-9aff-4d0f66e56166" containerName="registry-server" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.195608 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea999e72-9356-4693-a1f4-ced6617d4b65" containerName="oc" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.196320 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.199040 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g"] Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.267892 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbccdb14-2871-4ce9-ab03-db26ecb3a160-serving-cert\") pod \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.267956 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-config\") pod \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.268025 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-client-ca\") pod \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.268086 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmkgq\" (UniqueName: \"kubernetes.io/projected/bbccdb14-2871-4ce9-ab03-db26ecb3a160-kube-api-access-wmkgq\") pod \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\" (UID: \"bbccdb14-2871-4ce9-ab03-db26ecb3a160\") " Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.269317 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-client-ca" (OuterVolumeSpecName: "client-ca") pod "bbccdb14-2871-4ce9-ab03-db26ecb3a160" 
(UID: "bbccdb14-2871-4ce9-ab03-db26ecb3a160"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.269480 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-config" (OuterVolumeSpecName: "config") pod "bbccdb14-2871-4ce9-ab03-db26ecb3a160" (UID: "bbccdb14-2871-4ce9-ab03-db26ecb3a160"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.275396 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbccdb14-2871-4ce9-ab03-db26ecb3a160-kube-api-access-wmkgq" (OuterVolumeSpecName: "kube-api-access-wmkgq") pod "bbccdb14-2871-4ce9-ab03-db26ecb3a160" (UID: "bbccdb14-2871-4ce9-ab03-db26ecb3a160"). InnerVolumeSpecName "kube-api-access-wmkgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.282385 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbccdb14-2871-4ce9-ab03-db26ecb3a160-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bbccdb14-2871-4ce9-ab03-db26ecb3a160" (UID: "bbccdb14-2871-4ce9-ab03-db26ecb3a160"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.369414 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-config\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.369470 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-client-ca\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.369506 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50cd52c1-dc88-41e1-93d5-29370876fec6-serving-cert\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.369525 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-295gw\" (UniqueName: \"kubernetes.io/projected/50cd52c1-dc88-41e1-93d5-29370876fec6-kube-api-access-295gw\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.369576 5033 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.369588 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmkgq\" (UniqueName: \"kubernetes.io/projected/bbccdb14-2871-4ce9-ab03-db26ecb3a160-kube-api-access-wmkgq\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.369597 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbccdb14-2871-4ce9-ab03-db26ecb3a160-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.369608 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbccdb14-2871-4ce9-ab03-db26ecb3a160-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.445046 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.470692 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50cd52c1-dc88-41e1-93d5-29370876fec6-serving-cert\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.470743 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-295gw\" (UniqueName: \"kubernetes.io/projected/50cd52c1-dc88-41e1-93d5-29370876fec6-kube-api-access-295gw\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.470789 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-config\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.470820 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-client-ca\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.471620 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-client-ca\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.472542 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-config\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.475418 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50cd52c1-dc88-41e1-93d5-29370876fec6-serving-cert\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.488614 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-295gw\" (UniqueName: \"kubernetes.io/projected/50cd52c1-dc88-41e1-93d5-29370876fec6-kube-api-access-295gw\") pod \"route-controller-manager-6755df687b-48s5g\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") " pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.511193 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.572156 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-client-ca\") pod \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.572340 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvcnc\" (UniqueName: \"kubernetes.io/projected/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-kube-api-access-jvcnc\") pod \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.572406 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-config\") pod \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.572428 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-serving-cert\") pod \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.572453 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-proxy-ca-bundles\") pod \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\" (UID: \"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc\") " Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.572984 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc" (UID: "97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.573415 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-config" (OuterVolumeSpecName: "config") pod "97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc" (UID: "97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.573872 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc" (UID: "97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.575881 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-kube-api-access-jvcnc" (OuterVolumeSpecName: "kube-api-access-jvcnc") pod "97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc" (UID: "97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc"). InnerVolumeSpecName "kube-api-access-jvcnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.576469 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc" (UID: "97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.674194 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.674471 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.674481 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.674492 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.674502 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvcnc\" (UniqueName: \"kubernetes.io/projected/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc-kube-api-access-jvcnc\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.723022 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kjzmq" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.723105 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kjzmq" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.780874 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kjzmq" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 
15:44:29.843117 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4br75" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.843210 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4br75" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.878430 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4br75" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.916332 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g"] Feb 26 15:44:29 crc kubenswrapper[5033]: W0226 15:44:29.934152 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50cd52c1_dc88_41e1_93d5_29370876fec6.slice/crio-3ba8dea62183ed0ee154375746946fdef5d1f397930b45b89dbeea35ef3d9398 WatchSource:0}: Error finding container 3ba8dea62183ed0ee154375746946fdef5d1f397930b45b89dbeea35ef3d9398: Status 404 returned error can't find the container with id 3ba8dea62183ed0ee154375746946fdef5d1f397930b45b89dbeea35ef3d9398 Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.943750 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7p958" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.943779 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7p958" Feb 26 15:44:29 crc kubenswrapper[5033]: I0226 15:44:29.984106 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7p958" Feb 26 15:44:30 crc kubenswrapper[5033]: I0226 15:44:30.151140 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" event={"ID":"50cd52c1-dc88-41e1-93d5-29370876fec6","Type":"ContainerStarted","Data":"3ba8dea62183ed0ee154375746946fdef5d1f397930b45b89dbeea35ef3d9398"} Feb 26 15:44:30 crc kubenswrapper[5033]: I0226 15:44:30.153401 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd" Feb 26 15:44:30 crc kubenswrapper[5033]: I0226 15:44:30.154163 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4" Feb 26 15:44:30 crc kubenswrapper[5033]: I0226 15:44:30.154190 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc8594b46-nt6fd" event={"ID":"97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc","Type":"ContainerDied","Data":"be5f2cd055f866c922477507c2e40654d7be5ed9a3a502136068c92ba318d77b"} Feb 26 15:44:30 crc kubenswrapper[5033]: I0226 15:44:30.154245 5033 scope.go:117] "RemoveContainer" containerID="372ecfa3b740ecc306dbf7779cdb7b8ad3d19a40f4d3ace6dd738104191c97ca" Feb 26 15:44:30 crc kubenswrapper[5033]: I0226 15:44:30.192588 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"] Feb 26 15:44:30 crc kubenswrapper[5033]: I0226 15:44:30.205921 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b8b54bdbf-gkbz4"] Feb 26 15:44:30 crc kubenswrapper[5033]: I0226 15:44:30.206967 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kjzmq" Feb 26 15:44:30 crc kubenswrapper[5033]: I0226 15:44:30.212075 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4br75" Feb 26 15:44:30 
crc kubenswrapper[5033]: I0226 15:44:30.213321 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"] Feb 26 15:44:30 crc kubenswrapper[5033]: I0226 15:44:30.221292 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6cc8594b46-nt6fd"] Feb 26 15:44:30 crc kubenswrapper[5033]: I0226 15:44:30.231423 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7p958" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.159236 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" event={"ID":"50cd52c1-dc88-41e1-93d5-29370876fec6","Type":"ContainerStarted","Data":"43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c"} Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.163320 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.169404 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.184866 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" podStartSLOduration=4.184842697 podStartE2EDuration="4.184842697s" podCreationTimestamp="2026-02-26 15:44:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:44:31.183933623 +0000 UTC m=+198.396532313" watchObservedRunningTime="2026-02-26 15:44:31.184842697 +0000 UTC m=+198.397441387" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 
15:44:31.640136 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc" path="/var/lib/kubelet/pods/97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc/volumes" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.640823 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbccdb14-2871-4ce9-ab03-db26ecb3a160" path="/var/lib/kubelet/pods/bbccdb14-2871-4ce9-ab03-db26ecb3a160/volumes" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.732783 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.732909 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.771854 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.858961 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7p958"] Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.865660 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b44cf8878-knd2r"] Feb 26 15:44:31 crc kubenswrapper[5033]: E0226 15:44:31.865984 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc" containerName="controller-manager" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.866010 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc" containerName="controller-manager" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.866195 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="97cb2dfd-bbbd-4e5c-a8b4-1d83d81618cc" containerName="controller-manager" Feb 26 
15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.866913 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.869231 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.869477 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.871115 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.871330 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.871346 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.871355 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.880661 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 15:44:31 crc kubenswrapper[5033]: I0226 15:44:31.885787 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b44cf8878-knd2r"] Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.003091 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnlpm\" (UniqueName: \"kubernetes.io/projected/67421045-630b-4996-bd38-c56b0a70cc23-kube-api-access-lnlpm\") pod 
\"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.003148 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-client-ca\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.003202 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67421045-630b-4996-bd38-c56b0a70cc23-serving-cert\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.003233 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-config\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.003286 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-proxy-ca-bundles\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.104270 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lnlpm\" (UniqueName: \"kubernetes.io/projected/67421045-630b-4996-bd38-c56b0a70cc23-kube-api-access-lnlpm\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.104324 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-client-ca\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.104374 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67421045-630b-4996-bd38-c56b0a70cc23-serving-cert\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.104395 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-config\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.104418 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-proxy-ca-bundles\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " 
pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.105405 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-proxy-ca-bundles\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.105594 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-config\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.105599 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-client-ca\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.111986 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67421045-630b-4996-bd38-c56b0a70cc23-serving-cert\") pod \"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.127694 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnlpm\" (UniqueName: \"kubernetes.io/projected/67421045-630b-4996-bd38-c56b0a70cc23-kube-api-access-lnlpm\") pod 
\"controller-manager-6b44cf8878-knd2r\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") " pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.129545 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.129596 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.162958 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.181194 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.181561 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7p958" podUID="a7b8c773-dedb-4ed9-818d-cd878376c734" containerName="registry-server" containerID="cri-o://4e7c3b34925bde1fc895c44cfdf8c6c0b09b4e88dee72ed1715cf52542edc31c" gracePeriod=2 Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.224682 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.224752 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.397311 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b44cf8878-knd2r"] Feb 26 15:44:32 crc kubenswrapper[5033]: W0226 15:44:32.404968 5033 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67421045_630b_4996_bd38_c56b0a70cc23.slice/crio-f20be4cf618dc9c5be021279a5321f977d3da80de8302b06d6b42cee079db7d4 WatchSource:0}: Error finding container f20be4cf618dc9c5be021279a5321f977d3da80de8302b06d6b42cee079db7d4: Status 404 returned error can't find the container with id f20be4cf618dc9c5be021279a5321f977d3da80de8302b06d6b42cee079db7d4 Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.738074 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.738152 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:44:32 crc kubenswrapper[5033]: I0226 15:44:32.818656 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:44:33 crc kubenswrapper[5033]: I0226 15:44:33.119426 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:44:33 crc kubenswrapper[5033]: I0226 15:44:33.119774 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:44:33 crc kubenswrapper[5033]: I0226 15:44:33.181904 5033 generic.go:334] "Generic (PLEG): container finished" podID="a7b8c773-dedb-4ed9-818d-cd878376c734" containerID="4e7c3b34925bde1fc895c44cfdf8c6c0b09b4e88dee72ed1715cf52542edc31c" exitCode=0 Feb 26 15:44:33 crc kubenswrapper[5033]: I0226 15:44:33.182003 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p958" event={"ID":"a7b8c773-dedb-4ed9-818d-cd878376c734","Type":"ContainerDied","Data":"4e7c3b34925bde1fc895c44cfdf8c6c0b09b4e88dee72ed1715cf52542edc31c"} Feb 26 15:44:33 crc kubenswrapper[5033]: I0226 15:44:33.183617 
5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" event={"ID":"67421045-630b-4996-bd38-c56b0a70cc23","Type":"ContainerStarted","Data":"f20be4cf618dc9c5be021279a5321f977d3da80de8302b06d6b42cee079db7d4"} Feb 26 15:44:33 crc kubenswrapper[5033]: I0226 15:44:33.185945 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:44:33 crc kubenswrapper[5033]: I0226 15:44:33.242488 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:44:33 crc kubenswrapper[5033]: I0226 15:44:33.981520 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7p958" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.132273 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-utilities\") pod \"a7b8c773-dedb-4ed9-818d-cd878376c734\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.132462 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-catalog-content\") pod \"a7b8c773-dedb-4ed9-818d-cd878376c734\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.132494 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7w4f\" (UniqueName: \"kubernetes.io/projected/a7b8c773-dedb-4ed9-818d-cd878376c734-kube-api-access-p7w4f\") pod \"a7b8c773-dedb-4ed9-818d-cd878376c734\" (UID: \"a7b8c773-dedb-4ed9-818d-cd878376c734\") " Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.133621 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-utilities" (OuterVolumeSpecName: "utilities") pod "a7b8c773-dedb-4ed9-818d-cd878376c734" (UID: "a7b8c773-dedb-4ed9-818d-cd878376c734"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.152495 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7b8c773-dedb-4ed9-818d-cd878376c734-kube-api-access-p7w4f" (OuterVolumeSpecName: "kube-api-access-p7w4f") pod "a7b8c773-dedb-4ed9-818d-cd878376c734" (UID: "a7b8c773-dedb-4ed9-818d-cd878376c734"). InnerVolumeSpecName "kube-api-access-p7w4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.190797 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" event={"ID":"67421045-630b-4996-bd38-c56b0a70cc23","Type":"ContainerStarted","Data":"d9df114a9c307aa0b6443203cababa979ef3876abd24b05976985615112a3185"} Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.191365 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.193975 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p958" event={"ID":"a7b8c773-dedb-4ed9-818d-cd878376c734","Type":"ContainerDied","Data":"b8fa44e5462868cb2c053391e527a773473170791d1128c1ba65def860ca54ce"} Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.194038 5033 scope.go:117] "RemoveContainer" containerID="4e7c3b34925bde1fc895c44cfdf8c6c0b09b4e88dee72ed1715cf52542edc31c" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.194206 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7p958" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.197985 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.212286 5033 scope.go:117] "RemoveContainer" containerID="0282c8d1eab9ab761b02d190189ca612debc50faa1eb54ef278cc5ae90eaf895" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.228355 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" podStartSLOduration=7.228337936 podStartE2EDuration="7.228337936s" podCreationTimestamp="2026-02-26 15:44:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:44:34.224477 +0000 UTC m=+201.437075700" watchObservedRunningTime="2026-02-26 15:44:34.228337936 +0000 UTC m=+201.440936586" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.233932 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.233975 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7w4f\" (UniqueName: \"kubernetes.io/projected/a7b8c773-dedb-4ed9-818d-cd878376c734-kube-api-access-p7w4f\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.238898 5033 scope.go:117] "RemoveContainer" containerID="d4f390be869abdfbcd429d577bb4e1c6a57f8e247d32f54c032f3b55a51a30fa" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.239028 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "a7b8c773-dedb-4ed9-818d-cd878376c734" (UID: "a7b8c773-dedb-4ed9-818d-cd878376c734"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.279694 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hpx7"] Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.280019 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5hpx7" podUID="922ca3ba-ef67-4e47-b111-e40f8733202b" containerName="registry-server" containerID="cri-o://b67bacad2af29529bfc219c625b1d6f93a65b7bc35f4913cd9cc423c3bb867a3" gracePeriod=2 Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.287154 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.335513 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7b8c773-dedb-4ed9-818d-cd878376c734-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.528676 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7p958"] Feb 26 15:44:34 crc kubenswrapper[5033]: I0226 15:44:34.539213 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7p958"] Feb 26 15:44:35 crc kubenswrapper[5033]: I0226 15:44:35.642927 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7b8c773-dedb-4ed9-818d-cd878376c734" path="/var/lib/kubelet/pods/a7b8c773-dedb-4ed9-818d-cd878376c734/volumes" Feb 26 15:44:36 crc kubenswrapper[5033]: I0226 15:44:36.213568 5033 generic.go:334] "Generic (PLEG): container finished" podID="922ca3ba-ef67-4e47-b111-e40f8733202b" 
containerID="b67bacad2af29529bfc219c625b1d6f93a65b7bc35f4913cd9cc423c3bb867a3" exitCode=0 Feb 26 15:44:36 crc kubenswrapper[5033]: I0226 15:44:36.213651 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hpx7" event={"ID":"922ca3ba-ef67-4e47-b111-e40f8733202b","Type":"ContainerDied","Data":"b67bacad2af29529bfc219c625b1d6f93a65b7bc35f4913cd9cc423c3bb867a3"} Feb 26 15:44:36 crc kubenswrapper[5033]: I0226 15:44:36.664617 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rqwqh"] Feb 26 15:44:36 crc kubenswrapper[5033]: I0226 15:44:36.665251 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rqwqh" podUID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" containerName="registry-server" containerID="cri-o://75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8" gracePeriod=2 Feb 26 15:44:36 crc kubenswrapper[5033]: I0226 15:44:36.875503 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:44:36 crc kubenswrapper[5033]: I0226 15:44:36.975541 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b42nk\" (UniqueName: \"kubernetes.io/projected/922ca3ba-ef67-4e47-b111-e40f8733202b-kube-api-access-b42nk\") pod \"922ca3ba-ef67-4e47-b111-e40f8733202b\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " Feb 26 15:44:36 crc kubenswrapper[5033]: I0226 15:44:36.975652 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-catalog-content\") pod \"922ca3ba-ef67-4e47-b111-e40f8733202b\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " Feb 26 15:44:36 crc kubenswrapper[5033]: I0226 15:44:36.975775 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-utilities\") pod \"922ca3ba-ef67-4e47-b111-e40f8733202b\" (UID: \"922ca3ba-ef67-4e47-b111-e40f8733202b\") " Feb 26 15:44:36 crc kubenswrapper[5033]: I0226 15:44:36.977032 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-utilities" (OuterVolumeSpecName: "utilities") pod "922ca3ba-ef67-4e47-b111-e40f8733202b" (UID: "922ca3ba-ef67-4e47-b111-e40f8733202b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:36 crc kubenswrapper[5033]: I0226 15:44:36.996384 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/922ca3ba-ef67-4e47-b111-e40f8733202b-kube-api-access-b42nk" (OuterVolumeSpecName: "kube-api-access-b42nk") pod "922ca3ba-ef67-4e47-b111-e40f8733202b" (UID: "922ca3ba-ef67-4e47-b111-e40f8733202b"). InnerVolumeSpecName "kube-api-access-b42nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.005844 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "922ca3ba-ef67-4e47-b111-e40f8733202b" (UID: "922ca3ba-ef67-4e47-b111-e40f8733202b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.077686 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.077737 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b42nk\" (UniqueName: \"kubernetes.io/projected/922ca3ba-ef67-4e47-b111-e40f8733202b-kube-api-access-b42nk\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.077753 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/922ca3ba-ef67-4e47-b111-e40f8733202b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.226359 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hpx7" event={"ID":"922ca3ba-ef67-4e47-b111-e40f8733202b","Type":"ContainerDied","Data":"ef38582c410a86fbf1913a14330c72ccb701f142f45c3398a97ca7c56ae7d94e"} Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.226414 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hpx7" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.226425 5033 scope.go:117] "RemoveContainer" containerID="b67bacad2af29529bfc219c625b1d6f93a65b7bc35f4913cd9cc423c3bb867a3" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.286370 5033 scope.go:117] "RemoveContainer" containerID="80bcd727965a725d37b4034d3a3091fc645b04b826954a2f06c80b546fd7ed83" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.294898 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hpx7"] Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.304161 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hpx7"] Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.316807 5033 scope.go:117] "RemoveContainer" containerID="c6d3e399170cd23aac45baed035c2b59829baadec3740b7a5a55036d834cd0a0" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.637416 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="922ca3ba-ef67-4e47-b111-e40f8733202b" path="/var/lib/kubelet/pods/922ca3ba-ef67-4e47-b111-e40f8733202b/volumes" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.906892 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.992755 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbk2p\" (UniqueName: \"kubernetes.io/projected/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-kube-api-access-jbk2p\") pod \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.992974 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-catalog-content\") pod \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.993059 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-utilities\") pod \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\" (UID: \"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08\") " Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.994364 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-utilities" (OuterVolumeSpecName: "utilities") pod "2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" (UID: "2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:37 crc kubenswrapper[5033]: I0226 15:44:37.997558 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-kube-api-access-jbk2p" (OuterVolumeSpecName: "kube-api-access-jbk2p") pod "2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" (UID: "2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08"). InnerVolumeSpecName "kube-api-access-jbk2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.095078 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.095143 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbk2p\" (UniqueName: \"kubernetes.io/projected/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-kube-api-access-jbk2p\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.146756 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" (UID: "2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.196748 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.236180 5033 generic.go:334] "Generic (PLEG): container finished" podID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" containerID="75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8" exitCode=0 Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.236291 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqwqh" event={"ID":"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08","Type":"ContainerDied","Data":"75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8"} Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.236370 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-rqwqh" event={"ID":"2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08","Type":"ContainerDied","Data":"dc36ff8f8f50443547b4ee6b70b5a6f2058e72f564301b5bba43703f0e6693be"} Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.236394 5033 scope.go:117] "RemoveContainer" containerID="75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8" Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.236431 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rqwqh" Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.255278 5033 scope.go:117] "RemoveContainer" containerID="fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a" Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.283557 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rqwqh"] Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.286824 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rqwqh"] Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.292048 5033 scope.go:117] "RemoveContainer" containerID="0b7c0031c47e8e8f211a23fac776681038962d0f7297bec415d73db918442bf9" Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.314237 5033 scope.go:117] "RemoveContainer" containerID="75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8" Feb 26 15:44:38 crc kubenswrapper[5033]: E0226 15:44:38.315015 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8\": container with ID starting with 75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8 not found: ID does not exist" containerID="75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8" Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.315145 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8"} err="failed to get container status \"75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8\": rpc error: code = NotFound desc = could not find container \"75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8\": container with ID starting with 75d67650ab8e8cd30fb385a8d3e378a33c27724f61e8ffaf048729fa9947cbd8 not found: ID does not exist"
Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.315299 5033 scope.go:117] "RemoveContainer" containerID="fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a"
Feb 26 15:44:38 crc kubenswrapper[5033]: E0226 15:44:38.316045 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a\": container with ID starting with fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a not found: ID does not exist" containerID="fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a"
Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.316107 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a"} err="failed to get container status \"fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a\": rpc error: code = NotFound desc = could not find container \"fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a\": container with ID starting with fcb4ca86b2b36b480df342031eb068e9d65b666c43f1a47451058a31fb9dae5a not found: ID does not exist"
Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.316162 5033 scope.go:117] "RemoveContainer" containerID="0b7c0031c47e8e8f211a23fac776681038962d0f7297bec415d73db918442bf9"
Feb 26 15:44:38 crc kubenswrapper[5033]: E0226 15:44:38.316571 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b7c0031c47e8e8f211a23fac776681038962d0f7297bec415d73db918442bf9\": container with ID starting with 0b7c0031c47e8e8f211a23fac776681038962d0f7297bec415d73db918442bf9 not found: ID does not exist" containerID="0b7c0031c47e8e8f211a23fac776681038962d0f7297bec415d73db918442bf9"
Feb 26 15:44:38 crc kubenswrapper[5033]: I0226 15:44:38.316606 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b7c0031c47e8e8f211a23fac776681038962d0f7297bec415d73db918442bf9"} err="failed to get container status \"0b7c0031c47e8e8f211a23fac776681038962d0f7297bec415d73db918442bf9\": rpc error: code = NotFound desc = could not find container \"0b7c0031c47e8e8f211a23fac776681038962d0f7297bec415d73db918442bf9\": container with ID starting with 0b7c0031c47e8e8f211a23fac776681038962d0f7297bec415d73db918442bf9 not found: ID does not exist"
Feb 26 15:44:39 crc kubenswrapper[5033]: I0226 15:44:39.635973 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" path="/var/lib/kubelet/pods/2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08/volumes"
Feb 26 15:44:47 crc kubenswrapper[5033]: I0226 15:44:47.689051 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" podUID="09c96f57-a56c-4519-8101-b3aa96065974" containerName="oauth-openshift" containerID="cri-o://82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a" gracePeriod=15
Feb 26 15:44:47 crc kubenswrapper[5033]: I0226 15:44:47.740855 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b44cf8878-knd2r"]
Feb 26 15:44:47 crc kubenswrapper[5033]: I0226 15:44:47.741161 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" podUID="67421045-630b-4996-bd38-c56b0a70cc23" containerName="controller-manager" containerID="cri-o://d9df114a9c307aa0b6443203cababa979ef3876abd24b05976985615112a3185" gracePeriod=30
Feb 26 15:44:47 crc kubenswrapper[5033]: I0226 15:44:47.844041 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g"]
Feb 26 15:44:47 crc kubenswrapper[5033]: I0226 15:44:47.844342 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" podUID="50cd52c1-dc88-41e1-93d5-29370876fec6" containerName="route-controller-manager" containerID="cri-o://43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c" gracePeriod=30
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.225420 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.262065 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.289343 5033 generic.go:334] "Generic (PLEG): container finished" podID="67421045-630b-4996-bd38-c56b0a70cc23" containerID="d9df114a9c307aa0b6443203cababa979ef3876abd24b05976985615112a3185" exitCode=0
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.289419 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" event={"ID":"67421045-630b-4996-bd38-c56b0a70cc23","Type":"ContainerDied","Data":"d9df114a9c307aa0b6443203cababa979ef3876abd24b05976985615112a3185"}
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.290563 5033 generic.go:334] "Generic (PLEG): container finished" podID="09c96f57-a56c-4519-8101-b3aa96065974" containerID="82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a" exitCode=0
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.290632 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" event={"ID":"09c96f57-a56c-4519-8101-b3aa96065974","Type":"ContainerDied","Data":"82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a"}
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.290652 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.290688 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jdtrx" event={"ID":"09c96f57-a56c-4519-8101-b3aa96065974","Type":"ContainerDied","Data":"05ad84ce3c72b8bf9a02e0caed86a6405657fb180f477c83ed8dd15b897f2565"}
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.290714 5033 scope.go:117] "RemoveContainer" containerID="82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.291824 5033 generic.go:334] "Generic (PLEG): container finished" podID="50cd52c1-dc88-41e1-93d5-29370876fec6" containerID="43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c" exitCode=0
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.291864 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" event={"ID":"50cd52c1-dc88-41e1-93d5-29370876fec6","Type":"ContainerDied","Data":"43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c"}
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.291886 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g" event={"ID":"50cd52c1-dc88-41e1-93d5-29370876fec6","Type":"ContainerDied","Data":"3ba8dea62183ed0ee154375746946fdef5d1f397930b45b89dbeea35ef3d9398"}
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.291930 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.310659 5033 scope.go:117] "RemoveContainer" containerID="82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a"
Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.311199 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a\": container with ID starting with 82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a not found: ID does not exist" containerID="82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.311265 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a"} err="failed to get container status \"82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a\": rpc error: code = NotFound desc = could not find container \"82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a\": container with ID starting with 82034f61fb168a5e0c25dcb01d43d7d52d6eeca861136d25e7368b17b0bad10a not found: ID does not exist"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.311297 5033 scope.go:117] "RemoveContainer" containerID="43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.330922 5033 scope.go:117] "RemoveContainer" containerID="43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c"
Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.332193 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c\": container with ID starting with 43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c not found: ID does not exist" containerID="43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.332236 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c"} err="failed to get container status \"43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c\": rpc error: code = NotFound desc = could not find container \"43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c\": container with ID starting with 43abf9e3e7362faa571bcc587ef49be0589833441ee6b32d75e766c2d271228c not found: ID does not exist"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.342038 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386018 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-serving-cert\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386066 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-login\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386087 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-provider-selection\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386116 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-client-ca\") pod \"50cd52c1-dc88-41e1-93d5-29370876fec6\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386136 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-295gw\" (UniqueName: \"kubernetes.io/projected/50cd52c1-dc88-41e1-93d5-29370876fec6-kube-api-access-295gw\") pod \"50cd52c1-dc88-41e1-93d5-29370876fec6\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386165 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-ocp-branding-template\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386184 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-config\") pod \"50cd52c1-dc88-41e1-93d5-29370876fec6\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386203 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09c96f57-a56c-4519-8101-b3aa96065974-audit-dir\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386227 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-error\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386280 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k49w\" (UniqueName: \"kubernetes.io/projected/09c96f57-a56c-4519-8101-b3aa96065974-kube-api-access-7k49w\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386309 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-cliconfig\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386326 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-trusted-ca-bundle\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386362 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-router-certs\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386377 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-audit-policies\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386396 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-session\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386419 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-service-ca\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386452 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50cd52c1-dc88-41e1-93d5-29370876fec6-serving-cert\") pod \"50cd52c1-dc88-41e1-93d5-29370876fec6\" (UID: \"50cd52c1-dc88-41e1-93d5-29370876fec6\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386473 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-idp-0-file-data\") pod \"09c96f57-a56c-4519-8101-b3aa96065974\" (UID: \"09c96f57-a56c-4519-8101-b3aa96065974\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.386868 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-client-ca" (OuterVolumeSpecName: "client-ca") pod "50cd52c1-dc88-41e1-93d5-29370876fec6" (UID: "50cd52c1-dc88-41e1-93d5-29370876fec6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.387222 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-config" (OuterVolumeSpecName: "config") pod "50cd52c1-dc88-41e1-93d5-29370876fec6" (UID: "50cd52c1-dc88-41e1-93d5-29370876fec6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.387328 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.387378 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09c96f57-a56c-4519-8101-b3aa96065974-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.387726 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.387985 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.391715 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.391870 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.392761 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.393604 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.395512 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50cd52c1-dc88-41e1-93d5-29370876fec6-kube-api-access-295gw" (OuterVolumeSpecName: "kube-api-access-295gw") pod "50cd52c1-dc88-41e1-93d5-29370876fec6" (UID: "50cd52c1-dc88-41e1-93d5-29370876fec6"). InnerVolumeSpecName "kube-api-access-295gw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.396236 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50cd52c1-dc88-41e1-93d5-29370876fec6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "50cd52c1-dc88-41e1-93d5-29370876fec6" (UID: "50cd52c1-dc88-41e1-93d5-29370876fec6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.398216 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.398692 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.399825 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.399888 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09c96f57-a56c-4519-8101-b3aa96065974-kube-api-access-7k49w" (OuterVolumeSpecName: "kube-api-access-7k49w") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "kube-api-access-7k49w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.399970 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.404388 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "09c96f57-a56c-4519-8101-b3aa96065974" (UID: "09c96f57-a56c-4519-8101-b3aa96065974"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.487562 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-client-ca\") pod \"67421045-630b-4996-bd38-c56b0a70cc23\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.487640 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnlpm\" (UniqueName: \"kubernetes.io/projected/67421045-630b-4996-bd38-c56b0a70cc23-kube-api-access-lnlpm\") pod \"67421045-630b-4996-bd38-c56b0a70cc23\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.487677 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67421045-630b-4996-bd38-c56b0a70cc23-serving-cert\") pod \"67421045-630b-4996-bd38-c56b0a70cc23\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.487710 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-config\") pod \"67421045-630b-4996-bd38-c56b0a70cc23\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.487796 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-proxy-ca-bundles\") pod \"67421045-630b-4996-bd38-c56b0a70cc23\" (UID: \"67421045-630b-4996-bd38-c56b0a70cc23\") "
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.488619 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "67421045-630b-4996-bd38-c56b0a70cc23" (UID: "67421045-630b-4996-bd38-c56b0a70cc23"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.488667 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-config" (OuterVolumeSpecName: "config") pod "67421045-630b-4996-bd38-c56b0a70cc23" (UID: "67421045-630b-4996-bd38-c56b0a70cc23"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.488700 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-client-ca" (OuterVolumeSpecName: "client-ca") pod "67421045-630b-4996-bd38-c56b0a70cc23" (UID: "67421045-630b-4996-bd38-c56b0a70cc23"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.488926 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.488952 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.488966 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-config\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.488980 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50cd52c1-dc88-41e1-93d5-29370876fec6-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.488991 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489002 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489013 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489025 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489035 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489045 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-295gw\" (UniqueName: \"kubernetes.io/projected/50cd52c1-dc88-41e1-93d5-29370876fec6-kube-api-access-295gw\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489054 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489066 5033 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/09c96f57-a56c-4519-8101-b3aa96065974-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489076 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50cd52c1-dc88-41e1-93d5-29370876fec6-config\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489085 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489095 5033 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489105 5033 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67421045-630b-4996-bd38-c56b0a70cc23-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489114 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k49w\" (UniqueName: \"kubernetes.io/projected/09c96f57-a56c-4519-8101-b3aa96065974-kube-api-access-7k49w\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489124 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489133 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489143 5033 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/09c96f57-a56c-4519-8101-b3aa96065974-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.489154 5033 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09c96f57-a56c-4519-8101-b3aa96065974-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.493815 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67421045-630b-4996-bd38-c56b0a70cc23-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67421045-630b-4996-bd38-c56b0a70cc23" (UID: "67421045-630b-4996-bd38-c56b0a70cc23"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.493861 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67421045-630b-4996-bd38-c56b0a70cc23-kube-api-access-lnlpm" (OuterVolumeSpecName: "kube-api-access-lnlpm") pod "67421045-630b-4996-bd38-c56b0a70cc23" (UID: "67421045-630b-4996-bd38-c56b0a70cc23"). InnerVolumeSpecName "kube-api-access-lnlpm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.590102 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnlpm\" (UniqueName: \"kubernetes.io/projected/67421045-630b-4996-bd38-c56b0a70cc23-kube-api-access-lnlpm\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.590135 5033 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67421045-630b-4996-bd38-c56b0a70cc23-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.647002 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jdtrx"]
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.652319 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jdtrx"]
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.664802 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g"]
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.680041 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6755df687b-48s5g"]
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886168 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c6cc84444-h9ppw"]
Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886491 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" containerName="extract-utilities"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886514 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" containerName="extract-utilities"
Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886532 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50cd52c1-dc88-41e1-93d5-29370876fec6" containerName="route-controller-manager"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886545 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="50cd52c1-dc88-41e1-93d5-29370876fec6" containerName="route-controller-manager"
Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886564 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67421045-630b-4996-bd38-c56b0a70cc23" containerName="controller-manager"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886576 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="67421045-630b-4996-bd38-c56b0a70cc23" containerName="controller-manager"
Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886593 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="922ca3ba-ef67-4e47-b111-e40f8733202b" containerName="extract-utilities"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886605 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="922ca3ba-ef67-4e47-b111-e40f8733202b" containerName="extract-utilities"
Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886624 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="922ca3ba-ef67-4e47-b111-e40f8733202b" containerName="registry-server"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886636 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="922ca3ba-ef67-4e47-b111-e40f8733202b" containerName="registry-server"
Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886654 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7b8c773-dedb-4ed9-818d-cd878376c734" containerName="registry-server"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886665 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7b8c773-dedb-4ed9-818d-cd878376c734" containerName="registry-server"
Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886681 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" containerName="registry-server"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886693 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" containerName="registry-server"
Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886713 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" containerName="extract-content"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886728 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" containerName="extract-content"
Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886750 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7b8c773-dedb-4ed9-818d-cd878376c734" containerName="extract-utilities"
Feb 26 15:44:48 crc kubenswrapper[5033]: I0226
15:44:48.886762 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7b8c773-dedb-4ed9-818d-cd878376c734" containerName="extract-utilities" Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886778 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7b8c773-dedb-4ed9-818d-cd878376c734" containerName="extract-content" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886790 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7b8c773-dedb-4ed9-818d-cd878376c734" containerName="extract-content" Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886802 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="922ca3ba-ef67-4e47-b111-e40f8733202b" containerName="extract-content" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886815 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="922ca3ba-ef67-4e47-b111-e40f8733202b" containerName="extract-content" Feb 26 15:44:48 crc kubenswrapper[5033]: E0226 15:44:48.886835 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09c96f57-a56c-4519-8101-b3aa96065974" containerName="oauth-openshift" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886847 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="09c96f57-a56c-4519-8101-b3aa96065974" containerName="oauth-openshift" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.886993 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7b8c773-dedb-4ed9-818d-cd878376c734" containerName="registry-server" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.887013 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="922ca3ba-ef67-4e47-b111-e40f8733202b" containerName="registry-server" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.887028 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="67421045-630b-4996-bd38-c56b0a70cc23" containerName="controller-manager" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 
15:44:48.887050 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="50cd52c1-dc88-41e1-93d5-29370876fec6" containerName="route-controller-manager" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.887064 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="09c96f57-a56c-4519-8101-b3aa96065974" containerName="oauth-openshift" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.887082 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c5ecfdf-d6fe-4e68-ab3a-3664998f6e08" containerName="registry-server" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.887651 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.898161 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59"] Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.899529 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.901836 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.902340 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.902611 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.902899 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.904529 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.905890 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.906921 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c6cc84444-h9ppw"] Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.910936 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59"] Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.995301 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-serving-cert\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: 
\"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.995380 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d30e130-b0eb-4df2-880f-fd5ab3da637f-proxy-ca-bundles\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.995404 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d30e130-b0eb-4df2-880f-fd5ab3da637f-config\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.995442 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d30e130-b0eb-4df2-880f-fd5ab3da637f-client-ca\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.995458 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-286wt\" (UniqueName: \"kubernetes.io/projected/5d30e130-b0eb-4df2-880f-fd5ab3da637f-kube-api-access-286wt\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.995480 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-client-ca\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: \"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.995495 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-config\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: \"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.995518 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7vl\" (UniqueName: \"kubernetes.io/projected/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-kube-api-access-fb7vl\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: \"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:48 crc kubenswrapper[5033]: I0226 15:44:48.995547 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d30e130-b0eb-4df2-880f-fd5ab3da637f-serving-cert\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.097039 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d30e130-b0eb-4df2-880f-fd5ab3da637f-client-ca\") pod \"controller-manager-6c6cc84444-h9ppw\" 
(UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.097109 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-286wt\" (UniqueName: \"kubernetes.io/projected/5d30e130-b0eb-4df2-880f-fd5ab3da637f-kube-api-access-286wt\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.097183 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-client-ca\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: \"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.097219 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-config\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: \"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.097278 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7vl\" (UniqueName: \"kubernetes.io/projected/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-kube-api-access-fb7vl\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: \"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.097338 5033 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d30e130-b0eb-4df2-880f-fd5ab3da637f-serving-cert\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.097362 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-serving-cert\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: \"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.097403 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d30e130-b0eb-4df2-880f-fd5ab3da637f-proxy-ca-bundles\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.097435 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d30e130-b0eb-4df2-880f-fd5ab3da637f-config\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.098499 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-config\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: \"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:49 crc 
kubenswrapper[5033]: I0226 15:44:49.098882 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-client-ca\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: \"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.099675 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d30e130-b0eb-4df2-880f-fd5ab3da637f-config\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.100149 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d30e130-b0eb-4df2-880f-fd5ab3da637f-client-ca\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.100866 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d30e130-b0eb-4df2-880f-fd5ab3da637f-proxy-ca-bundles\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.102567 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-serving-cert\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: \"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " 
pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.104206 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d30e130-b0eb-4df2-880f-fd5ab3da637f-serving-cert\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.119276 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7vl\" (UniqueName: \"kubernetes.io/projected/fbfbc7ba-fcee-49a0-a268-ae8d5e78042b-kube-api-access-fb7vl\") pod \"route-controller-manager-764b7c7b69-f4n59\" (UID: \"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b\") " pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.121273 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-286wt\" (UniqueName: \"kubernetes.io/projected/5d30e130-b0eb-4df2-880f-fd5ab3da637f-kube-api-access-286wt\") pod \"controller-manager-6c6cc84444-h9ppw\" (UID: \"5d30e130-b0eb-4df2-880f-fd5ab3da637f\") " pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.241904 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.247996 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.308333 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.308340 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b44cf8878-knd2r" event={"ID":"67421045-630b-4996-bd38-c56b0a70cc23","Type":"ContainerDied","Data":"f20be4cf618dc9c5be021279a5321f977d3da80de8302b06d6b42cee079db7d4"} Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.308409 5033 scope.go:117] "RemoveContainer" containerID="d9df114a9c307aa0b6443203cababa979ef3876abd24b05976985615112a3185" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.333343 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b44cf8878-knd2r"] Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.336109 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6b44cf8878-knd2r"] Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.557334 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59"] Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.633657 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09c96f57-a56c-4519-8101-b3aa96065974" path="/var/lib/kubelet/pods/09c96f57-a56c-4519-8101-b3aa96065974/volumes" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.634414 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50cd52c1-dc88-41e1-93d5-29370876fec6" path="/var/lib/kubelet/pods/50cd52c1-dc88-41e1-93d5-29370876fec6/volumes" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 15:44:49.634917 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67421045-630b-4996-bd38-c56b0a70cc23" path="/var/lib/kubelet/pods/67421045-630b-4996-bd38-c56b0a70cc23/volumes" Feb 26 15:44:49 crc kubenswrapper[5033]: I0226 
15:44:49.691534 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c6cc84444-h9ppw"] Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.189368 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.189452 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.318335 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" event={"ID":"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b","Type":"ContainerStarted","Data":"bbe089eb109393dd4749174303ef050284888f1c774a7f9f4e96be43a4420e0a"} Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.318378 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" event={"ID":"fbfbc7ba-fcee-49a0-a268-ae8d5e78042b","Type":"ContainerStarted","Data":"a9e1392710b2107ccd28ecdcd0fa0c1df8852cda4d9e1d71f0881c849156d877"} Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.319234 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.320670 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" 
event={"ID":"5d30e130-b0eb-4df2-880f-fd5ab3da637f","Type":"ContainerStarted","Data":"2bcaea5d88def3ba9f7817afecc0991b635a016bc55f491db18193178d902fbb"} Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.320697 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" event={"ID":"5d30e130-b0eb-4df2-880f-fd5ab3da637f","Type":"ContainerStarted","Data":"9fae6a28b49c3a5c98de76733d01d63ce689a9defd452452ac0fd94c3de7b0f3"} Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.321140 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.325109 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.325559 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.335430 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-764b7c7b69-f4n59" podStartSLOduration=3.335418978 podStartE2EDuration="3.335418978s" podCreationTimestamp="2026-02-26 15:44:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:44:50.331163042 +0000 UTC m=+217.543761692" watchObservedRunningTime="2026-02-26 15:44:50.335418978 +0000 UTC m=+217.548017618" Feb 26 15:44:50 crc kubenswrapper[5033]: I0226 15:44:50.349032 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c6cc84444-h9ppw" podStartSLOduration=3.349012 podStartE2EDuration="3.349012s" 
podCreationTimestamp="2026-02-26 15:44:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:44:50.345895595 +0000 UTC m=+217.558494245" watchObservedRunningTime="2026-02-26 15:44:50.349012 +0000 UTC m=+217.561610640" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.888676 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-849dbf65f-dsj5m"] Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.893423 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.897870 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.898100 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.898467 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.898714 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.898903 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.901886 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-849dbf65f-dsj5m"] Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.914590 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" 
Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.915355 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.915834 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.929367 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.931630 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.931650 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.932433 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.932904 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.970510 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 26 15:44:53 crc kubenswrapper[5033]: I0226 15:44:53.981524 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.019274 5033 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.020049 5033 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.026799 5033 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.026869 5033 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.027153 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027174 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.027192 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027202 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.027213 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027220 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.027228 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027235 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.027249 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027275 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.027285 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027294 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.027302 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027309 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.027321 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027328 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.027337 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc 
kubenswrapper[5033]: I0226 15:44:54.027344 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027445 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027457 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027464 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027476 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027483 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027491 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027500 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.027603 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027611 5033 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027694 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.027703 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.028598 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b" gracePeriod=15 Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.028769 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2da09d43f341fda6117b35af9902a0dadd423ab611361a601e53c33f7c424abf" gracePeriod=15 Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.028862 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203" gracePeriod=15 Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.028879 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413" gracePeriod=15 Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.028786 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e" gracePeriod=15 Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.054168 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075020 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-audit-policies\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075072 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075091 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075110 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-session\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075129 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-service-ca\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075145 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075165 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-router-certs\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075182 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqnns\" (UniqueName: \"kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns\") pod 
\"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075199 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075216 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075229 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075247 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075284 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/647f4597-a6fb-472b-8420-15d6f2d802ba-audit-dir\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075299 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075326 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075344 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075364 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075385 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-template-login\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075403 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075423 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-template-error\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.075438 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc 
kubenswrapper[5033]: I0226 15:44:54.075465 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.176378 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.176515 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.176600 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-session\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.176517 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 
15:44:54.176729 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-service-ca\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.176813 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.176890 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-router-certs\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.176956 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqnns\" (UniqueName: \"kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.177025 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") 
" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.177098 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.177166 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.177234 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.177318 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/647f4597-a6fb-472b-8420-15d6f2d802ba-audit-dir\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.177384 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.177472 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.177541 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.177614 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.177702 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-template-login\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc 
kubenswrapper[5033]: I0226 15:44:54.177798 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.177909 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-template-error\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.179198 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.179309 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.176550 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.178525 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.178562 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.178858 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.179147 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.179285 5033 projected.go:194] Error preparing data for projected volume kube-api-access-xqnns for pod openshift-authentication/oauth-openshift-849dbf65f-dsj5m: failed to fetch token: Post 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.179429 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns podName:647f4597-a6fb-472b-8420-15d6f2d802ba nodeName:}" failed. No retries permitted until 2026-02-26 15:44:54.679411437 +0000 UTC m=+221.892010077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xqnns" (UniqueName: "kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns") pod "oauth-openshift-849dbf65f-dsj5m" (UID: "647f4597-a6fb-472b-8420-15d6f2d802ba") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.178446 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/647f4597-a6fb-472b-8420-15d6f2d802ba-audit-dir\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.179697 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.179797 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.179831 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{oauth-openshift-849dbf65f-dsj5m.1897d661dbae9ef5 openshift-authentication 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-authentication,Name:oauth-openshift-849dbf65f-dsj5m,UID:647f4597-a6fb-472b-8420-15d6f2d802ba,APIVersion:v1,ResourceVersion:29923,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-xqnns\" : failed to fetch token: Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token\": dial tcp 38.102.83.89:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:44:54.179405557 +0000 UTC m=+221.392004207,LastTimestamp:2026-02-26 15:44:54.179405557 +0000 UTC m=+221.392004207,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.179838 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.178530 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.179933 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-audit-policies\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.179842 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-service-ca\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.181036 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/647f4597-a6fb-472b-8420-15d6f2d802ba-audit-policies\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.182509 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc 
kubenswrapper[5033]: I0226 15:44:54.182838 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-template-error\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.182862 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.183403 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.183923 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.184247 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-session\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.184972 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-user-template-login\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.185900 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/647f4597-a6fb-472b-8420-15d6f2d802ba-v4-0-config-system-router-certs\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.349144 5033 generic.go:334] "Generic (PLEG): container finished" podID="0317728e-d238-4fbd-99e3-a75fceeda178" containerID="2717ed8f1c68b7b302631d38799a5aafae8db3b4e39beda21cee8d0094938462" exitCode=0 Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.349233 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0317728e-d238-4fbd-99e3-a75fceeda178","Type":"ContainerDied","Data":"2717ed8f1c68b7b302631d38799a5aafae8db3b4e39beda21cee8d0094938462"} Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.349940 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.350208 5033 status_manager.go:851] "Failed to get status for pod" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.350566 5033 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.351465 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.351638 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.354238 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.355022 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2da09d43f341fda6117b35af9902a0dadd423ab611361a601e53c33f7c424abf" exitCode=0 Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.355153 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e" exitCode=0 Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.355241 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413" exitCode=0 Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.355340 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203" exitCode=2 Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.355153 5033 scope.go:117] "RemoveContainer" containerID="3145674809d943dc0d6a08074fbb0a39660b91c008a9ec59775c85c1c8cff87b" Feb 26 15:44:54 crc kubenswrapper[5033]: W0226 15:44:54.391110 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-e9649f7e8398d4a2e15c7b7afdda9d6de3f3eb639aaed14a5a07858ce439c4c7 WatchSource:0}: Error finding container e9649f7e8398d4a2e15c7b7afdda9d6de3f3eb639aaed14a5a07858ce439c4c7: Status 404 returned error can't find the container with id e9649f7e8398d4a2e15c7b7afdda9d6de3f3eb639aaed14a5a07858ce439c4c7 Feb 26 15:44:54 crc kubenswrapper[5033]: I0226 15:44:54.688468 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqnns\" (UniqueName: \"kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.689493 5033 projected.go:194] Error preparing data for projected volume kube-api-access-xqnns for pod openshift-authentication/oauth-openshift-849dbf65f-dsj5m: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:44:54 crc kubenswrapper[5033]: E0226 15:44:54.689623 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns podName:647f4597-a6fb-472b-8420-15d6f2d802ba nodeName:}" failed. No retries permitted until 2026-02-26 15:44:55.689581657 +0000 UTC m=+222.902180347 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xqnns" (UniqueName: "kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns") pod "oauth-openshift-849dbf65f-dsj5m" (UID: "647f4597-a6fb-472b-8420-15d6f2d802ba") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.367074 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.372040 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270"} Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.372106 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e9649f7e8398d4a2e15c7b7afdda9d6de3f3eb639aaed14a5a07858ce439c4c7"} Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.373072 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.373443 5033 status_manager.go:851] "Failed to get status for pod" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.373964 5033 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.706924 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqnns\" (UniqueName: \"kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:55 crc kubenswrapper[5033]: E0226 15:44:55.708064 5033 projected.go:194] Error preparing data for projected volume kube-api-access-xqnns for pod openshift-authentication/oauth-openshift-849dbf65f-dsj5m: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:44:55 crc kubenswrapper[5033]: E0226 15:44:55.708198 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns podName:647f4597-a6fb-472b-8420-15d6f2d802ba nodeName:}" failed. No retries permitted until 2026-02-26 15:44:57.708156557 +0000 UTC m=+224.920755327 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xqnns" (UniqueName: "kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns") pod "oauth-openshift-849dbf65f-dsj5m" (UID: "647f4597-a6fb-472b-8420-15d6f2d802ba") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.766085 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.767226 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.767827 5033 status_manager.go:851] "Failed to get status for pod" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.808520 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-var-lock\") pod \"0317728e-d238-4fbd-99e3-a75fceeda178\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.808605 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-kubelet-dir\") pod \"0317728e-d238-4fbd-99e3-a75fceeda178\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.808620 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-var-lock" (OuterVolumeSpecName: "var-lock") pod "0317728e-d238-4fbd-99e3-a75fceeda178" (UID: "0317728e-d238-4fbd-99e3-a75fceeda178"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.808717 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0317728e-d238-4fbd-99e3-a75fceeda178-kube-api-access\") pod \"0317728e-d238-4fbd-99e3-a75fceeda178\" (UID: \"0317728e-d238-4fbd-99e3-a75fceeda178\") " Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.808814 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0317728e-d238-4fbd-99e3-a75fceeda178" (UID: "0317728e-d238-4fbd-99e3-a75fceeda178"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.809153 5033 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-var-lock\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.809205 5033 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0317728e-d238-4fbd-99e3-a75fceeda178-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.818851 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0317728e-d238-4fbd-99e3-a75fceeda178-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0317728e-d238-4fbd-99e3-a75fceeda178" (UID: "0317728e-d238-4fbd-99e3-a75fceeda178"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:55 crc kubenswrapper[5033]: I0226 15:44:55.911609 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0317728e-d238-4fbd-99e3-a75fceeda178-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.378593 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0317728e-d238-4fbd-99e3-a75fceeda178","Type":"ContainerDied","Data":"e2004237aa74f5fb92001c2ccdb47b911a0f743fc2b5c44c0bb62aeac82c30c2"} Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.378641 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2004237aa74f5fb92001c2ccdb47b911a0f743fc2b5c44c0bb62aeac82c30c2" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.378717 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.385133 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.386018 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b" exitCode=0 Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.386112 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bb9f317ffdd541de512565357dcdc243e337028401d9036e2451b13839198ce" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.394424 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.395555 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.396377 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.396816 5033 status_manager.go:851] "Failed to get status for pod" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.397450 5033 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.397785 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.397981 5033 status_manager.go:851] "Failed to get status for pod" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.398180 5033 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.418762 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.418802 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.418892 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.419079 5033 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.418933 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.520431 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.520572 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.520874 5033 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:56 crc kubenswrapper[5033]: I0226 15:44:56.520907 5033 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:57 crc kubenswrapper[5033]: I0226 15:44:57.394875 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:57 crc kubenswrapper[5033]: I0226 15:44:57.417359 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:57 crc kubenswrapper[5033]: I0226 15:44:57.418117 5033 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:57 crc kubenswrapper[5033]: I0226 15:44:57.418578 5033 status_manager.go:851] "Failed to get status for pod" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:44:57 crc kubenswrapper[5033]: I0226 
15:44:57.639109 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 26 15:44:57 crc kubenswrapper[5033]: I0226 15:44:57.740435 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqnns\" (UniqueName: \"kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:44:57 crc kubenswrapper[5033]: E0226 15:44:57.741553 5033 projected.go:194] Error preparing data for projected volume kube-api-access-xqnns for pod openshift-authentication/oauth-openshift-849dbf65f-dsj5m: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:44:57 crc kubenswrapper[5033]: E0226 15:44:57.741668 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns podName:647f4597-a6fb-472b-8420-15d6f2d802ba nodeName:}" failed. No retries permitted until 2026-02-26 15:45:01.741637559 +0000 UTC m=+228.954236249 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xqnns" (UniqueName: "kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns") pod "oauth-openshift-849dbf65f-dsj5m" (UID: "647f4597-a6fb-472b-8420-15d6f2d802ba") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:45:01 crc kubenswrapper[5033]: I0226 15:45:01.802663 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqnns\" (UniqueName: \"kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:45:01 crc kubenswrapper[5033]: E0226 15:45:01.803754 5033 projected.go:194] Error preparing data for projected volume kube-api-access-xqnns for pod openshift-authentication/oauth-openshift-849dbf65f-dsj5m: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:45:01 crc kubenswrapper[5033]: E0226 15:45:01.803856 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns podName:647f4597-a6fb-472b-8420-15d6f2d802ba nodeName:}" failed. No retries permitted until 2026-02-26 15:45:09.803829651 +0000 UTC m=+237.016428301 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xqnns" (UniqueName: "kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns") pod "oauth-openshift-849dbf65f-dsj5m" (UID: "647f4597-a6fb-472b-8420-15d6f2d802ba") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token": dial tcp 38.102.83.89:6443: connect: connection refused Feb 26 15:45:03 crc kubenswrapper[5033]: E0226 15:45:03.055428 5033 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{oauth-openshift-849dbf65f-dsj5m.1897d661dbae9ef5 openshift-authentication 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-authentication,Name:oauth-openshift-849dbf65f-dsj5m,UID:647f4597-a6fb-472b-8420-15d6f2d802ba,APIVersion:v1,ResourceVersion:29923,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-xqnns\" : failed to fetch token: Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token\": dial tcp 38.102.83.89:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:44:54.179405557 +0000 UTC m=+221.392004207,LastTimestamp:2026-02-26 15:44:54.179405557 +0000 UTC m=+221.392004207,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:45:03 crc kubenswrapper[5033]: I0226 15:45:03.631828 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:45:03 crc kubenswrapper[5033]: I0226 15:45:03.632204 5033 status_manager.go:851] "Failed to get status for pod" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:45:04 crc kubenswrapper[5033]: E0226 15:45:04.037757 5033 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:45:04 crc kubenswrapper[5033]: E0226 15:45:04.039117 5033 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:45:04 crc kubenswrapper[5033]: E0226 15:45:04.039569 5033 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:45:04 crc kubenswrapper[5033]: E0226 15:45:04.041708 5033 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:45:04 crc kubenswrapper[5033]: E0226 15:45:04.042308 5033 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.89:6443: connect: connection refused" Feb 26 15:45:04 crc kubenswrapper[5033]: I0226 15:45:04.042360 5033 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 26 15:45:04 crc kubenswrapper[5033]: E0226 15:45:04.042712 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="200ms" Feb 26 15:45:04 crc kubenswrapper[5033]: E0226 15:45:04.243824 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="400ms" Feb 26 15:45:04 crc kubenswrapper[5033]: E0226 15:45:04.645386 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="800ms" Feb 26 15:45:05 crc kubenswrapper[5033]: E0226 15:45:05.446107 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="1.6s" Feb 26 15:45:07 crc kubenswrapper[5033]: E0226 15:45:07.047513 5033 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="3.2s" Feb 26 15:45:07 crc kubenswrapper[5033]: I0226 15:45:07.628045 5033 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:07 crc kubenswrapper[5033]: I0226 15:45:07.629200 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:45:07 crc kubenswrapper[5033]: I0226 15:45:07.629964 5033 status_manager.go:851] "Failed to get status for pod" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:45:07 crc kubenswrapper[5033]: I0226 15:45:07.649675 5033 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:07 crc kubenswrapper[5033]: I0226 15:45:07.649729 5033 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:07 crc kubenswrapper[5033]: E0226 15:45:07.650453 5033 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:07 crc kubenswrapper[5033]: I0226 15:45:07.651209 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:08 crc kubenswrapper[5033]: I0226 15:45:08.473708 5033 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="50026671d3e6ddf72922436e07019dff2ba37030e388dd25f1c46666043e7834" exitCode=0 Feb 26 15:45:08 crc kubenswrapper[5033]: I0226 15:45:08.473817 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"50026671d3e6ddf72922436e07019dff2ba37030e388dd25f1c46666043e7834"} Feb 26 15:45:08 crc kubenswrapper[5033]: I0226 15:45:08.474247 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2e22f24ef7d3d4dc7b989ee3906971cb944e7db32dd6661a596c8dac21efb050"} Feb 26 15:45:08 crc kubenswrapper[5033]: I0226 15:45:08.474660 5033 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:08 crc kubenswrapper[5033]: I0226 15:45:08.474689 5033 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:08 crc kubenswrapper[5033]: E0226 15:45:08.475247 5033 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:08 crc kubenswrapper[5033]: I0226 15:45:08.475396 5033 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:45:08 crc kubenswrapper[5033]: I0226 15:45:08.476025 5033 status_manager.go:851] "Failed to get status for pod" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Feb 26 15:45:09 crc kubenswrapper[5033]: I0226 15:45:09.484717 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e3799c7298dd2b335128335be6080e1418f607f668e5f6c71e0b4d5316cee421"} Feb 26 15:45:09 crc kubenswrapper[5033]: I0226 15:45:09.485217 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f0d6158194923ee7f85bc3646edc3836812519186596ca3551ecbbb8a9872f0e"} Feb 26 15:45:09 crc kubenswrapper[5033]: I0226 15:45:09.485248 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"118864ad12c21d260364d56c7f3019a3350ec91eac7806c568f75ad99bb17bf4"} Feb 26 15:45:09 crc kubenswrapper[5033]: I0226 15:45:09.485275 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"89685cc2069ad15544071b963beb78fb9d176e6fe78e1c720fdcb8d1f95cc6be"} Feb 26 15:45:09 crc kubenswrapper[5033]: I0226 15:45:09.487222 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 15:45:09 crc kubenswrapper[5033]: I0226 15:45:09.487707 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 15:45:09 crc kubenswrapper[5033]: I0226 15:45:09.487745 5033 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c86862bdc1157284ee2b33caed922e81d6dcf1ebfcc7fd0ba4c2120083ff5cf5" exitCode=1 Feb 26 15:45:09 crc kubenswrapper[5033]: I0226 15:45:09.487779 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c86862bdc1157284ee2b33caed922e81d6dcf1ebfcc7fd0ba4c2120083ff5cf5"} Feb 26 15:45:09 crc kubenswrapper[5033]: I0226 15:45:09.488248 5033 scope.go:117] "RemoveContainer" containerID="c86862bdc1157284ee2b33caed922e81d6dcf1ebfcc7fd0ba4c2120083ff5cf5" Feb 26 15:45:09 crc kubenswrapper[5033]: I0226 15:45:09.816655 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqnns\" (UniqueName: \"kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns\") pod \"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:45:10 crc kubenswrapper[5033]: I0226 15:45:10.496065 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b96b791d703ca070c5c2ef972cd45712c944dfc5cecd070597e6ddf389e9240c"} Feb 26 15:45:10 crc kubenswrapper[5033]: I0226 15:45:10.496670 5033 kubelet.go:1909] "Trying to 
delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:10 crc kubenswrapper[5033]: I0226 15:45:10.496725 5033 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:10 crc kubenswrapper[5033]: I0226 15:45:10.499106 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 15:45:10 crc kubenswrapper[5033]: I0226 15:45:10.499764 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 15:45:10 crc kubenswrapper[5033]: I0226 15:45:10.499819 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fad444efe3616068c7f96c99276920dab43158eb7c885dd33348d06a71cc73a6"} Feb 26 15:45:12 crc kubenswrapper[5033]: I0226 15:45:12.651494 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:12 crc kubenswrapper[5033]: I0226 15:45:12.651856 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:12 crc kubenswrapper[5033]: I0226 15:45:12.661802 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:15 crc kubenswrapper[5033]: I0226 15:45:15.413693 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqnns\" (UniqueName: \"kubernetes.io/projected/647f4597-a6fb-472b-8420-15d6f2d802ba-kube-api-access-xqnns\") pod 
\"oauth-openshift-849dbf65f-dsj5m\" (UID: \"647f4597-a6fb-472b-8420-15d6f2d802ba\") " pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:45:15 crc kubenswrapper[5033]: I0226 15:45:15.521012 5033 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:15 crc kubenswrapper[5033]: I0226 15:45:15.533500 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 26 15:45:15 crc kubenswrapper[5033]: I0226 15:45:15.536004 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:45:15 crc kubenswrapper[5033]: W0226 15:45:15.978282 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod647f4597_a6fb_472b_8420_15d6f2d802ba.slice/crio-c0593facfc53361c20308d0f74784bfb35936519b28ad67d3a5607a4c3a22d0a WatchSource:0}: Error finding container c0593facfc53361c20308d0f74784bfb35936519b28ad67d3a5607a4c3a22d0a: Status 404 returned error can't find the container with id c0593facfc53361c20308d0f74784bfb35936519b28ad67d3a5607a4c3a22d0a Feb 26 15:45:16 crc kubenswrapper[5033]: I0226 15:45:16.542774 5033 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:16 crc kubenswrapper[5033]: I0226 15:45:16.542810 5033 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:16 crc kubenswrapper[5033]: I0226 15:45:16.543792 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" 
event={"ID":"647f4597-a6fb-472b-8420-15d6f2d802ba","Type":"ContainerStarted","Data":"5bd6d5e89c2485c67de96a154863f109bfc902e5717f631c22ae23f4de4e1244"} Feb 26 15:45:16 crc kubenswrapper[5033]: I0226 15:45:16.543817 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" event={"ID":"647f4597-a6fb-472b-8420-15d6f2d802ba","Type":"ContainerStarted","Data":"c0593facfc53361c20308d0f74784bfb35936519b28ad67d3a5607a4c3a22d0a"} Feb 26 15:45:16 crc kubenswrapper[5033]: I0226 15:45:16.544170 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:16 crc kubenswrapper[5033]: I0226 15:45:16.545366 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:45:16 crc kubenswrapper[5033]: I0226 15:45:16.560783 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:16 crc kubenswrapper[5033]: I0226 15:45:16.574817 5033 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="756db898-3a2e-4d23-b197-06dbc7eeb68c" Feb 26 15:45:16 crc kubenswrapper[5033]: I0226 15:45:16.699627 5033 patch_prober.go:28] interesting pod/oauth-openshift-849dbf65f-dsj5m container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": read tcp 10.217.0.2:35158->10.217.0.68:6443: read: connection reset by peer" start-of-body= Feb 26 15:45:16 crc kubenswrapper[5033]: I0226 15:45:16.699694 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" podUID="647f4597-a6fb-472b-8420-15d6f2d802ba" containerName="oauth-openshift" probeResult="failure" 
output="Get \"https://10.217.0.68:6443/healthz\": read tcp 10.217.0.2:35158->10.217.0.68:6443: read: connection reset by peer" Feb 26 15:45:17 crc kubenswrapper[5033]: I0226 15:45:17.552002 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-849dbf65f-dsj5m_647f4597-a6fb-472b-8420-15d6f2d802ba/oauth-openshift/0.log" Feb 26 15:45:17 crc kubenswrapper[5033]: I0226 15:45:17.552991 5033 generic.go:334] "Generic (PLEG): container finished" podID="647f4597-a6fb-472b-8420-15d6f2d802ba" containerID="5bd6d5e89c2485c67de96a154863f109bfc902e5717f631c22ae23f4de4e1244" exitCode=255 Feb 26 15:45:17 crc kubenswrapper[5033]: I0226 15:45:17.553099 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" event={"ID":"647f4597-a6fb-472b-8420-15d6f2d802ba","Type":"ContainerDied","Data":"5bd6d5e89c2485c67de96a154863f109bfc902e5717f631c22ae23f4de4e1244"} Feb 26 15:45:17 crc kubenswrapper[5033]: I0226 15:45:17.553326 5033 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:17 crc kubenswrapper[5033]: I0226 15:45:17.553338 5033 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:17 crc kubenswrapper[5033]: I0226 15:45:17.553950 5033 scope.go:117] "RemoveContainer" containerID="5bd6d5e89c2485c67de96a154863f109bfc902e5717f631c22ae23f4de4e1244" Feb 26 15:45:17 crc kubenswrapper[5033]: I0226 15:45:17.582874 5033 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="756db898-3a2e-4d23-b197-06dbc7eeb68c" Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.113729 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.566350 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-849dbf65f-dsj5m_647f4597-a6fb-472b-8420-15d6f2d802ba/oauth-openshift/1.log" Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.568319 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-849dbf65f-dsj5m_647f4597-a6fb-472b-8420-15d6f2d802ba/oauth-openshift/0.log" Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.568401 5033 generic.go:334] "Generic (PLEG): container finished" podID="647f4597-a6fb-472b-8420-15d6f2d802ba" containerID="378bdca7a2e28cae60029c64a69f267707852ee9d2bcceb56f1fa307baa3b602" exitCode=255 Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.568530 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" event={"ID":"647f4597-a6fb-472b-8420-15d6f2d802ba","Type":"ContainerDied","Data":"378bdca7a2e28cae60029c64a69f267707852ee9d2bcceb56f1fa307baa3b602"} Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.568627 5033 scope.go:117] "RemoveContainer" containerID="5bd6d5e89c2485c67de96a154863f109bfc902e5717f631c22ae23f4de4e1244" Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.568964 5033 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.569003 5033 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="127cc120-a042-4dd7-8513-f6ee67014a46" Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.569391 5033 scope.go:117] "RemoveContainer" containerID="378bdca7a2e28cae60029c64a69f267707852ee9d2bcceb56f1fa307baa3b602" Feb 26 15:45:18 crc kubenswrapper[5033]: E0226 15:45:18.569882 5033 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-849dbf65f-dsj5m_openshift-authentication(647f4597-a6fb-472b-8420-15d6f2d802ba)\"" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" podUID="647f4597-a6fb-472b-8420-15d6f2d802ba" Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.609831 5033 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="756db898-3a2e-4d23-b197-06dbc7eeb68c" Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.842590 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.842862 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 26 15:45:18 crc kubenswrapper[5033]: I0226 15:45:18.842914 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 26 15:45:19 crc kubenswrapper[5033]: I0226 15:45:19.580475 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-849dbf65f-dsj5m_647f4597-a6fb-472b-8420-15d6f2d802ba/oauth-openshift/1.log" Feb 26 15:45:19 crc kubenswrapper[5033]: I0226 15:45:19.581243 5033 
scope.go:117] "RemoveContainer" containerID="378bdca7a2e28cae60029c64a69f267707852ee9d2bcceb56f1fa307baa3b602" Feb 26 15:45:19 crc kubenswrapper[5033]: E0226 15:45:19.581617 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-849dbf65f-dsj5m_openshift-authentication(647f4597-a6fb-472b-8420-15d6f2d802ba)\"" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" podUID="647f4597-a6fb-472b-8420-15d6f2d802ba" Feb 26 15:45:20 crc kubenswrapper[5033]: I0226 15:45:20.189119 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:45:20 crc kubenswrapper[5033]: I0226 15:45:20.189796 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:45:25 crc kubenswrapper[5033]: I0226 15:45:25.166358 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 26 15:45:25 crc kubenswrapper[5033]: I0226 15:45:25.365154 5033 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 26 15:45:25 crc kubenswrapper[5033]: I0226 15:45:25.536759 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:45:25 crc kubenswrapper[5033]: I0226 15:45:25.536845 5033 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:45:25 crc kubenswrapper[5033]: I0226 15:45:25.537761 5033 scope.go:117] "RemoveContainer" containerID="378bdca7a2e28cae60029c64a69f267707852ee9d2bcceb56f1fa307baa3b602" Feb 26 15:45:25 crc kubenswrapper[5033]: E0226 15:45:25.538174 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-849dbf65f-dsj5m_openshift-authentication(647f4597-a6fb-472b-8420-15d6f2d802ba)\"" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" podUID="647f4597-a6fb-472b-8420-15d6f2d802ba" Feb 26 15:45:25 crc kubenswrapper[5033]: I0226 15:45:25.590035 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 26 15:45:25 crc kubenswrapper[5033]: I0226 15:45:25.960150 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.006763 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.061779 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.218794 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.651163 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.769770 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.862240 5033 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.867108 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=32.867088382 podStartE2EDuration="32.867088382s" podCreationTimestamp="2026-02-26 15:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:15.44411737 +0000 UTC m=+242.656716050" watchObservedRunningTime="2026-02-26 15:45:26.867088382 +0000 UTC m=+254.079687032" Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.867438 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.867489 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.867514 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-849dbf65f-dsj5m"] Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.868141 5033 scope.go:117] "RemoveContainer" containerID="378bdca7a2e28cae60029c64a69f267707852ee9d2bcceb56f1fa307baa3b602" Feb 26 15:45:26 crc kubenswrapper[5033]: E0226 15:45:26.868392 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-849dbf65f-dsj5m_openshift-authentication(647f4597-a6fb-472b-8420-15d6f2d802ba)\"" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" podUID="647f4597-a6fb-472b-8420-15d6f2d802ba" Feb 26 15:45:26 crc 
kubenswrapper[5033]: I0226 15:45:26.877187 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.903246 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=11.903222468 podStartE2EDuration="11.903222468s" podCreationTimestamp="2026-02-26 15:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:26.899208078 +0000 UTC m=+254.111806778" watchObservedRunningTime="2026-02-26 15:45:26.903222468 +0000 UTC m=+254.115821138" Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.936143 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 26 15:45:26 crc kubenswrapper[5033]: I0226 15:45:26.942922 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 26 15:45:27 crc kubenswrapper[5033]: I0226 15:45:27.182832 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 26 15:45:27 crc kubenswrapper[5033]: I0226 15:45:27.188484 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 26 15:45:27 crc kubenswrapper[5033]: I0226 15:45:27.383221 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 26 15:45:27 crc kubenswrapper[5033]: I0226 15:45:27.660686 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 26 15:45:27 crc kubenswrapper[5033]: I0226 15:45:27.711712 5033 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 26 15:45:27 crc kubenswrapper[5033]: I0226 15:45:27.726465 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 26 15:45:27 crc kubenswrapper[5033]: I0226 15:45:27.736411 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 26 15:45:27 crc kubenswrapper[5033]: I0226 15:45:27.960565 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 26 15:45:27 crc kubenswrapper[5033]: I0226 15:45:27.985317 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.129927 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.158493 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.162133 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.210835 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.278802 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.326891 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.452555 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.493888 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.496934 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.553072 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.563819 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.739713 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.811163 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.843545 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.843678 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 26 15:45:28 crc kubenswrapper[5033]: I0226 15:45:28.865676 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 26 15:45:29 crc kubenswrapper[5033]: I0226 15:45:29.088643 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 26 15:45:29 crc kubenswrapper[5033]: I0226 15:45:29.134121 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 26 15:45:29 crc kubenswrapper[5033]: I0226 15:45:29.158292 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 26 15:45:29 crc kubenswrapper[5033]: I0226 15:45:29.396223 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 26 15:45:29 crc kubenswrapper[5033]: I0226 15:45:29.584216 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 26 15:45:29 crc kubenswrapper[5033]: I0226 15:45:29.640778 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 26 15:45:29 crc kubenswrapper[5033]: I0226 15:45:29.751636 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 26 15:45:29 crc kubenswrapper[5033]: I0226 15:45:29.962303 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.040148 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.186742 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.211047 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.224061 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.225791 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.244741 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.292497 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.360992 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.427135 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.433063 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.433697 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.603073 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.610794 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.622705 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.720182 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.740086 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.769147 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.819085 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.898507 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.922932 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 26 15:45:30 crc kubenswrapper[5033]: I0226 15:45:30.938288 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.058968 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.075630 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.079075 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.106823 5033 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.116350 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.170887 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.329422 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.341520 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.395135 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.395654 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.572195 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.690866 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.743471 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.817716 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.865908 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 26 15:45:31 crc kubenswrapper[5033]: I0226 15:45:31.950432 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.062334 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.107025 5033 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.160093 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.254319 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.254319 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.295064 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.359614 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.454535 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.501415 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.561458 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.608188 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.632908 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.636372 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.641578 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.701138 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.723068 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.747233 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.827676 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.854783 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.913671 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.916950 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.964464 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 26 15:45:32 crc kubenswrapper[5033]: I0226 15:45:32.977765 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.005303 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.037702 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.044529 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.084393 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.119579 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.192142 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.196347 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.210475 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.211065 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.241289 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.310521 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.327371 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.378431 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.478832 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.489925 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.491101 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.588498 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.774079 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.780472 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.781332 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.807521 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 26 15:45:33 crc kubenswrapper[5033]: I0226 15:45:33.838978 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.029149 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.030797 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.055328 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.150349 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.163960 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.173541 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.220881 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.252781 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.368382 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.423207 5033 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.458403 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.466695 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.544758 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.582627 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.625377 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.662597 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.678497 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.695121 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.767470 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.838640 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.873704 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.886244 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.888425 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.914824 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 26 15:45:34 crc kubenswrapper[5033]: I0226 15:45:34.940454 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.039124 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.045649 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.089204 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.341079 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.342524 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.406842 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.409173 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.409889 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.415546 5033 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.488758 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.509687 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.521330 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.560451 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.638765 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.711446 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.772906 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 26 15:45:35 crc kubenswrapper[5033]: I0226 15:45:35.824416 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.172138 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.185485 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.214900 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.216047 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.268823 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.291654 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.297890 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.301120 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.363347 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.396591 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.418060 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.419717 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.439513 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.516601 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.518087 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.570100 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.641832 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.757144 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.874848 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.982651 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.990474 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.991801 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 26 15:45:36 crc kubenswrapper[5033]: I0226 15:45:36.997387 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.024052 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.224383 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.246127 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.258722 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.281793 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.383010 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.424866 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.433627 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.467120 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.497205 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.551725 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.585527 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.610306 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.616670 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.620313 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.668727 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.768440 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.852377 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.882546 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.890029 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.936844 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 26 15:45:37 crc kubenswrapper[5033]: I0226 15:45:37.962014 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.047420 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.115764 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.120232 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.122458 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.124455 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.131904 5033 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.132202 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270" gracePeriod=5
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.168794 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.179362 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.181604 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.184397 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.196298 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.250084 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.455977 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.497753 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.631157 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.790400 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.843550 5033 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.843683 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.843802 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:45:38 crc
kubenswrapper[5033]: I0226 15:45:38.845024 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"fad444efe3616068c7f96c99276920dab43158eb7c885dd33348d06a71cc73a6"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.845334 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://fad444efe3616068c7f96c99276920dab43158eb7c885dd33348d06a71cc73a6" gracePeriod=30 Feb 26 15:45:38 crc kubenswrapper[5033]: I0226 15:45:38.916572 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 26 15:45:39 crc kubenswrapper[5033]: I0226 15:45:39.039682 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 26 15:45:39 crc kubenswrapper[5033]: I0226 15:45:39.251533 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 26 15:45:39 crc kubenswrapper[5033]: I0226 15:45:39.295604 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 26 15:45:39 crc kubenswrapper[5033]: I0226 15:45:39.306521 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 26 15:45:39 crc kubenswrapper[5033]: I0226 15:45:39.357215 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 26 15:45:39 crc kubenswrapper[5033]: I0226 15:45:39.427692 5033 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 26 15:45:39 crc kubenswrapper[5033]: I0226 15:45:39.549717 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 26 15:45:39 crc kubenswrapper[5033]: I0226 15:45:39.601957 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 15:45:39 crc kubenswrapper[5033]: I0226 15:45:39.628774 5033 scope.go:117] "RemoveContainer" containerID="378bdca7a2e28cae60029c64a69f267707852ee9d2bcceb56f1fa307baa3b602" Feb 26 15:45:39 crc kubenswrapper[5033]: I0226 15:45:39.681482 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 26 15:45:39 crc kubenswrapper[5033]: I0226 15:45:39.983086 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 26 15:45:40 crc kubenswrapper[5033]: I0226 15:45:40.091302 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 26 15:45:40 crc kubenswrapper[5033]: I0226 15:45:40.568490 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 26 15:45:40 crc kubenswrapper[5033]: I0226 15:45:40.759982 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-849dbf65f-dsj5m_647f4597-a6fb-472b-8420-15d6f2d802ba/oauth-openshift/1.log" Feb 26 15:45:40 crc kubenswrapper[5033]: I0226 15:45:40.760082 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" event={"ID":"647f4597-a6fb-472b-8420-15d6f2d802ba","Type":"ContainerStarted","Data":"4f02b65ac77ea7efa022eb7855d24340cf178e804ed5fa8ccc6cc28d6f74c70a"} Feb 26 15:45:40 crc 
kubenswrapper[5033]: I0226 15:45:40.760635 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:45:40 crc kubenswrapper[5033]: I0226 15:45:40.770788 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" Feb 26 15:45:40 crc kubenswrapper[5033]: I0226 15:45:40.800247 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-849dbf65f-dsj5m" podStartSLOduration=78.800224055 podStartE2EDuration="1m18.800224055s" podCreationTimestamp="2026-02-26 15:44:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:16.570873132 +0000 UTC m=+243.783471782" watchObservedRunningTime="2026-02-26 15:45:40.800224055 +0000 UTC m=+268.012822735" Feb 26 15:45:40 crc kubenswrapper[5033]: I0226 15:45:40.810549 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 26 15:45:40 crc kubenswrapper[5033]: I0226 15:45:40.843015 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 26 15:45:40 crc kubenswrapper[5033]: I0226 15:45:40.904934 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 15:45:40 crc kubenswrapper[5033]: I0226 15:45:40.910850 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 26 15:45:40 crc kubenswrapper[5033]: I0226 15:45:40.933526 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 26 15:45:41 crc 
kubenswrapper[5033]: I0226 15:45:41.019999 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 26 15:45:41 crc kubenswrapper[5033]: I0226 15:45:41.571068 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 15:45:41 crc kubenswrapper[5033]: I0226 15:45:41.654327 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.727824 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.728133 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.803050 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.803164 5033 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270" exitCode=137 Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.803238 5033 scope.go:117] "RemoveContainer" containerID="803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.803297 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.837465 5033 scope.go:117] "RemoveContainer" containerID="803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270" Feb 26 15:45:43 crc kubenswrapper[5033]: E0226 15:45:43.837878 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270\": container with ID starting with 803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270 not found: ID does not exist" containerID="803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.837933 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270"} err="failed to get container status \"803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270\": rpc error: code = NotFound desc = could not find container \"803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270\": container with ID starting with 803ce7364fe7cde344a5670b885d6459b7bc131fcf6959e0596a7152400a2270 not found: ID does not exist" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.870667 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.870811 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.870831 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.870894 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.870940 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.871077 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.871157 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.871326 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.871459 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.871644 5033 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.871670 5033 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.871691 5033 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.871708 5033 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.882946 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:45:43 crc kubenswrapper[5033]: I0226 15:45:43.972562 5033 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:45:45 crc kubenswrapper[5033]: I0226 15:45:45.641654 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 26 15:45:45 crc kubenswrapper[5033]: I0226 15:45:45.642103 5033 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 26 15:45:45 crc kubenswrapper[5033]: I0226 15:45:45.656770 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 15:45:45 crc kubenswrapper[5033]: I0226 15:45:45.656828 5033 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="53365a16-ebc9-4494-8b58-b54f63bb4565" Feb 26 15:45:45 crc kubenswrapper[5033]: I0226 15:45:45.663775 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 15:45:45 crc kubenswrapper[5033]: I0226 15:45:45.663822 5033 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="53365a16-ebc9-4494-8b58-b54f63bb4565" Feb 26 15:45:50 crc kubenswrapper[5033]: I0226 15:45:50.189373 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 
15:45:50 crc kubenswrapper[5033]: I0226 15:45:50.189827 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:45:50 crc kubenswrapper[5033]: I0226 15:45:50.190006 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:45:50 crc kubenswrapper[5033]: I0226 15:45:50.191502 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c3b531c4becd39d370617a6d65865b0e8eb7fbdb2814420616e198b3e7ab4da"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 15:45:50 crc kubenswrapper[5033]: I0226 15:45:50.191802 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://8c3b531c4becd39d370617a6d65865b0e8eb7fbdb2814420616e198b3e7ab4da" gracePeriod=600 Feb 26 15:45:50 crc kubenswrapper[5033]: I0226 15:45:50.856488 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="8c3b531c4becd39d370617a6d65865b0e8eb7fbdb2814420616e198b3e7ab4da" exitCode=0 Feb 26 15:45:50 crc kubenswrapper[5033]: I0226 15:45:50.856547 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"8c3b531c4becd39d370617a6d65865b0e8eb7fbdb2814420616e198b3e7ab4da"} Feb 26 15:45:50 crc kubenswrapper[5033]: I0226 15:45:50.857104 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"4fc44b0fd7677be1be0b3e7cc01e29c502f9cf985c2a2bb180a7227a759d93cf"} Feb 26 15:45:54 crc kubenswrapper[5033]: I0226 15:45:54.092524 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 26 15:45:54 crc kubenswrapper[5033]: I0226 15:45:54.477657 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 26 15:45:59 crc kubenswrapper[5033]: I0226 15:45:59.932446 5033 generic.go:334] "Generic (PLEG): container finished" podID="4fdf7757-4d06-43cc-be27-0be0a8155b11" containerID="5d8830706bf9e10a080568e69feea28da653a9a09f7e195348be9abcb238cdff" exitCode=0 Feb 26 15:45:59 crc kubenswrapper[5033]: I0226 15:45:59.932581 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" event={"ID":"4fdf7757-4d06-43cc-be27-0be0a8155b11","Type":"ContainerDied","Data":"5d8830706bf9e10a080568e69feea28da653a9a09f7e195348be9abcb238cdff"} Feb 26 15:45:59 crc kubenswrapper[5033]: I0226 15:45:59.933609 5033 scope.go:117] "RemoveContainer" containerID="5d8830706bf9e10a080568e69feea28da653a9a09f7e195348be9abcb238cdff" Feb 26 15:46:00 crc kubenswrapper[5033]: I0226 15:46:00.680136 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 26 15:46:00 crc kubenswrapper[5033]: I0226 15:46:00.949426 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" 
event={"ID":"4fdf7757-4d06-43cc-be27-0be0a8155b11","Type":"ContainerStarted","Data":"b7cb3c95697725b9fdf4bc02b14c4b27bcd7db93e1be2115c64a29ebfb6e94a5"} Feb 26 15:46:00 crc kubenswrapper[5033]: I0226 15:46:00.949965 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:46:00 crc kubenswrapper[5033]: I0226 15:46:00.951595 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:46:09 crc kubenswrapper[5033]: I0226 15:46:09.006119 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 26 15:46:09 crc kubenswrapper[5033]: I0226 15:46:09.008002 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 15:46:09 crc kubenswrapper[5033]: I0226 15:46:09.008469 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 15:46:09 crc kubenswrapper[5033]: I0226 15:46:09.008510 5033 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fad444efe3616068c7f96c99276920dab43158eb7c885dd33348d06a71cc73a6" exitCode=137 Feb 26 15:46:09 crc kubenswrapper[5033]: I0226 15:46:09.008558 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fad444efe3616068c7f96c99276920dab43158eb7c885dd33348d06a71cc73a6"} Feb 26 15:46:09 crc kubenswrapper[5033]: I0226 15:46:09.008600 5033 scope.go:117] "RemoveContainer" 
containerID="c86862bdc1157284ee2b33caed922e81d6dcf1ebfcc7fd0ba4c2120083ff5cf5" Feb 26 15:46:10 crc kubenswrapper[5033]: I0226 15:46:10.018636 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 26 15:46:10 crc kubenswrapper[5033]: I0226 15:46:10.020325 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 15:46:10 crc kubenswrapper[5033]: I0226 15:46:10.020383 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f46bb3d70df79d5d91c122819bfa9e7bf039626b5012cedd0e0c350480404594"} Feb 26 15:46:18 crc kubenswrapper[5033]: I0226 15:46:18.113522 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:46:18 crc kubenswrapper[5033]: I0226 15:46:18.843473 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:46:18 crc kubenswrapper[5033]: I0226 15:46:18.849988 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:46:19 crc kubenswrapper[5033]: I0226 15:46:19.081503 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.612942 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535346-655p9"] Feb 26 15:46:32 crc kubenswrapper[5033]: E0226 15:46:32.614814 5033 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.614898 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 15:46:32 crc kubenswrapper[5033]: E0226 15:46:32.614969 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" containerName="installer" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.615034 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" containerName="installer" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.615189 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0317728e-d238-4fbd-99e3-a75fceeda178" containerName="installer" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.615815 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.616277 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535346-655p9" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.618388 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.618682 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.618837 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.620527 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr"] Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.621308 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.622666 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.628613 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.628661 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535346-655p9"] Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.661039 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr"] Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.710498 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/9352ff50-4ff9-481c-99fc-ec47324bab97-secret-volume\") pod \"collect-profiles-29535345-rndzr\" (UID: \"9352ff50-4ff9-481c-99fc-ec47324bab97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.710559 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p6rd\" (UniqueName: \"kubernetes.io/projected/9352ff50-4ff9-481c-99fc-ec47324bab97-kube-api-access-2p6rd\") pod \"collect-profiles-29535345-rndzr\" (UID: \"9352ff50-4ff9-481c-99fc-ec47324bab97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.710595 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9352ff50-4ff9-481c-99fc-ec47324bab97-config-volume\") pod \"collect-profiles-29535345-rndzr\" (UID: \"9352ff50-4ff9-481c-99fc-ec47324bab97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.710625 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbd6d\" (UniqueName: \"kubernetes.io/projected/a8ea6827-f34e-4f48-8939-6fbf283f63f0-kube-api-access-zbd6d\") pod \"auto-csr-approver-29535346-655p9\" (UID: \"a8ea6827-f34e-4f48-8939-6fbf283f63f0\") " pod="openshift-infra/auto-csr-approver-29535346-655p9" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.812050 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9352ff50-4ff9-481c-99fc-ec47324bab97-config-volume\") pod \"collect-profiles-29535345-rndzr\" (UID: \"9352ff50-4ff9-481c-99fc-ec47324bab97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 
15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.812124 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbd6d\" (UniqueName: \"kubernetes.io/projected/a8ea6827-f34e-4f48-8939-6fbf283f63f0-kube-api-access-zbd6d\") pod \"auto-csr-approver-29535346-655p9\" (UID: \"a8ea6827-f34e-4f48-8939-6fbf283f63f0\") " pod="openshift-infra/auto-csr-approver-29535346-655p9" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.812184 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9352ff50-4ff9-481c-99fc-ec47324bab97-secret-volume\") pod \"collect-profiles-29535345-rndzr\" (UID: \"9352ff50-4ff9-481c-99fc-ec47324bab97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.812226 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p6rd\" (UniqueName: \"kubernetes.io/projected/9352ff50-4ff9-481c-99fc-ec47324bab97-kube-api-access-2p6rd\") pod \"collect-profiles-29535345-rndzr\" (UID: \"9352ff50-4ff9-481c-99fc-ec47324bab97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.813101 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9352ff50-4ff9-481c-99fc-ec47324bab97-config-volume\") pod \"collect-profiles-29535345-rndzr\" (UID: \"9352ff50-4ff9-481c-99fc-ec47324bab97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.820112 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9352ff50-4ff9-481c-99fc-ec47324bab97-secret-volume\") pod \"collect-profiles-29535345-rndzr\" (UID: 
\"9352ff50-4ff9-481c-99fc-ec47324bab97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.829485 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p6rd\" (UniqueName: \"kubernetes.io/projected/9352ff50-4ff9-481c-99fc-ec47324bab97-kube-api-access-2p6rd\") pod \"collect-profiles-29535345-rndzr\" (UID: \"9352ff50-4ff9-481c-99fc-ec47324bab97\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.835609 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbd6d\" (UniqueName: \"kubernetes.io/projected/a8ea6827-f34e-4f48-8939-6fbf283f63f0-kube-api-access-zbd6d\") pod \"auto-csr-approver-29535346-655p9\" (UID: \"a8ea6827-f34e-4f48-8939-6fbf283f63f0\") " pod="openshift-infra/auto-csr-approver-29535346-655p9" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.941798 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535346-655p9" Feb 26 15:46:32 crc kubenswrapper[5033]: I0226 15:46:32.957427 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:33 crc kubenswrapper[5033]: I0226 15:46:33.327590 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535346-655p9"] Feb 26 15:46:33 crc kubenswrapper[5033]: W0226 15:46:33.334738 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8ea6827_f34e_4f48_8939_6fbf283f63f0.slice/crio-9ad9569aaf6ec5374e66b2fac1a2cd35c4f2f294f047fb08497d41ea0ddefc85 WatchSource:0}: Error finding container 9ad9569aaf6ec5374e66b2fac1a2cd35c4f2f294f047fb08497d41ea0ddefc85: Status 404 returned error can't find the container with id 9ad9569aaf6ec5374e66b2fac1a2cd35c4f2f294f047fb08497d41ea0ddefc85 Feb 26 15:46:33 crc kubenswrapper[5033]: I0226 15:46:33.393161 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr"] Feb 26 15:46:33 crc kubenswrapper[5033]: W0226 15:46:33.397752 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9352ff50_4ff9_481c_99fc_ec47324bab97.slice/crio-7792ef9a9069d00d5e8f6903226184e908bfafd1e0405358bfb130a4f86dc511 WatchSource:0}: Error finding container 7792ef9a9069d00d5e8f6903226184e908bfafd1e0405358bfb130a4f86dc511: Status 404 returned error can't find the container with id 7792ef9a9069d00d5e8f6903226184e908bfafd1e0405358bfb130a4f86dc511 Feb 26 15:46:34 crc kubenswrapper[5033]: I0226 15:46:34.164082 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" event={"ID":"9352ff50-4ff9-481c-99fc-ec47324bab97","Type":"ContainerStarted","Data":"3b44e3bf7dde37f3bbc00e3f35de8e8541de980855186cd19516d2d805a77974"} Feb 26 15:46:34 crc kubenswrapper[5033]: I0226 15:46:34.164800 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" event={"ID":"9352ff50-4ff9-481c-99fc-ec47324bab97","Type":"ContainerStarted","Data":"7792ef9a9069d00d5e8f6903226184e908bfafd1e0405358bfb130a4f86dc511"} Feb 26 15:46:34 crc kubenswrapper[5033]: I0226 15:46:34.167671 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535346-655p9" event={"ID":"a8ea6827-f34e-4f48-8939-6fbf283f63f0","Type":"ContainerStarted","Data":"9ad9569aaf6ec5374e66b2fac1a2cd35c4f2f294f047fb08497d41ea0ddefc85"} Feb 26 15:46:34 crc kubenswrapper[5033]: I0226 15:46:34.179751 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" podStartSLOduration=2.179700965 podStartE2EDuration="2.179700965s" podCreationTimestamp="2026-02-26 15:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:34.178908594 +0000 UTC m=+321.391507324" watchObservedRunningTime="2026-02-26 15:46:34.179700965 +0000 UTC m=+321.392299615" Feb 26 15:46:35 crc kubenswrapper[5033]: I0226 15:46:35.176451 5033 generic.go:334] "Generic (PLEG): container finished" podID="9352ff50-4ff9-481c-99fc-ec47324bab97" containerID="3b44e3bf7dde37f3bbc00e3f35de8e8541de980855186cd19516d2d805a77974" exitCode=0 Feb 26 15:46:35 crc kubenswrapper[5033]: I0226 15:46:35.176560 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" event={"ID":"9352ff50-4ff9-481c-99fc-ec47324bab97","Type":"ContainerDied","Data":"3b44e3bf7dde37f3bbc00e3f35de8e8541de980855186cd19516d2d805a77974"} Feb 26 15:46:35 crc kubenswrapper[5033]: I0226 15:46:35.179713 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535346-655p9" 
event={"ID":"a8ea6827-f34e-4f48-8939-6fbf283f63f0","Type":"ContainerStarted","Data":"896f36ed28c99db1b518718b80d53dfc088043f963218f9fee55457c0038b4b1"} Feb 26 15:46:35 crc kubenswrapper[5033]: I0226 15:46:35.210494 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535346-655p9" podStartSLOduration=1.808888963 podStartE2EDuration="3.210475458s" podCreationTimestamp="2026-02-26 15:46:32 +0000 UTC" firstStartedPulling="2026-02-26 15:46:33.337400857 +0000 UTC m=+320.549999537" lastFinishedPulling="2026-02-26 15:46:34.738987372 +0000 UTC m=+321.951586032" observedRunningTime="2026-02-26 15:46:35.20974305 +0000 UTC m=+322.422341720" watchObservedRunningTime="2026-02-26 15:46:35.210475458 +0000 UTC m=+322.423074108" Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.185345 5033 generic.go:334] "Generic (PLEG): container finished" podID="a8ea6827-f34e-4f48-8939-6fbf283f63f0" containerID="896f36ed28c99db1b518718b80d53dfc088043f963218f9fee55457c0038b4b1" exitCode=0 Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.185388 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535346-655p9" event={"ID":"a8ea6827-f34e-4f48-8939-6fbf283f63f0","Type":"ContainerDied","Data":"896f36ed28c99db1b518718b80d53dfc088043f963218f9fee55457c0038b4b1"} Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.453042 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.574944 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p6rd\" (UniqueName: \"kubernetes.io/projected/9352ff50-4ff9-481c-99fc-ec47324bab97-kube-api-access-2p6rd\") pod \"9352ff50-4ff9-481c-99fc-ec47324bab97\" (UID: \"9352ff50-4ff9-481c-99fc-ec47324bab97\") " Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.574995 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9352ff50-4ff9-481c-99fc-ec47324bab97-secret-volume\") pod \"9352ff50-4ff9-481c-99fc-ec47324bab97\" (UID: \"9352ff50-4ff9-481c-99fc-ec47324bab97\") " Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.575091 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9352ff50-4ff9-481c-99fc-ec47324bab97-config-volume\") pod \"9352ff50-4ff9-481c-99fc-ec47324bab97\" (UID: \"9352ff50-4ff9-481c-99fc-ec47324bab97\") " Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.575825 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9352ff50-4ff9-481c-99fc-ec47324bab97-config-volume" (OuterVolumeSpecName: "config-volume") pod "9352ff50-4ff9-481c-99fc-ec47324bab97" (UID: "9352ff50-4ff9-481c-99fc-ec47324bab97"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.581758 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9352ff50-4ff9-481c-99fc-ec47324bab97-kube-api-access-2p6rd" (OuterVolumeSpecName: "kube-api-access-2p6rd") pod "9352ff50-4ff9-481c-99fc-ec47324bab97" (UID: "9352ff50-4ff9-481c-99fc-ec47324bab97"). 
InnerVolumeSpecName "kube-api-access-2p6rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.581802 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9352ff50-4ff9-481c-99fc-ec47324bab97-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9352ff50-4ff9-481c-99fc-ec47324bab97" (UID: "9352ff50-4ff9-481c-99fc-ec47324bab97"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.676923 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p6rd\" (UniqueName: \"kubernetes.io/projected/9352ff50-4ff9-481c-99fc-ec47324bab97-kube-api-access-2p6rd\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.676957 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9352ff50-4ff9-481c-99fc-ec47324bab97-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:36 crc kubenswrapper[5033]: I0226 15:46:36.676966 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9352ff50-4ff9-481c-99fc-ec47324bab97-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:37 crc kubenswrapper[5033]: I0226 15:46:37.194592 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" event={"ID":"9352ff50-4ff9-481c-99fc-ec47324bab97","Type":"ContainerDied","Data":"7792ef9a9069d00d5e8f6903226184e908bfafd1e0405358bfb130a4f86dc511"} Feb 26 15:46:37 crc kubenswrapper[5033]: I0226 15:46:37.194683 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7792ef9a9069d00d5e8f6903226184e908bfafd1e0405358bfb130a4f86dc511" Feb 26 15:46:37 crc kubenswrapper[5033]: I0226 15:46:37.194616 5033 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr" Feb 26 15:46:37 crc kubenswrapper[5033]: I0226 15:46:37.452858 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535346-655p9" Feb 26 15:46:37 crc kubenswrapper[5033]: I0226 15:46:37.592937 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbd6d\" (UniqueName: \"kubernetes.io/projected/a8ea6827-f34e-4f48-8939-6fbf283f63f0-kube-api-access-zbd6d\") pod \"a8ea6827-f34e-4f48-8939-6fbf283f63f0\" (UID: \"a8ea6827-f34e-4f48-8939-6fbf283f63f0\") " Feb 26 15:46:37 crc kubenswrapper[5033]: I0226 15:46:37.597596 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8ea6827-f34e-4f48-8939-6fbf283f63f0-kube-api-access-zbd6d" (OuterVolumeSpecName: "kube-api-access-zbd6d") pod "a8ea6827-f34e-4f48-8939-6fbf283f63f0" (UID: "a8ea6827-f34e-4f48-8939-6fbf283f63f0"). InnerVolumeSpecName "kube-api-access-zbd6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:46:37 crc kubenswrapper[5033]: I0226 15:46:37.694213 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbd6d\" (UniqueName: \"kubernetes.io/projected/a8ea6827-f34e-4f48-8939-6fbf283f63f0-kube-api-access-zbd6d\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:38 crc kubenswrapper[5033]: I0226 15:46:38.206512 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535346-655p9" event={"ID":"a8ea6827-f34e-4f48-8939-6fbf283f63f0","Type":"ContainerDied","Data":"9ad9569aaf6ec5374e66b2fac1a2cd35c4f2f294f047fb08497d41ea0ddefc85"} Feb 26 15:46:38 crc kubenswrapper[5033]: I0226 15:46:38.206643 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ad9569aaf6ec5374e66b2fac1a2cd35c4f2f294f047fb08497d41ea0ddefc85" Feb 26 15:46:38 crc kubenswrapper[5033]: I0226 15:46:38.206729 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535346-655p9" Feb 26 15:47:15 crc kubenswrapper[5033]: I0226 15:47:15.758801 5033 scope.go:117] "RemoveContainer" containerID="8d94728afe1982f697d9dcd51993e2e6786fbbf9bbd6b38289bd8743ea97a1d6" Feb 26 15:47:15 crc kubenswrapper[5033]: I0226 15:47:15.788849 5033 scope.go:117] "RemoveContainer" containerID="bbaac34f35d74a81fd3cf0a23bd8fa3cd3b43a606939e9c59fad10c43b05d69b" Feb 26 15:47:37 crc kubenswrapper[5033]: I0226 15:47:37.988563 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kjzmq"] Feb 26 15:47:37 crc kubenswrapper[5033]: I0226 15:47:37.989431 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kjzmq" podUID="7a38e065-f09c-4182-ba61-d4db657510de" containerName="registry-server" containerID="cri-o://cb437783cf4bd310d39365090e25aaf5d90f48190ebf201e29effe6ab29f8ed3" gracePeriod=30 Feb 26 15:47:37 
crc kubenswrapper[5033]: I0226 15:47:37.998122 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4br75"] Feb 26 15:47:37 crc kubenswrapper[5033]: I0226 15:47:37.998421 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4br75" podUID="5049f268-de58-4e79-9deb-d08c2f33397c" containerName="registry-server" containerID="cri-o://8102b71cd1ea2248a8abbf1b2f431266fd8006c4e5e7ac694794ada6a7386d38" gracePeriod=30 Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.024815 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8mgvk"] Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.025085 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" podUID="4fdf7757-4d06-43cc-be27-0be0a8155b11" containerName="marketplace-operator" containerID="cri-o://b7cb3c95697725b9fdf4bc02b14c4b27bcd7db93e1be2115c64a29ebfb6e94a5" gracePeriod=30 Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.039181 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wctnw"] Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.039514 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wctnw" podUID="596c5e6d-16b2-47b7-a052-a164aa944a7f" containerName="registry-server" containerID="cri-o://3bbd3ff437a4b73696e2ff50f30aff7e0e60498900c83b708912f10597e8ec23" gracePeriod=30 Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.052136 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dsvmf"] Feb 26 15:47:38 crc kubenswrapper[5033]: E0226 15:47:38.052417 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ea6827-f34e-4f48-8939-6fbf283f63f0" 
containerName="oc" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.052438 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ea6827-f34e-4f48-8939-6fbf283f63f0" containerName="oc" Feb 26 15:47:38 crc kubenswrapper[5033]: E0226 15:47:38.052450 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9352ff50-4ff9-481c-99fc-ec47324bab97" containerName="collect-profiles" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.052458 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9352ff50-4ff9-481c-99fc-ec47324bab97" containerName="collect-profiles" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.052575 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9352ff50-4ff9-481c-99fc-ec47324bab97" containerName="collect-profiles" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.052592 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8ea6827-f34e-4f48-8939-6fbf283f63f0" containerName="oc" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.053015 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.058695 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npgt5"] Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.058940 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-npgt5" podUID="65f83e7b-3a72-450b-a54f-899aba4acd8a" containerName="registry-server" containerID="cri-o://22d0f7f4ccb6f101d1b620fb32ce4e44cf84dddb9e732145ceafe18dcd48b49e" gracePeriod=30 Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.063072 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dsvmf"] Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.172368 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c5e786b-b322-4756-9ad5-678d16b4b9f6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dsvmf\" (UID: \"3c5e786b-b322-4756-9ad5-678d16b4b9f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.172443 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4kqb\" (UniqueName: \"kubernetes.io/projected/3c5e786b-b322-4756-9ad5-678d16b4b9f6-kube-api-access-z4kqb\") pod \"marketplace-operator-79b997595-dsvmf\" (UID: \"3c5e786b-b322-4756-9ad5-678d16b4b9f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.172465 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/3c5e786b-b322-4756-9ad5-678d16b4b9f6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dsvmf\" (UID: \"3c5e786b-b322-4756-9ad5-678d16b4b9f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.273706 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c5e786b-b322-4756-9ad5-678d16b4b9f6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dsvmf\" (UID: \"3c5e786b-b322-4756-9ad5-678d16b4b9f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.274316 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4kqb\" (UniqueName: \"kubernetes.io/projected/3c5e786b-b322-4756-9ad5-678d16b4b9f6-kube-api-access-z4kqb\") pod \"marketplace-operator-79b997595-dsvmf\" (UID: \"3c5e786b-b322-4756-9ad5-678d16b4b9f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.274364 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c5e786b-b322-4756-9ad5-678d16b4b9f6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dsvmf\" (UID: \"3c5e786b-b322-4756-9ad5-678d16b4b9f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.275097 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c5e786b-b322-4756-9ad5-678d16b4b9f6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dsvmf\" (UID: \"3c5e786b-b322-4756-9ad5-678d16b4b9f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:38 
crc kubenswrapper[5033]: I0226 15:47:38.286325 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c5e786b-b322-4756-9ad5-678d16b4b9f6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dsvmf\" (UID: \"3c5e786b-b322-4756-9ad5-678d16b4b9f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.294140 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4kqb\" (UniqueName: \"kubernetes.io/projected/3c5e786b-b322-4756-9ad5-678d16b4b9f6-kube-api-access-z4kqb\") pod \"marketplace-operator-79b997595-dsvmf\" (UID: \"3c5e786b-b322-4756-9ad5-678d16b4b9f6\") " pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.376847 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.629346 5033 generic.go:334] "Generic (PLEG): container finished" podID="4fdf7757-4d06-43cc-be27-0be0a8155b11" containerID="b7cb3c95697725b9fdf4bc02b14c4b27bcd7db93e1be2115c64a29ebfb6e94a5" exitCode=0 Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.629678 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" event={"ID":"4fdf7757-4d06-43cc-be27-0be0a8155b11","Type":"ContainerDied","Data":"b7cb3c95697725b9fdf4bc02b14c4b27bcd7db93e1be2115c64a29ebfb6e94a5"} Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.629708 5033 scope.go:117] "RemoveContainer" containerID="5d8830706bf9e10a080568e69feea28da653a9a09f7e195348be9abcb238cdff" Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.631867 5033 generic.go:334] "Generic (PLEG): container finished" podID="65f83e7b-3a72-450b-a54f-899aba4acd8a" 
containerID="22d0f7f4ccb6f101d1b620fb32ce4e44cf84dddb9e732145ceafe18dcd48b49e" exitCode=0 Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.631911 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgt5" event={"ID":"65f83e7b-3a72-450b-a54f-899aba4acd8a","Type":"ContainerDied","Data":"22d0f7f4ccb6f101d1b620fb32ce4e44cf84dddb9e732145ceafe18dcd48b49e"} Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.634735 5033 generic.go:334] "Generic (PLEG): container finished" podID="7a38e065-f09c-4182-ba61-d4db657510de" containerID="cb437783cf4bd310d39365090e25aaf5d90f48190ebf201e29effe6ab29f8ed3" exitCode=0 Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.634789 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjzmq" event={"ID":"7a38e065-f09c-4182-ba61-d4db657510de","Type":"ContainerDied","Data":"cb437783cf4bd310d39365090e25aaf5d90f48190ebf201e29effe6ab29f8ed3"} Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.636583 5033 generic.go:334] "Generic (PLEG): container finished" podID="5049f268-de58-4e79-9deb-d08c2f33397c" containerID="8102b71cd1ea2248a8abbf1b2f431266fd8006c4e5e7ac694794ada6a7386d38" exitCode=0 Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.636641 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4br75" event={"ID":"5049f268-de58-4e79-9deb-d08c2f33397c","Type":"ContainerDied","Data":"8102b71cd1ea2248a8abbf1b2f431266fd8006c4e5e7ac694794ada6a7386d38"} Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.638559 5033 generic.go:334] "Generic (PLEG): container finished" podID="596c5e6d-16b2-47b7-a052-a164aa944a7f" containerID="3bbd3ff437a4b73696e2ff50f30aff7e0e60498900c83b708912f10597e8ec23" exitCode=0 Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.638584 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wctnw" 
event={"ID":"596c5e6d-16b2-47b7-a052-a164aa944a7f","Type":"ContainerDied","Data":"3bbd3ff437a4b73696e2ff50f30aff7e0e60498900c83b708912f10597e8ec23"} Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.844605 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dsvmf"] Feb 26 15:47:38 crc kubenswrapper[5033]: I0226 15:47:38.939240 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4br75" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.028113 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5n9w\" (UniqueName: \"kubernetes.io/projected/5049f268-de58-4e79-9deb-d08c2f33397c-kube-api-access-j5n9w\") pod \"5049f268-de58-4e79-9deb-d08c2f33397c\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.028191 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-catalog-content\") pod \"5049f268-de58-4e79-9deb-d08c2f33397c\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.028221 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-utilities\") pod \"5049f268-de58-4e79-9deb-d08c2f33397c\" (UID: \"5049f268-de58-4e79-9deb-d08c2f33397c\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.029105 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-utilities" (OuterVolumeSpecName: "utilities") pod "5049f268-de58-4e79-9deb-d08c2f33397c" (UID: "5049f268-de58-4e79-9deb-d08c2f33397c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.037748 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5049f268-de58-4e79-9deb-d08c2f33397c-kube-api-access-j5n9w" (OuterVolumeSpecName: "kube-api-access-j5n9w") pod "5049f268-de58-4e79-9deb-d08c2f33397c" (UID: "5049f268-de58-4e79-9deb-d08c2f33397c"). InnerVolumeSpecName "kube-api-access-j5n9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.060026 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.076119 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.097272 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5049f268-de58-4e79-9deb-d08c2f33397c" (UID: "5049f268-de58-4e79-9deb-d08c2f33397c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.129061 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.129099 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5049f268-de58-4e79-9deb-d08c2f33397c-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.129112 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5n9w\" (UniqueName: \"kubernetes.io/projected/5049f268-de58-4e79-9deb-d08c2f33397c-kube-api-access-j5n9w\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.203150 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.213102 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kjzmq" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.229769 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28t87\" (UniqueName: \"kubernetes.io/projected/4fdf7757-4d06-43cc-be27-0be0a8155b11-kube-api-access-28t87\") pod \"4fdf7757-4d06-43cc-be27-0be0a8155b11\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.229838 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6grln\" (UniqueName: \"kubernetes.io/projected/596c5e6d-16b2-47b7-a052-a164aa944a7f-kube-api-access-6grln\") pod \"596c5e6d-16b2-47b7-a052-a164aa944a7f\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.229886 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-operator-metrics\") pod \"4fdf7757-4d06-43cc-be27-0be0a8155b11\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.229911 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-catalog-content\") pod \"596c5e6d-16b2-47b7-a052-a164aa944a7f\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.229931 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-utilities\") pod \"596c5e6d-16b2-47b7-a052-a164aa944a7f\" (UID: \"596c5e6d-16b2-47b7-a052-a164aa944a7f\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.229976 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-trusted-ca\") pod \"4fdf7757-4d06-43cc-be27-0be0a8155b11\" (UID: \"4fdf7757-4d06-43cc-be27-0be0a8155b11\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.230834 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4fdf7757-4d06-43cc-be27-0be0a8155b11" (UID: "4fdf7757-4d06-43cc-be27-0be0a8155b11"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.231735 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-utilities" (OuterVolumeSpecName: "utilities") pod "596c5e6d-16b2-47b7-a052-a164aa944a7f" (UID: "596c5e6d-16b2-47b7-a052-a164aa944a7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.233619 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4fdf7757-4d06-43cc-be27-0be0a8155b11" (UID: "4fdf7757-4d06-43cc-be27-0be0a8155b11"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.233834 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596c5e6d-16b2-47b7-a052-a164aa944a7f-kube-api-access-6grln" (OuterVolumeSpecName: "kube-api-access-6grln") pod "596c5e6d-16b2-47b7-a052-a164aa944a7f" (UID: "596c5e6d-16b2-47b7-a052-a164aa944a7f"). InnerVolumeSpecName "kube-api-access-6grln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.237176 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fdf7757-4d06-43cc-be27-0be0a8155b11-kube-api-access-28t87" (OuterVolumeSpecName: "kube-api-access-28t87") pod "4fdf7757-4d06-43cc-be27-0be0a8155b11" (UID: "4fdf7757-4d06-43cc-be27-0be0a8155b11"). InnerVolumeSpecName "kube-api-access-28t87". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.265353 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "596c5e6d-16b2-47b7-a052-a164aa944a7f" (UID: "596c5e6d-16b2-47b7-a052-a164aa944a7f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.331217 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p5jc\" (UniqueName: \"kubernetes.io/projected/7a38e065-f09c-4182-ba61-d4db657510de-kube-api-access-6p5jc\") pod \"7a38e065-f09c-4182-ba61-d4db657510de\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.331572 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-catalog-content\") pod \"65f83e7b-3a72-450b-a54f-899aba4acd8a\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.331630 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-catalog-content\") pod \"7a38e065-f09c-4182-ba61-d4db657510de\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.331662 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9fh5\" (UniqueName: \"kubernetes.io/projected/65f83e7b-3a72-450b-a54f-899aba4acd8a-kube-api-access-x9fh5\") pod \"65f83e7b-3a72-450b-a54f-899aba4acd8a\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.331716 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-utilities\") pod \"65f83e7b-3a72-450b-a54f-899aba4acd8a\" (UID: \"65f83e7b-3a72-450b-a54f-899aba4acd8a\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.331746 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-utilities\") pod \"7a38e065-f09c-4182-ba61-d4db657510de\" (UID: \"7a38e065-f09c-4182-ba61-d4db657510de\") " Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.331948 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.331975 5033 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.331989 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28t87\" (UniqueName: \"kubernetes.io/projected/4fdf7757-4d06-43cc-be27-0be0a8155b11-kube-api-access-28t87\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.332000 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6grln\" (UniqueName: \"kubernetes.io/projected/596c5e6d-16b2-47b7-a052-a164aa944a7f-kube-api-access-6grln\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.332011 5033 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fdf7757-4d06-43cc-be27-0be0a8155b11-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.332023 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596c5e6d-16b2-47b7-a052-a164aa944a7f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.332691 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-utilities" (OuterVolumeSpecName: "utilities") pod "65f83e7b-3a72-450b-a54f-899aba4acd8a" (UID: "65f83e7b-3a72-450b-a54f-899aba4acd8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.332790 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-utilities" (OuterVolumeSpecName: "utilities") pod "7a38e065-f09c-4182-ba61-d4db657510de" (UID: "7a38e065-f09c-4182-ba61-d4db657510de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.334685 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f83e7b-3a72-450b-a54f-899aba4acd8a-kube-api-access-x9fh5" (OuterVolumeSpecName: "kube-api-access-x9fh5") pod "65f83e7b-3a72-450b-a54f-899aba4acd8a" (UID: "65f83e7b-3a72-450b-a54f-899aba4acd8a"). InnerVolumeSpecName "kube-api-access-x9fh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.340940 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a38e065-f09c-4182-ba61-d4db657510de-kube-api-access-6p5jc" (OuterVolumeSpecName: "kube-api-access-6p5jc") pod "7a38e065-f09c-4182-ba61-d4db657510de" (UID: "7a38e065-f09c-4182-ba61-d4db657510de"). InnerVolumeSpecName "kube-api-access-6p5jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.395140 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a38e065-f09c-4182-ba61-d4db657510de" (UID: "7a38e065-f09c-4182-ba61-d4db657510de"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.432440 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.432469 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.432479 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p5jc\" (UniqueName: \"kubernetes.io/projected/7a38e065-f09c-4182-ba61-d4db657510de-kube-api-access-6p5jc\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.432489 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a38e065-f09c-4182-ba61-d4db657510de-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.432498 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9fh5\" (UniqueName: \"kubernetes.io/projected/65f83e7b-3a72-450b-a54f-899aba4acd8a-kube-api-access-x9fh5\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.447491 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65f83e7b-3a72-450b-a54f-899aba4acd8a" (UID: "65f83e7b-3a72-450b-a54f-899aba4acd8a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.583381 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65f83e7b-3a72-450b-a54f-899aba4acd8a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.644723 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4br75" event={"ID":"5049f268-de58-4e79-9deb-d08c2f33397c","Type":"ContainerDied","Data":"267ba42384a51bc68db366f899a6a99a71d428d8a12931122b26125f017049ef"} Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.644793 5033 scope.go:117] "RemoveContainer" containerID="8102b71cd1ea2248a8abbf1b2f431266fd8006c4e5e7ac694794ada6a7386d38" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.644860 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4br75" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.650340 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wctnw" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.650878 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wctnw" event={"ID":"596c5e6d-16b2-47b7-a052-a164aa944a7f","Type":"ContainerDied","Data":"cbb767710e944035572e13b4752206f8d56ee6617d089809d6dbc6958588ed48"} Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.653516 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" event={"ID":"4fdf7757-4d06-43cc-be27-0be0a8155b11","Type":"ContainerDied","Data":"e2ce26c3cb0b01b8fd00c000e9750607af4d5b091ef2066cfd45cda39500fe43"} Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.653618 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8mgvk" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.664891 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npgt5" event={"ID":"65f83e7b-3a72-450b-a54f-899aba4acd8a","Type":"ContainerDied","Data":"432f497f1bed32f4d0107d79edd8ef44146810ef5c18734e11319965a2a2f651"} Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.664983 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npgt5" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.667083 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" event={"ID":"3c5e786b-b322-4756-9ad5-678d16b4b9f6","Type":"ContainerStarted","Data":"83ef2d9bb6bb9e33f5fb2cabb2739e65f9cbd3e4c403982d997081222f156328"} Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.667142 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" event={"ID":"3c5e786b-b322-4756-9ad5-678d16b4b9f6","Type":"ContainerStarted","Data":"c8ed281191253e68f25d6aa9492b250d5b2009c50aa34a3c4dd0eba0af175ed5"} Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.667237 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.669136 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kjzmq" event={"ID":"7a38e065-f09c-4182-ba61-d4db657510de","Type":"ContainerDied","Data":"a4d808434f981e953d255e24eef4ebf50c38dc65b6779a4fa2d03e39f5e535c8"} Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.669231 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kjzmq" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.672104 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.673693 5033 scope.go:117] "RemoveContainer" containerID="68c89a639e0a8745fbd5d4c8ce1781a2270809c01e72030a78e323e68208f099" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.690947 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dsvmf" podStartSLOduration=1.690922313 podStartE2EDuration="1.690922313s" podCreationTimestamp="2026-02-26 15:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:47:39.689487745 +0000 UTC m=+386.902086405" watchObservedRunningTime="2026-02-26 15:47:39.690922313 +0000 UTC m=+386.903520973" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.702779 5033 scope.go:117] "RemoveContainer" containerID="c0c2b9132dc3f945fe67add326a93134658a66191bb4a2f25fd720d50bd9ce68" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.744784 5033 scope.go:117] "RemoveContainer" containerID="3bbd3ff437a4b73696e2ff50f30aff7e0e60498900c83b708912f10597e8ec23" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.744911 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8mgvk"] Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.763723 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8mgvk"] Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.771115 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kjzmq"] Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 
15:47:39.771803 5033 scope.go:117] "RemoveContainer" containerID="6b24334721a668295eb067b97461f033406e54e8fa89b758ad3b94664ff95f6e" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.799153 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kjzmq"] Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.800467 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wctnw"] Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.804404 5033 scope.go:117] "RemoveContainer" containerID="4b5d0b4caa3a96117deadec9b600360ca80a8b5d6be802c3feb1259491733efa" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.804471 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wctnw"] Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.808554 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4br75"] Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.811981 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4br75"] Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.815500 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npgt5"] Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.815922 5033 scope.go:117] "RemoveContainer" containerID="b7cb3c95697725b9fdf4bc02b14c4b27bcd7db93e1be2115c64a29ebfb6e94a5" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.818469 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-npgt5"] Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.832436 5033 scope.go:117] "RemoveContainer" containerID="22d0f7f4ccb6f101d1b620fb32ce4e44cf84dddb9e732145ceafe18dcd48b49e" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.853329 5033 scope.go:117] "RemoveContainer" 
containerID="927d38af4dbda54e8446d731cc611fdb6063d2f7a8b9ef4abcd392f1c5777f88" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.869965 5033 scope.go:117] "RemoveContainer" containerID="9e2324306761b1d114a6b1eac3c1903531e40aabdb7f08e3001b2abce0b28140" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.885612 5033 scope.go:117] "RemoveContainer" containerID="cb437783cf4bd310d39365090e25aaf5d90f48190ebf201e29effe6ab29f8ed3" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.898464 5033 scope.go:117] "RemoveContainer" containerID="0ef3d0c9a98070414756a5c0c82627b297ff7c2ff2ef6620d8519a3376b0b4e5" Feb 26 15:47:39 crc kubenswrapper[5033]: I0226 15:47:39.912947 5033 scope.go:117] "RemoveContainer" containerID="ff3a18365a2a548b36c9cc426fcae81f56aaebb13ce4c5a4cfc2f8c1a6be2085" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215039 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9b5td"] Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215377 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a38e065-f09c-4182-ba61-d4db657510de" containerName="extract-utilities" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215420 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a38e065-f09c-4182-ba61-d4db657510de" containerName="extract-utilities" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215440 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f83e7b-3a72-450b-a54f-899aba4acd8a" containerName="extract-content" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215452 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f83e7b-3a72-450b-a54f-899aba4acd8a" containerName="extract-content" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215471 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5049f268-de58-4e79-9deb-d08c2f33397c" containerName="registry-server" Feb 26 15:47:40 crc 
kubenswrapper[5033]: I0226 15:47:40.215481 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5049f268-de58-4e79-9deb-d08c2f33397c" containerName="registry-server" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215496 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596c5e6d-16b2-47b7-a052-a164aa944a7f" containerName="extract-utilities" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215510 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="596c5e6d-16b2-47b7-a052-a164aa944a7f" containerName="extract-utilities" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215529 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5049f268-de58-4e79-9deb-d08c2f33397c" containerName="extract-content" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215540 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5049f268-de58-4e79-9deb-d08c2f33397c" containerName="extract-content" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215558 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f83e7b-3a72-450b-a54f-899aba4acd8a" containerName="extract-utilities" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215568 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f83e7b-3a72-450b-a54f-899aba4acd8a" containerName="extract-utilities" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215590 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596c5e6d-16b2-47b7-a052-a164aa944a7f" containerName="extract-content" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215600 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="596c5e6d-16b2-47b7-a052-a164aa944a7f" containerName="extract-content" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215618 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f83e7b-3a72-450b-a54f-899aba4acd8a" containerName="registry-server" Feb 26 15:47:40 crc 
kubenswrapper[5033]: I0226 15:47:40.215628 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f83e7b-3a72-450b-a54f-899aba4acd8a" containerName="registry-server" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215640 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a38e065-f09c-4182-ba61-d4db657510de" containerName="registry-server" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215651 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a38e065-f09c-4182-ba61-d4db657510de" containerName="registry-server" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215664 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596c5e6d-16b2-47b7-a052-a164aa944a7f" containerName="registry-server" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215674 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="596c5e6d-16b2-47b7-a052-a164aa944a7f" containerName="registry-server" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215690 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a38e065-f09c-4182-ba61-d4db657510de" containerName="extract-content" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215700 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a38e065-f09c-4182-ba61-d4db657510de" containerName="extract-content" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215714 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdf7757-4d06-43cc-be27-0be0a8155b11" containerName="marketplace-operator" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215727 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdf7757-4d06-43cc-be27-0be0a8155b11" containerName="marketplace-operator" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.215743 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5049f268-de58-4e79-9deb-d08c2f33397c" containerName="extract-utilities" Feb 26 15:47:40 crc 
kubenswrapper[5033]: I0226 15:47:40.215756 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5049f268-de58-4e79-9deb-d08c2f33397c" containerName="extract-utilities" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215903 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fdf7757-4d06-43cc-be27-0be0a8155b11" containerName="marketplace-operator" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215928 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fdf7757-4d06-43cc-be27-0be0a8155b11" containerName="marketplace-operator" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215942 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a38e065-f09c-4182-ba61-d4db657510de" containerName="registry-server" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215955 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f83e7b-3a72-450b-a54f-899aba4acd8a" containerName="registry-server" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215969 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="5049f268-de58-4e79-9deb-d08c2f33397c" containerName="registry-server" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.215981 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="596c5e6d-16b2-47b7-a052-a164aa944a7f" containerName="registry-server" Feb 26 15:47:40 crc kubenswrapper[5033]: E0226 15:47:40.216127 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdf7757-4d06-43cc-be27-0be0a8155b11" containerName="marketplace-operator" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.216142 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdf7757-4d06-43cc-be27-0be0a8155b11" containerName="marketplace-operator" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.217172 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.218759 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b5td"] Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.219727 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.292614 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/215f71fa-45e7-4a6b-9401-60e194203f62-catalog-content\") pod \"redhat-marketplace-9b5td\" (UID: \"215f71fa-45e7-4a6b-9401-60e194203f62\") " pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.292730 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfsrp\" (UniqueName: \"kubernetes.io/projected/215f71fa-45e7-4a6b-9401-60e194203f62-kube-api-access-dfsrp\") pod \"redhat-marketplace-9b5td\" (UID: \"215f71fa-45e7-4a6b-9401-60e194203f62\") " pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.292774 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/215f71fa-45e7-4a6b-9401-60e194203f62-utilities\") pod \"redhat-marketplace-9b5td\" (UID: \"215f71fa-45e7-4a6b-9401-60e194203f62\") " pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.393458 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfsrp\" (UniqueName: \"kubernetes.io/projected/215f71fa-45e7-4a6b-9401-60e194203f62-kube-api-access-dfsrp\") pod \"redhat-marketplace-9b5td\" (UID: 
\"215f71fa-45e7-4a6b-9401-60e194203f62\") " pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.393549 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/215f71fa-45e7-4a6b-9401-60e194203f62-utilities\") pod \"redhat-marketplace-9b5td\" (UID: \"215f71fa-45e7-4a6b-9401-60e194203f62\") " pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.393615 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/215f71fa-45e7-4a6b-9401-60e194203f62-catalog-content\") pod \"redhat-marketplace-9b5td\" (UID: \"215f71fa-45e7-4a6b-9401-60e194203f62\") " pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.394529 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/215f71fa-45e7-4a6b-9401-60e194203f62-catalog-content\") pod \"redhat-marketplace-9b5td\" (UID: \"215f71fa-45e7-4a6b-9401-60e194203f62\") " pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.394846 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/215f71fa-45e7-4a6b-9401-60e194203f62-utilities\") pod \"redhat-marketplace-9b5td\" (UID: \"215f71fa-45e7-4a6b-9401-60e194203f62\") " pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.408709 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qdwq8"] Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.409926 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.413166 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.430534 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfsrp\" (UniqueName: \"kubernetes.io/projected/215f71fa-45e7-4a6b-9401-60e194203f62-kube-api-access-dfsrp\") pod \"redhat-marketplace-9b5td\" (UID: \"215f71fa-45e7-4a6b-9401-60e194203f62\") " pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.437750 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdwq8"] Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.495494 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkfln\" (UniqueName: \"kubernetes.io/projected/bd27068a-bd2d-482e-90ee-144ad9348bbe-kube-api-access-fkfln\") pod \"community-operators-qdwq8\" (UID: \"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.495657 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-utilities\") pod \"community-operators-qdwq8\" (UID: \"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.495691 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-catalog-content\") pod \"community-operators-qdwq8\" (UID: 
\"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.540897 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.596926 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkfln\" (UniqueName: \"kubernetes.io/projected/bd27068a-bd2d-482e-90ee-144ad9348bbe-kube-api-access-fkfln\") pod \"community-operators-qdwq8\" (UID: \"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.597466 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-utilities\") pod \"community-operators-qdwq8\" (UID: \"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.597505 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-catalog-content\") pod \"community-operators-qdwq8\" (UID: \"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.598161 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-catalog-content\") pod \"community-operators-qdwq8\" (UID: \"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.598632 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-utilities\") pod \"community-operators-qdwq8\" (UID: \"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.631241 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkfln\" (UniqueName: \"kubernetes.io/projected/bd27068a-bd2d-482e-90ee-144ad9348bbe-kube-api-access-fkfln\") pod \"community-operators-qdwq8\" (UID: \"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.728930 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:40 crc kubenswrapper[5033]: I0226 15:47:40.923311 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdwq8"] Feb 26 15:47:40 crc kubenswrapper[5033]: W0226 15:47:40.925772 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd27068a_bd2d_482e_90ee_144ad9348bbe.slice/crio-097ac23ab0d6ca9a33369ffc39f056ea9d6f78d5a12dfa3e0a41bc10b7ac19e5 WatchSource:0}: Error finding container 097ac23ab0d6ca9a33369ffc39f056ea9d6f78d5a12dfa3e0a41bc10b7ac19e5: Status 404 returned error can't find the container with id 097ac23ab0d6ca9a33369ffc39f056ea9d6f78d5a12dfa3e0a41bc10b7ac19e5 Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.024076 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9b5td"] Feb 26 15:47:41 crc kubenswrapper[5033]: W0226 15:47:41.030244 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod215f71fa_45e7_4a6b_9401_60e194203f62.slice/crio-027329b0db1d88584e4fa8cee0ff7ce52b6fd8c39af4deda92abdda993588224 WatchSource:0}: Error finding container 027329b0db1d88584e4fa8cee0ff7ce52b6fd8c39af4deda92abdda993588224: Status 404 returned error can't find the container with id 027329b0db1d88584e4fa8cee0ff7ce52b6fd8c39af4deda92abdda993588224 Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.637358 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fdf7757-4d06-43cc-be27-0be0a8155b11" path="/var/lib/kubelet/pods/4fdf7757-4d06-43cc-be27-0be0a8155b11/volumes" Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.638412 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5049f268-de58-4e79-9deb-d08c2f33397c" path="/var/lib/kubelet/pods/5049f268-de58-4e79-9deb-d08c2f33397c/volumes" Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.639674 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596c5e6d-16b2-47b7-a052-a164aa944a7f" path="/var/lib/kubelet/pods/596c5e6d-16b2-47b7-a052-a164aa944a7f/volumes" Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.641815 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f83e7b-3a72-450b-a54f-899aba4acd8a" path="/var/lib/kubelet/pods/65f83e7b-3a72-450b-a54f-899aba4acd8a/volumes" Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.643069 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a38e065-f09c-4182-ba61-d4db657510de" path="/var/lib/kubelet/pods/7a38e065-f09c-4182-ba61-d4db657510de/volumes" Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.689607 5033 generic.go:334] "Generic (PLEG): container finished" podID="215f71fa-45e7-4a6b-9401-60e194203f62" containerID="4e68c9b90dab53ba11333957596d03ac085f16a016c2a32438c713fefd9cc2e9" exitCode=0 Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.689731 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b5td" event={"ID":"215f71fa-45e7-4a6b-9401-60e194203f62","Type":"ContainerDied","Data":"4e68c9b90dab53ba11333957596d03ac085f16a016c2a32438c713fefd9cc2e9"} Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.689794 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b5td" event={"ID":"215f71fa-45e7-4a6b-9401-60e194203f62","Type":"ContainerStarted","Data":"027329b0db1d88584e4fa8cee0ff7ce52b6fd8c39af4deda92abdda993588224"} Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.694844 5033 generic.go:334] "Generic (PLEG): container finished" podID="bd27068a-bd2d-482e-90ee-144ad9348bbe" containerID="a6820d7de8be1d492d65feb22e083387bf8b9cbae9c778e9826bef5c3b92b148" exitCode=0 Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.694916 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdwq8" event={"ID":"bd27068a-bd2d-482e-90ee-144ad9348bbe","Type":"ContainerDied","Data":"a6820d7de8be1d492d65feb22e083387bf8b9cbae9c778e9826bef5c3b92b148"} Feb 26 15:47:41 crc kubenswrapper[5033]: I0226 15:47:41.694943 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdwq8" event={"ID":"bd27068a-bd2d-482e-90ee-144ad9348bbe","Type":"ContainerStarted","Data":"097ac23ab0d6ca9a33369ffc39f056ea9d6f78d5a12dfa3e0a41bc10b7ac19e5"} Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.605746 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rthm4"] Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.607451 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.610415 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.630673 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rthm4"] Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.724922 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd9fk\" (UniqueName: \"kubernetes.io/projected/99b57b7d-f577-4511-99db-fb69eab2bfdc-kube-api-access-rd9fk\") pod \"certified-operators-rthm4\" (UID: \"99b57b7d-f577-4511-99db-fb69eab2bfdc\") " pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.725030 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b57b7d-f577-4511-99db-fb69eab2bfdc-catalog-content\") pod \"certified-operators-rthm4\" (UID: \"99b57b7d-f577-4511-99db-fb69eab2bfdc\") " pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.725163 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b57b7d-f577-4511-99db-fb69eab2bfdc-utilities\") pod \"certified-operators-rthm4\" (UID: \"99b57b7d-f577-4511-99db-fb69eab2bfdc\") " pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.806289 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-svqb2"] Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.807143 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.809965 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.820578 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-svqb2"] Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.826668 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b57b7d-f577-4511-99db-fb69eab2bfdc-catalog-content\") pod \"certified-operators-rthm4\" (UID: \"99b57b7d-f577-4511-99db-fb69eab2bfdc\") " pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.826779 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b57b7d-f577-4511-99db-fb69eab2bfdc-utilities\") pod \"certified-operators-rthm4\" (UID: \"99b57b7d-f577-4511-99db-fb69eab2bfdc\") " pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.826860 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9fk\" (UniqueName: \"kubernetes.io/projected/99b57b7d-f577-4511-99db-fb69eab2bfdc-kube-api-access-rd9fk\") pod \"certified-operators-rthm4\" (UID: \"99b57b7d-f577-4511-99db-fb69eab2bfdc\") " pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.827724 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99b57b7d-f577-4511-99db-fb69eab2bfdc-catalog-content\") pod \"certified-operators-rthm4\" (UID: \"99b57b7d-f577-4511-99db-fb69eab2bfdc\") " 
pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.828813 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99b57b7d-f577-4511-99db-fb69eab2bfdc-utilities\") pod \"certified-operators-rthm4\" (UID: \"99b57b7d-f577-4511-99db-fb69eab2bfdc\") " pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.850754 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9fk\" (UniqueName: \"kubernetes.io/projected/99b57b7d-f577-4511-99db-fb69eab2bfdc-kube-api-access-rd9fk\") pod \"certified-operators-rthm4\" (UID: \"99b57b7d-f577-4511-99db-fb69eab2bfdc\") " pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.927816 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-utilities\") pod \"redhat-operators-svqb2\" (UID: \"1171e541-6384-4918-b83b-1eeff776308f\") " pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.927913 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmhn5\" (UniqueName: \"kubernetes.io/projected/1171e541-6384-4918-b83b-1eeff776308f-kube-api-access-nmhn5\") pod \"redhat-operators-svqb2\" (UID: \"1171e541-6384-4918-b83b-1eeff776308f\") " pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.927964 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-catalog-content\") pod \"redhat-operators-svqb2\" (UID: 
\"1171e541-6384-4918-b83b-1eeff776308f\") " pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 15:47:42 crc kubenswrapper[5033]: I0226 15:47:42.971768 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.028737 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmhn5\" (UniqueName: \"kubernetes.io/projected/1171e541-6384-4918-b83b-1eeff776308f-kube-api-access-nmhn5\") pod \"redhat-operators-svqb2\" (UID: \"1171e541-6384-4918-b83b-1eeff776308f\") " pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.028809 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-catalog-content\") pod \"redhat-operators-svqb2\" (UID: \"1171e541-6384-4918-b83b-1eeff776308f\") " pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.028859 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-utilities\") pod \"redhat-operators-svqb2\" (UID: \"1171e541-6384-4918-b83b-1eeff776308f\") " pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.029424 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-utilities\") pod \"redhat-operators-svqb2\" (UID: \"1171e541-6384-4918-b83b-1eeff776308f\") " pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.029805 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-catalog-content\") pod \"redhat-operators-svqb2\" (UID: \"1171e541-6384-4918-b83b-1eeff776308f\") " pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.058236 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmhn5\" (UniqueName: \"kubernetes.io/projected/1171e541-6384-4918-b83b-1eeff776308f-kube-api-access-nmhn5\") pod \"redhat-operators-svqb2\" (UID: \"1171e541-6384-4918-b83b-1eeff776308f\") " pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.182647 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rthm4"] Feb 26 15:47:43 crc kubenswrapper[5033]: W0226 15:47:43.198807 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b57b7d_f577_4511_99db_fb69eab2bfdc.slice/crio-708a97351b19da7f4b28c9844201a407764a44990a6af20d5a4bf38fa349ddc7 WatchSource:0}: Error finding container 708a97351b19da7f4b28c9844201a407764a44990a6af20d5a4bf38fa349ddc7: Status 404 returned error can't find the container with id 708a97351b19da7f4b28c9844201a407764a44990a6af20d5a4bf38fa349ddc7 Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.255348 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 15:47:43 crc kubenswrapper[5033]: W0226 15:47:43.450495 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1171e541_6384_4918_b83b_1eeff776308f.slice/crio-71514f8e574bafc481750be954b016bc88fb47b1189a681532c982e18306699a WatchSource:0}: Error finding container 71514f8e574bafc481750be954b016bc88fb47b1189a681532c982e18306699a: Status 404 returned error can't find the container with id 71514f8e574bafc481750be954b016bc88fb47b1189a681532c982e18306699a Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.453177 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-svqb2"] Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.716957 5033 generic.go:334] "Generic (PLEG): container finished" podID="1171e541-6384-4918-b83b-1eeff776308f" containerID="a93dbb1f3d946e7230d163d231dfe67800e8f9a3815c1f0a62a274754e1af53c" exitCode=0 Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.717026 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svqb2" event={"ID":"1171e541-6384-4918-b83b-1eeff776308f","Type":"ContainerDied","Data":"a93dbb1f3d946e7230d163d231dfe67800e8f9a3815c1f0a62a274754e1af53c"} Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.717386 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svqb2" event={"ID":"1171e541-6384-4918-b83b-1eeff776308f","Type":"ContainerStarted","Data":"71514f8e574bafc481750be954b016bc88fb47b1189a681532c982e18306699a"} Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.721072 5033 generic.go:334] "Generic (PLEG): container finished" podID="99b57b7d-f577-4511-99db-fb69eab2bfdc" containerID="a111e74033c07463ab151930e038efad7401f6be753f5d1e8ca5fbf89351f142" exitCode=0 Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.721148 
5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rthm4" event={"ID":"99b57b7d-f577-4511-99db-fb69eab2bfdc","Type":"ContainerDied","Data":"a111e74033c07463ab151930e038efad7401f6be753f5d1e8ca5fbf89351f142"} Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.721212 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rthm4" event={"ID":"99b57b7d-f577-4511-99db-fb69eab2bfdc","Type":"ContainerStarted","Data":"708a97351b19da7f4b28c9844201a407764a44990a6af20d5a4bf38fa349ddc7"} Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.725438 5033 generic.go:334] "Generic (PLEG): container finished" podID="bd27068a-bd2d-482e-90ee-144ad9348bbe" containerID="46349aacb4bca15f1734489f5f1b6253d0f3c35321ff5f28502424435d8c1145" exitCode=0 Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.725476 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdwq8" event={"ID":"bd27068a-bd2d-482e-90ee-144ad9348bbe","Type":"ContainerDied","Data":"46349aacb4bca15f1734489f5f1b6253d0f3c35321ff5f28502424435d8c1145"} Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.730472 5033 generic.go:334] "Generic (PLEG): container finished" podID="215f71fa-45e7-4a6b-9401-60e194203f62" containerID="9771db848d6493e90e1449b504a2d90aa9bdc17ee1cc2e1a6f4471c87926021d" exitCode=0 Feb 26 15:47:43 crc kubenswrapper[5033]: I0226 15:47:43.730507 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b5td" event={"ID":"215f71fa-45e7-4a6b-9401-60e194203f62","Type":"ContainerDied","Data":"9771db848d6493e90e1449b504a2d90aa9bdc17ee1cc2e1a6f4471c87926021d"} Feb 26 15:47:44 crc kubenswrapper[5033]: I0226 15:47:44.740792 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdwq8" 
event={"ID":"bd27068a-bd2d-482e-90ee-144ad9348bbe","Type":"ContainerStarted","Data":"33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431"} Feb 26 15:47:44 crc kubenswrapper[5033]: I0226 15:47:44.760702 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qdwq8" podStartSLOduration=2.103655993 podStartE2EDuration="4.760677608s" podCreationTimestamp="2026-02-26 15:47:40 +0000 UTC" firstStartedPulling="2026-02-26 15:47:41.700559277 +0000 UTC m=+388.913157967" lastFinishedPulling="2026-02-26 15:47:44.357580922 +0000 UTC m=+391.570179582" observedRunningTime="2026-02-26 15:47:44.75779237 +0000 UTC m=+391.970391060" watchObservedRunningTime="2026-02-26 15:47:44.760677608 +0000 UTC m=+391.973276268" Feb 26 15:47:45 crc kubenswrapper[5033]: I0226 15:47:45.747984 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9b5td" event={"ID":"215f71fa-45e7-4a6b-9401-60e194203f62","Type":"ContainerStarted","Data":"91f1046ef5c3fd30378cafd7defe5f0111c1efee09ac77f63f00685f7b3215ec"} Feb 26 15:47:45 crc kubenswrapper[5033]: I0226 15:47:45.750872 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svqb2" event={"ID":"1171e541-6384-4918-b83b-1eeff776308f","Type":"ContainerStarted","Data":"43a08a482fe2817cc2b3bf99b789d9c3fc55e937a654787bc31b29b4d1db6b2e"} Feb 26 15:47:45 crc kubenswrapper[5033]: I0226 15:47:45.753580 5033 generic.go:334] "Generic (PLEG): container finished" podID="99b57b7d-f577-4511-99db-fb69eab2bfdc" containerID="d5a7639576a415f36a21acf923ba19e422734db7f1317d96d2d7895b432c33da" exitCode=0 Feb 26 15:47:45 crc kubenswrapper[5033]: I0226 15:47:45.753643 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rthm4" 
event={"ID":"99b57b7d-f577-4511-99db-fb69eab2bfdc","Type":"ContainerDied","Data":"d5a7639576a415f36a21acf923ba19e422734db7f1317d96d2d7895b432c33da"} Feb 26 15:47:45 crc kubenswrapper[5033]: I0226 15:47:45.777795 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9b5td" podStartSLOduration=2.637063285 podStartE2EDuration="5.777773253s" podCreationTimestamp="2026-02-26 15:47:40 +0000 UTC" firstStartedPulling="2026-02-26 15:47:41.691583792 +0000 UTC m=+388.904182482" lastFinishedPulling="2026-02-26 15:47:44.8322938 +0000 UTC m=+392.044892450" observedRunningTime="2026-02-26 15:47:45.772440306 +0000 UTC m=+392.985038956" watchObservedRunningTime="2026-02-26 15:47:45.777773253 +0000 UTC m=+392.990371943" Feb 26 15:47:46 crc kubenswrapper[5033]: I0226 15:47:46.770385 5033 generic.go:334] "Generic (PLEG): container finished" podID="1171e541-6384-4918-b83b-1eeff776308f" containerID="43a08a482fe2817cc2b3bf99b789d9c3fc55e937a654787bc31b29b4d1db6b2e" exitCode=0 Feb 26 15:47:46 crc kubenswrapper[5033]: I0226 15:47:46.771122 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svqb2" event={"ID":"1171e541-6384-4918-b83b-1eeff776308f","Type":"ContainerDied","Data":"43a08a482fe2817cc2b3bf99b789d9c3fc55e937a654787bc31b29b4d1db6b2e"} Feb 26 15:47:46 crc kubenswrapper[5033]: I0226 15:47:46.791069 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rthm4" event={"ID":"99b57b7d-f577-4511-99db-fb69eab2bfdc","Type":"ContainerStarted","Data":"48c3cd31086698281d7f873d4db6279fd6cbcad780c0be7170c596549579a974"} Feb 26 15:47:46 crc kubenswrapper[5033]: I0226 15:47:46.827731 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rthm4" podStartSLOduration=2.258672994 podStartE2EDuration="4.827709454s" podCreationTimestamp="2026-02-26 15:47:42 +0000 UTC" 
firstStartedPulling="2026-02-26 15:47:43.723125909 +0000 UTC m=+390.935724569" lastFinishedPulling="2026-02-26 15:47:46.292162379 +0000 UTC m=+393.504761029" observedRunningTime="2026-02-26 15:47:46.82161137 +0000 UTC m=+394.034210030" watchObservedRunningTime="2026-02-26 15:47:46.827709454 +0000 UTC m=+394.040308104" Feb 26 15:47:47 crc kubenswrapper[5033]: I0226 15:47:47.815671 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svqb2" event={"ID":"1171e541-6384-4918-b83b-1eeff776308f","Type":"ContainerStarted","Data":"6c05275a297a57bd37b28881816a31e844d667470d8c5ebd534665d84c1ed373"} Feb 26 15:47:47 crc kubenswrapper[5033]: I0226 15:47:47.834641 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-svqb2" podStartSLOduration=2.325998895 podStartE2EDuration="5.834625069s" podCreationTimestamp="2026-02-26 15:47:42 +0000 UTC" firstStartedPulling="2026-02-26 15:47:43.719983414 +0000 UTC m=+390.932582094" lastFinishedPulling="2026-02-26 15:47:47.228609608 +0000 UTC m=+394.441208268" observedRunningTime="2026-02-26 15:47:47.83253118 +0000 UTC m=+395.045129830" watchObservedRunningTime="2026-02-26 15:47:47.834625069 +0000 UTC m=+395.047223719" Feb 26 15:47:50 crc kubenswrapper[5033]: I0226 15:47:50.189305 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:47:50 crc kubenswrapper[5033]: I0226 15:47:50.189369 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Feb 26 15:47:50 crc kubenswrapper[5033]: I0226 15:47:50.541601 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:50 crc kubenswrapper[5033]: I0226 15:47:50.541666 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:50 crc kubenswrapper[5033]: I0226 15:47:50.595683 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:50 crc kubenswrapper[5033]: I0226 15:47:50.730495 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:50 crc kubenswrapper[5033]: I0226 15:47:50.730598 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:50 crc kubenswrapper[5033]: I0226 15:47:50.772081 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:50 crc kubenswrapper[5033]: I0226 15:47:50.883056 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9b5td" Feb 26 15:47:50 crc kubenswrapper[5033]: I0226 15:47:50.888401 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qdwq8" Feb 26 15:47:52 crc kubenswrapper[5033]: I0226 15:47:52.972853 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:52 crc kubenswrapper[5033]: I0226 15:47:52.973198 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rthm4" Feb 26 15:47:53 crc kubenswrapper[5033]: I0226 15:47:53.017044 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rthm4"
Feb 26 15:47:53 crc kubenswrapper[5033]: I0226 15:47:53.255886 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-svqb2"
Feb 26 15:47:53 crc kubenswrapper[5033]: I0226 15:47:53.255954 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-svqb2"
Feb 26 15:47:53 crc kubenswrapper[5033]: I0226 15:47:53.906515 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rthm4"
Feb 26 15:47:54 crc kubenswrapper[5033]: I0226 15:47:54.319200 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svqb2" podUID="1171e541-6384-4918-b83b-1eeff776308f" containerName="registry-server" probeResult="failure" output=<
Feb 26 15:47:54 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s
Feb 26 15:47:54 crc kubenswrapper[5033]: >
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.148579 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535348-fmnfx"]
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.150405 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-fmnfx"
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.152122 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.155654 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.157095 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535348-fmnfx"]
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.157908 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.265018 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s64lp\" (UniqueName: \"kubernetes.io/projected/859a38e3-1926-4c07-8ae1-2745516cffa0-kube-api-access-s64lp\") pod \"auto-csr-approver-29535348-fmnfx\" (UID: \"859a38e3-1926-4c07-8ae1-2745516cffa0\") " pod="openshift-infra/auto-csr-approver-29535348-fmnfx"
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.367220 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s64lp\" (UniqueName: \"kubernetes.io/projected/859a38e3-1926-4c07-8ae1-2745516cffa0-kube-api-access-s64lp\") pod \"auto-csr-approver-29535348-fmnfx\" (UID: \"859a38e3-1926-4c07-8ae1-2745516cffa0\") " pod="openshift-infra/auto-csr-approver-29535348-fmnfx"
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.400499 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s64lp\" (UniqueName: \"kubernetes.io/projected/859a38e3-1926-4c07-8ae1-2745516cffa0-kube-api-access-s64lp\") pod \"auto-csr-approver-29535348-fmnfx\" (UID: \"859a38e3-1926-4c07-8ae1-2745516cffa0\") " pod="openshift-infra/auto-csr-approver-29535348-fmnfx"
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.471317 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-fmnfx"
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.685924 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535348-fmnfx"]
Feb 26 15:48:00 crc kubenswrapper[5033]: I0226 15:48:00.891289 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535348-fmnfx" event={"ID":"859a38e3-1926-4c07-8ae1-2745516cffa0","Type":"ContainerStarted","Data":"436ab691f9d7ee6e031367345526ea1aeef5c077ecd2778c9aa0939f3b42b19d"}
Feb 26 15:48:02 crc kubenswrapper[5033]: I0226 15:48:02.905388 5033 generic.go:334] "Generic (PLEG): container finished" podID="859a38e3-1926-4c07-8ae1-2745516cffa0" containerID="ce3e7ed8c304bf24b1ef3e58aa4817fe75e653e695189f4925a354922817be54" exitCode=0
Feb 26 15:48:02 crc kubenswrapper[5033]: I0226 15:48:02.905527 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535348-fmnfx" event={"ID":"859a38e3-1926-4c07-8ae1-2745516cffa0","Type":"ContainerDied","Data":"ce3e7ed8c304bf24b1ef3e58aa4817fe75e653e695189f4925a354922817be54"}
Feb 26 15:48:03 crc kubenswrapper[5033]: I0226 15:48:03.325712 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-svqb2"
Feb 26 15:48:03 crc kubenswrapper[5033]: I0226 15:48:03.388656 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-svqb2"
Feb 26 15:48:04 crc kubenswrapper[5033]: I0226 15:48:04.158371 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-fmnfx"
Feb 26 15:48:04 crc kubenswrapper[5033]: I0226 15:48:04.317575 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s64lp\" (UniqueName: \"kubernetes.io/projected/859a38e3-1926-4c07-8ae1-2745516cffa0-kube-api-access-s64lp\") pod \"859a38e3-1926-4c07-8ae1-2745516cffa0\" (UID: \"859a38e3-1926-4c07-8ae1-2745516cffa0\") "
Feb 26 15:48:04 crc kubenswrapper[5033]: I0226 15:48:04.326794 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/859a38e3-1926-4c07-8ae1-2745516cffa0-kube-api-access-s64lp" (OuterVolumeSpecName: "kube-api-access-s64lp") pod "859a38e3-1926-4c07-8ae1-2745516cffa0" (UID: "859a38e3-1926-4c07-8ae1-2745516cffa0"). InnerVolumeSpecName "kube-api-access-s64lp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:48:04 crc kubenswrapper[5033]: I0226 15:48:04.419981 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s64lp\" (UniqueName: \"kubernetes.io/projected/859a38e3-1926-4c07-8ae1-2745516cffa0-kube-api-access-s64lp\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:04 crc kubenswrapper[5033]: I0226 15:48:04.919155 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535348-fmnfx" event={"ID":"859a38e3-1926-4c07-8ae1-2745516cffa0","Type":"ContainerDied","Data":"436ab691f9d7ee6e031367345526ea1aeef5c077ecd2778c9aa0939f3b42b19d"}
Feb 26 15:48:04 crc kubenswrapper[5033]: I0226 15:48:04.919194 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="436ab691f9d7ee6e031367345526ea1aeef5c077ecd2778c9aa0939f3b42b19d"
Feb 26 15:48:04 crc kubenswrapper[5033]: I0226 15:48:04.919217 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-fmnfx"
Feb 26 15:48:15 crc kubenswrapper[5033]: I0226 15:48:15.833135 5033 scope.go:117] "RemoveContainer" containerID="197f920fa47fcbaf0cbf3ec395375f94ed5987d0c4a41304c0f91de4f42c2413"
Feb 26 15:48:15 crc kubenswrapper[5033]: I0226 15:48:15.855745 5033 scope.go:117] "RemoveContainer" containerID="2ee34457003ba51d3d5b231d494ceec240dc1b98dc66a709ba91c149f761aa0e"
Feb 26 15:48:15 crc kubenswrapper[5033]: I0226 15:48:15.876782 5033 scope.go:117] "RemoveContainer" containerID="7cc9fb46f1235beacf00d3a8c240bde10f977dcaa7fc3c8a1fa0c07d9c81a203"
Feb 26 15:48:20 crc kubenswrapper[5033]: I0226 15:48:20.189216 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 15:48:20 crc kubenswrapper[5033]: I0226 15:48:20.189599 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 15:48:50 crc kubenswrapper[5033]: I0226 15:48:50.189458 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 15:48:50 crc kubenswrapper[5033]: I0226 15:48:50.190102 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 15:48:50 crc kubenswrapper[5033]: I0226 15:48:50.190167 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5"
Feb 26 15:48:50 crc kubenswrapper[5033]: I0226 15:48:50.190914 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4fc44b0fd7677be1be0b3e7cc01e29c502f9cf985c2a2bb180a7227a759d93cf"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 15:48:50 crc kubenswrapper[5033]: I0226 15:48:50.190985 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://4fc44b0fd7677be1be0b3e7cc01e29c502f9cf985c2a2bb180a7227a759d93cf" gracePeriod=600
Feb 26 15:48:51 crc kubenswrapper[5033]: I0226 15:48:51.265880 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="4fc44b0fd7677be1be0b3e7cc01e29c502f9cf985c2a2bb180a7227a759d93cf" exitCode=0
Feb 26 15:48:51 crc kubenswrapper[5033]: I0226 15:48:51.265994 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"4fc44b0fd7677be1be0b3e7cc01e29c502f9cf985c2a2bb180a7227a759d93cf"}
Feb 26 15:48:51 crc kubenswrapper[5033]: I0226 15:48:51.266314 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"70e42a035ba4f49db38bd22aeb347145c8b4c34b93000efaa8e848e164bec4f1"}
Feb 26 15:48:51 crc kubenswrapper[5033]: I0226 15:48:51.266346 5033 scope.go:117] "RemoveContainer" containerID="8c3b531c4becd39d370617a6d65865b0e8eb7fbdb2814420616e198b3e7ab4da"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.139168 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535350-45mph"]
Feb 26 15:50:00 crc kubenswrapper[5033]: E0226 15:50:00.140297 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="859a38e3-1926-4c07-8ae1-2745516cffa0" containerName="oc"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.140318 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="859a38e3-1926-4c07-8ae1-2745516cffa0" containerName="oc"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.140555 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="859a38e3-1926-4c07-8ae1-2745516cffa0" containerName="oc"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.141152 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535350-45mph"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.144341 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.144560 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.145905 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.146137 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535350-45mph"]
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.304684 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v47xb\" (UniqueName: \"kubernetes.io/projected/aa22708b-29dc-4833-ae29-d89fa7465f05-kube-api-access-v47xb\") pod \"auto-csr-approver-29535350-45mph\" (UID: \"aa22708b-29dc-4833-ae29-d89fa7465f05\") " pod="openshift-infra/auto-csr-approver-29535350-45mph"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.406130 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v47xb\" (UniqueName: \"kubernetes.io/projected/aa22708b-29dc-4833-ae29-d89fa7465f05-kube-api-access-v47xb\") pod \"auto-csr-approver-29535350-45mph\" (UID: \"aa22708b-29dc-4833-ae29-d89fa7465f05\") " pod="openshift-infra/auto-csr-approver-29535350-45mph"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.427329 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v47xb\" (UniqueName: \"kubernetes.io/projected/aa22708b-29dc-4833-ae29-d89fa7465f05-kube-api-access-v47xb\") pod \"auto-csr-approver-29535350-45mph\" (UID: \"aa22708b-29dc-4833-ae29-d89fa7465f05\") " pod="openshift-infra/auto-csr-approver-29535350-45mph"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.461806 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535350-45mph"
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.657197 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535350-45mph"]
Feb 26 15:50:00 crc kubenswrapper[5033]: I0226 15:50:00.662176 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 15:50:01 crc kubenswrapper[5033]: I0226 15:50:01.172752 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535350-45mph" event={"ID":"aa22708b-29dc-4833-ae29-d89fa7465f05","Type":"ContainerStarted","Data":"da456c6deeeb2e337f0163350a20bc40843643793ada63c8957ee666af5b12c4"}
Feb 26 15:50:02 crc kubenswrapper[5033]: I0226 15:50:02.183620 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535350-45mph" event={"ID":"aa22708b-29dc-4833-ae29-d89fa7465f05","Type":"ContainerStarted","Data":"b7d858f662b824e7399e6a2140f83edd1fd4b58e685f668288fab1993e10c90d"}
Feb 26 15:50:02 crc kubenswrapper[5033]: I0226 15:50:02.204751 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535350-45mph" podStartSLOduration=1.096147055 podStartE2EDuration="2.204725848s" podCreationTimestamp="2026-02-26 15:50:00 +0000 UTC" firstStartedPulling="2026-02-26 15:50:00.661908113 +0000 UTC m=+527.874506773" lastFinishedPulling="2026-02-26 15:50:01.770486916 +0000 UTC m=+528.983085566" observedRunningTime="2026-02-26 15:50:02.203072959 +0000 UTC m=+529.415671619" watchObservedRunningTime="2026-02-26 15:50:02.204725848 +0000 UTC m=+529.417324498"
Feb 26 15:50:03 crc kubenswrapper[5033]: I0226 15:50:03.193990 5033 generic.go:334] "Generic (PLEG): container finished" podID="aa22708b-29dc-4833-ae29-d89fa7465f05" containerID="b7d858f662b824e7399e6a2140f83edd1fd4b58e685f668288fab1993e10c90d" exitCode=0
Feb 26 15:50:03 crc kubenswrapper[5033]: I0226 15:50:03.194098 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535350-45mph" event={"ID":"aa22708b-29dc-4833-ae29-d89fa7465f05","Type":"ContainerDied","Data":"b7d858f662b824e7399e6a2140f83edd1fd4b58e685f668288fab1993e10c90d"}
Feb 26 15:50:04 crc kubenswrapper[5033]: I0226 15:50:04.415677 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535350-45mph"
Feb 26 15:50:04 crc kubenswrapper[5033]: I0226 15:50:04.559509 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47xb\" (UniqueName: \"kubernetes.io/projected/aa22708b-29dc-4833-ae29-d89fa7465f05-kube-api-access-v47xb\") pod \"aa22708b-29dc-4833-ae29-d89fa7465f05\" (UID: \"aa22708b-29dc-4833-ae29-d89fa7465f05\") "
Feb 26 15:50:04 crc kubenswrapper[5033]: I0226 15:50:04.564474 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa22708b-29dc-4833-ae29-d89fa7465f05-kube-api-access-v47xb" (OuterVolumeSpecName: "kube-api-access-v47xb") pod "aa22708b-29dc-4833-ae29-d89fa7465f05" (UID: "aa22708b-29dc-4833-ae29-d89fa7465f05"). InnerVolumeSpecName "kube-api-access-v47xb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:50:04 crc kubenswrapper[5033]: I0226 15:50:04.661626 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47xb\" (UniqueName: \"kubernetes.io/projected/aa22708b-29dc-4833-ae29-d89fa7465f05-kube-api-access-v47xb\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:05 crc kubenswrapper[5033]: I0226 15:50:05.214334 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535350-45mph" event={"ID":"aa22708b-29dc-4833-ae29-d89fa7465f05","Type":"ContainerDied","Data":"da456c6deeeb2e337f0163350a20bc40843643793ada63c8957ee666af5b12c4"}
Feb 26 15:50:05 crc kubenswrapper[5033]: I0226 15:50:05.214412 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da456c6deeeb2e337f0163350a20bc40843643793ada63c8957ee666af5b12c4"
Feb 26 15:50:05 crc kubenswrapper[5033]: I0226 15:50:05.214448 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535350-45mph"
Feb 26 15:50:05 crc kubenswrapper[5033]: I0226 15:50:05.271661 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535344-glfw9"]
Feb 26 15:50:05 crc kubenswrapper[5033]: I0226 15:50:05.275480 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535344-glfw9"]
Feb 26 15:50:05 crc kubenswrapper[5033]: I0226 15:50:05.635391 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea999e72-9356-4693-a1f4-ced6617d4b65" path="/var/lib/kubelet/pods/ea999e72-9356-4693-a1f4-ced6617d4b65/volumes"
Feb 26 15:50:15 crc kubenswrapper[5033]: I0226 15:50:15.981357 5033 scope.go:117] "RemoveContainer" containerID="2da09d43f341fda6117b35af9902a0dadd423ab611361a601e53c33f7c424abf"
Feb 26 15:50:16 crc kubenswrapper[5033]: I0226 15:50:16.015550 5033 scope.go:117] "RemoveContainer" containerID="05cd628789e65980a86fade8b907e76524f06a00fa636a0ef4705f36900fa4e1"
Feb 26 15:50:16 crc kubenswrapper[5033]: I0226 15:50:16.060469 5033 scope.go:117] "RemoveContainer" containerID="48e04d557923ed884258f88133ca1a399352dd3cb22d15a13d8afa5f7566c5ee"
Feb 26 15:50:50 crc kubenswrapper[5033]: I0226 15:50:50.188847 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 15:50:50 crc kubenswrapper[5033]: I0226 15:50:50.189474 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 15:51:20 crc kubenswrapper[5033]: I0226 15:51:20.189496 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 15:51:20 crc kubenswrapper[5033]: I0226 15:51:20.190287 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 15:51:50 crc kubenswrapper[5033]: I0226 15:51:50.189559 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 15:51:50 crc kubenswrapper[5033]: I0226 15:51:50.190578 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 15:51:50 crc kubenswrapper[5033]: I0226 15:51:50.190757 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5"
Feb 26 15:51:50 crc kubenswrapper[5033]: I0226 15:51:50.192345 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"70e42a035ba4f49db38bd22aeb347145c8b4c34b93000efaa8e848e164bec4f1"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 15:51:50 crc kubenswrapper[5033]: I0226 15:51:50.192559 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://70e42a035ba4f49db38bd22aeb347145c8b4c34b93000efaa8e848e164bec4f1" gracePeriod=600
Feb 26 15:51:50 crc kubenswrapper[5033]: I0226 15:51:50.897206 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="70e42a035ba4f49db38bd22aeb347145c8b4c34b93000efaa8e848e164bec4f1" exitCode=0
Feb 26 15:51:50 crc kubenswrapper[5033]: I0226 15:51:50.897291 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"70e42a035ba4f49db38bd22aeb347145c8b4c34b93000efaa8e848e164bec4f1"}
Feb 26 15:51:50 crc kubenswrapper[5033]: I0226 15:51:50.897726 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"8ea382d5339c87a85284b67b2964d5c78dfbd09c05b9d81cf3fc9df0642c2fcc"}
Feb 26 15:51:50 crc kubenswrapper[5033]: I0226 15:51:50.897758 5033 scope.go:117] "RemoveContainer" containerID="4fc44b0fd7677be1be0b3e7cc01e29c502f9cf985c2a2bb180a7227a759d93cf"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.144255 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535352-b7nrc"]
Feb 26 15:52:00 crc kubenswrapper[5033]: E0226 15:52:00.145203 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa22708b-29dc-4833-ae29-d89fa7465f05" containerName="oc"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.145223 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa22708b-29dc-4833-ae29-d89fa7465f05" containerName="oc"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.145397 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa22708b-29dc-4833-ae29-d89fa7465f05" containerName="oc"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.145958 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535352-b7nrc"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.149755 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.150179 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.151309 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535352-b7nrc"]
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.156088 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.247855 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zrzf\" (UniqueName: \"kubernetes.io/projected/7bef3ffd-290a-42fb-ae79-e423db030c18-kube-api-access-7zrzf\") pod \"auto-csr-approver-29535352-b7nrc\" (UID: \"7bef3ffd-290a-42fb-ae79-e423db030c18\") " pod="openshift-infra/auto-csr-approver-29535352-b7nrc"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.348663 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zrzf\" (UniqueName: \"kubernetes.io/projected/7bef3ffd-290a-42fb-ae79-e423db030c18-kube-api-access-7zrzf\") pod \"auto-csr-approver-29535352-b7nrc\" (UID: \"7bef3ffd-290a-42fb-ae79-e423db030c18\") " pod="openshift-infra/auto-csr-approver-29535352-b7nrc"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.374029 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zrzf\" (UniqueName: \"kubernetes.io/projected/7bef3ffd-290a-42fb-ae79-e423db030c18-kube-api-access-7zrzf\") pod \"auto-csr-approver-29535352-b7nrc\" (UID: \"7bef3ffd-290a-42fb-ae79-e423db030c18\") " pod="openshift-infra/auto-csr-approver-29535352-b7nrc"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.480524 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535352-b7nrc"
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.682157 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535352-b7nrc"]
Feb 26 15:52:00 crc kubenswrapper[5033]: W0226 15:52:00.691447 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bef3ffd_290a_42fb_ae79_e423db030c18.slice/crio-b3b501ae04407389360e9b97c7e31e817983502056df5cdc654aaecaa6c0a041 WatchSource:0}: Error finding container b3b501ae04407389360e9b97c7e31e817983502056df5cdc654aaecaa6c0a041: Status 404 returned error can't find the container with id b3b501ae04407389360e9b97c7e31e817983502056df5cdc654aaecaa6c0a041
Feb 26 15:52:00 crc kubenswrapper[5033]: I0226 15:52:00.963626 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535352-b7nrc" event={"ID":"7bef3ffd-290a-42fb-ae79-e423db030c18","Type":"ContainerStarted","Data":"b3b501ae04407389360e9b97c7e31e817983502056df5cdc654aaecaa6c0a041"}
Feb 26 15:52:02 crc kubenswrapper[5033]: I0226 15:52:02.977115 5033 generic.go:334] "Generic (PLEG): container finished" podID="7bef3ffd-290a-42fb-ae79-e423db030c18" containerID="0ab25daca996b2874ca4275af71b08437be053512a7d47f352be8c2c16a7979f" exitCode=0
Feb 26 15:52:02 crc kubenswrapper[5033]: I0226 15:52:02.977180 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535352-b7nrc" event={"ID":"7bef3ffd-290a-42fb-ae79-e423db030c18","Type":"ContainerDied","Data":"0ab25daca996b2874ca4275af71b08437be053512a7d47f352be8c2c16a7979f"}
Feb 26 15:52:04 crc kubenswrapper[5033]: I0226 15:52:04.273010 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535352-b7nrc"
Feb 26 15:52:04 crc kubenswrapper[5033]: I0226 15:52:04.302592 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zrzf\" (UniqueName: \"kubernetes.io/projected/7bef3ffd-290a-42fb-ae79-e423db030c18-kube-api-access-7zrzf\") pod \"7bef3ffd-290a-42fb-ae79-e423db030c18\" (UID: \"7bef3ffd-290a-42fb-ae79-e423db030c18\") "
Feb 26 15:52:04 crc kubenswrapper[5033]: I0226 15:52:04.316505 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bef3ffd-290a-42fb-ae79-e423db030c18-kube-api-access-7zrzf" (OuterVolumeSpecName: "kube-api-access-7zrzf") pod "7bef3ffd-290a-42fb-ae79-e423db030c18" (UID: "7bef3ffd-290a-42fb-ae79-e423db030c18"). InnerVolumeSpecName "kube-api-access-7zrzf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:52:04 crc kubenswrapper[5033]: I0226 15:52:04.404829 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zrzf\" (UniqueName: \"kubernetes.io/projected/7bef3ffd-290a-42fb-ae79-e423db030c18-kube-api-access-7zrzf\") on node \"crc\" DevicePath \"\""
Feb 26 15:52:04 crc kubenswrapper[5033]: I0226 15:52:04.991096 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535352-b7nrc" event={"ID":"7bef3ffd-290a-42fb-ae79-e423db030c18","Type":"ContainerDied","Data":"b3b501ae04407389360e9b97c7e31e817983502056df5cdc654aaecaa6c0a041"}
Feb 26 15:52:04 crc kubenswrapper[5033]: I0226 15:52:04.991154 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535352-b7nrc"
Feb 26 15:52:04 crc kubenswrapper[5033]: I0226 15:52:04.991169 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b501ae04407389360e9b97c7e31e817983502056df5cdc654aaecaa6c0a041"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.337538 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535346-655p9"]
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.341733 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535346-655p9"]
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.637698 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ea6827-f34e-4f48-8939-6fbf283f63f0" path="/var/lib/kubelet/pods/a8ea6827-f34e-4f48-8939-6fbf283f63f0/volumes"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.765027 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mhhhz"]
Feb 26 15:52:05 crc kubenswrapper[5033]: E0226 15:52:05.765310 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bef3ffd-290a-42fb-ae79-e423db030c18" containerName="oc"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.765326 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bef3ffd-290a-42fb-ae79-e423db030c18" containerName="oc"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.765442 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bef3ffd-290a-42fb-ae79-e423db030c18" containerName="oc"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.765914 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.787638 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mhhhz"]
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.924993 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1fd1af5-744a-4ea8-8342-37f871f31037-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.925062 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.925095 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1fd1af5-744a-4ea8-8342-37f871f31037-bound-sa-token\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.925120 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1fd1af5-744a-4ea8-8342-37f871f31037-trusted-ca\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.925153 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e1fd1af5-744a-4ea8-8342-37f871f31037-registry-certificates\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.925178 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m8wv\" (UniqueName: \"kubernetes.io/projected/e1fd1af5-744a-4ea8-8342-37f871f31037-kube-api-access-9m8wv\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.925245 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1fd1af5-744a-4ea8-8342-37f871f31037-registry-tls\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.925331 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1fd1af5-744a-4ea8-8342-37f871f31037-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:05 crc kubenswrapper[5033]: I0226 15:52:05.945725 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.025999 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1fd1af5-744a-4ea8-8342-37f871f31037-registry-tls\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.026042 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1fd1af5-744a-4ea8-8342-37f871f31037-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.026084 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1fd1af5-744a-4ea8-8342-37f871f31037-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.026107 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1fd1af5-744a-4ea8-8342-37f871f31037-bound-sa-token\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.026124 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1fd1af5-744a-4ea8-8342-37f871f31037-trusted-ca\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.026149 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e1fd1af5-744a-4ea8-8342-37f871f31037-registry-certificates\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.026170 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m8wv\" (UniqueName: \"kubernetes.io/projected/e1fd1af5-744a-4ea8-8342-37f871f31037-kube-api-access-9m8wv\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.026638 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1fd1af5-744a-4ea8-8342-37f871f31037-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz"
Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.027459 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1fd1af5-744a-4ea8-8342-37f871f31037-trusted-ca\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") "
pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.028074 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e1fd1af5-744a-4ea8-8342-37f871f31037-registry-certificates\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.029459 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1fd1af5-744a-4ea8-8342-37f871f31037-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.029483 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1fd1af5-744a-4ea8-8342-37f871f31037-registry-tls\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.048012 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m8wv\" (UniqueName: \"kubernetes.io/projected/e1fd1af5-744a-4ea8-8342-37f871f31037-kube-api-access-9m8wv\") pod \"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.048679 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1fd1af5-744a-4ea8-8342-37f871f31037-bound-sa-token\") pod 
\"image-registry-66df7c8f76-mhhhz\" (UID: \"e1fd1af5-744a-4ea8-8342-37f871f31037\") " pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.082763 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" Feb 26 15:52:06 crc kubenswrapper[5033]: I0226 15:52:06.476631 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mhhhz"] Feb 26 15:52:06 crc kubenswrapper[5033]: W0226 15:52:06.483914 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1fd1af5_744a_4ea8_8342_37f871f31037.slice/crio-0749fe444f31505f8571d3d8e78a591ec356a15e921bd92d7430693653b7dd6f WatchSource:0}: Error finding container 0749fe444f31505f8571d3d8e78a591ec356a15e921bd92d7430693653b7dd6f: Status 404 returned error can't find the container with id 0749fe444f31505f8571d3d8e78a591ec356a15e921bd92d7430693653b7dd6f Feb 26 15:52:07 crc kubenswrapper[5033]: I0226 15:52:07.002604 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" event={"ID":"e1fd1af5-744a-4ea8-8342-37f871f31037","Type":"ContainerStarted","Data":"a1344e25b5df0d9982b99e5d8ed6124eb774817b5fc78a5e1c39c2927e93c9f1"} Feb 26 15:52:07 crc kubenswrapper[5033]: I0226 15:52:07.002943 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" event={"ID":"e1fd1af5-744a-4ea8-8342-37f871f31037","Type":"ContainerStarted","Data":"0749fe444f31505f8571d3d8e78a591ec356a15e921bd92d7430693653b7dd6f"} Feb 26 15:52:07 crc kubenswrapper[5033]: I0226 15:52:07.002964 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" Feb 26 15:52:07 crc kubenswrapper[5033]: I0226 15:52:07.023955 
5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" podStartSLOduration=2.023934397 podStartE2EDuration="2.023934397s" podCreationTimestamp="2026-02-26 15:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:52:07.02165527 +0000 UTC m=+654.234253940" watchObservedRunningTime="2026-02-26 15:52:07.023934397 +0000 UTC m=+654.236533047" Feb 26 15:52:26 crc kubenswrapper[5033]: I0226 15:52:26.091490 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mhhhz" Feb 26 15:52:26 crc kubenswrapper[5033]: I0226 15:52:26.151549 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p4fzm"] Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.192938 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" podUID="dcc382cb-23f7-44b0-89a2-43c41292c21e" containerName="registry" containerID="cri-o://8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee" gracePeriod=30 Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.565303 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.695306 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-tls\") pod \"dcc382cb-23f7-44b0-89a2-43c41292c21e\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.695381 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-bound-sa-token\") pod \"dcc382cb-23f7-44b0-89a2-43c41292c21e\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.695401 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdkgp\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-kube-api-access-zdkgp\") pod \"dcc382cb-23f7-44b0-89a2-43c41292c21e\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.695425 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dcc382cb-23f7-44b0-89a2-43c41292c21e-ca-trust-extracted\") pod \"dcc382cb-23f7-44b0-89a2-43c41292c21e\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.695613 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"dcc382cb-23f7-44b0-89a2-43c41292c21e\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.695644 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-certificates\") pod \"dcc382cb-23f7-44b0-89a2-43c41292c21e\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.695673 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-trusted-ca\") pod \"dcc382cb-23f7-44b0-89a2-43c41292c21e\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.695738 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dcc382cb-23f7-44b0-89a2-43c41292c21e-installation-pull-secrets\") pod \"dcc382cb-23f7-44b0-89a2-43c41292c21e\" (UID: \"dcc382cb-23f7-44b0-89a2-43c41292c21e\") " Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.696439 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "dcc382cb-23f7-44b0-89a2-43c41292c21e" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.696463 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "dcc382cb-23f7-44b0-89a2-43c41292c21e" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.696690 5033 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.696707 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dcc382cb-23f7-44b0-89a2-43c41292c21e-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.704043 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-kube-api-access-zdkgp" (OuterVolumeSpecName: "kube-api-access-zdkgp") pod "dcc382cb-23f7-44b0-89a2-43c41292c21e" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e"). InnerVolumeSpecName "kube-api-access-zdkgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.704155 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "dcc382cb-23f7-44b0-89a2-43c41292c21e" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.704844 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "dcc382cb-23f7-44b0-89a2-43c41292c21e" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.710843 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcc382cb-23f7-44b0-89a2-43c41292c21e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "dcc382cb-23f7-44b0-89a2-43c41292c21e" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.711064 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "dcc382cb-23f7-44b0-89a2-43c41292c21e" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.711886 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcc382cb-23f7-44b0-89a2-43c41292c21e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "dcc382cb-23f7-44b0-89a2-43c41292c21e" (UID: "dcc382cb-23f7-44b0-89a2-43c41292c21e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.797572 5033 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dcc382cb-23f7-44b0-89a2-43c41292c21e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.797604 5033 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.797614 5033 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.797622 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdkgp\" (UniqueName: \"kubernetes.io/projected/dcc382cb-23f7-44b0-89a2-43c41292c21e-kube-api-access-zdkgp\") on node \"crc\" DevicePath \"\"" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.797631 5033 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dcc382cb-23f7-44b0-89a2-43c41292c21e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.894014 5033 generic.go:334] "Generic (PLEG): container finished" podID="dcc382cb-23f7-44b0-89a2-43c41292c21e" containerID="8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee" exitCode=0 Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.894061 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" 
event={"ID":"dcc382cb-23f7-44b0-89a2-43c41292c21e","Type":"ContainerDied","Data":"8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee"} Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.894088 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" event={"ID":"dcc382cb-23f7-44b0-89a2-43c41292c21e","Type":"ContainerDied","Data":"2bdc305196091bcfd73e4844ed04f6d1d4f2a1ea6d2328447eb3b35144d4743d"} Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.894105 5033 scope.go:117] "RemoveContainer" containerID="8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.894206 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p4fzm" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.920461 5033 scope.go:117] "RemoveContainer" containerID="8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee" Feb 26 15:52:51 crc kubenswrapper[5033]: E0226 15:52:51.920924 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee\": container with ID starting with 8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee not found: ID does not exist" containerID="8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.920972 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee"} err="failed to get container status \"8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee\": rpc error: code = NotFound desc = could not find container \"8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee\": container with ID 
starting with 8c425011692fad919aaebfe8fc9724f2a2bc9497d2ea91f94c3cc49a4d8c09ee not found: ID does not exist" Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.936314 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p4fzm"] Feb 26 15:52:51 crc kubenswrapper[5033]: I0226 15:52:51.942282 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p4fzm"] Feb 26 15:52:52 crc kubenswrapper[5033]: E0226 15:52:52.011684 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc382cb_23f7_44b0_89a2_43c41292c21e.slice/crio-2bdc305196091bcfd73e4844ed04f6d1d4f2a1ea6d2328447eb3b35144d4743d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc382cb_23f7_44b0_89a2_43c41292c21e.slice\": RecentStats: unable to find data in memory cache]" Feb 26 15:52:53 crc kubenswrapper[5033]: I0226 15:52:53.642728 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcc382cb-23f7-44b0-89a2-43c41292c21e" path="/var/lib/kubelet/pods/dcc382cb-23f7-44b0-89a2-43c41292c21e/volumes" Feb 26 15:53:16 crc kubenswrapper[5033]: I0226 15:53:16.149679 5033 scope.go:117] "RemoveContainer" containerID="896f36ed28c99db1b518718b80d53dfc088043f963218f9fee55457c0038b4b1" Feb 26 15:53:35 crc kubenswrapper[5033]: I0226 15:53:35.520562 5033 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 26 15:53:44 crc kubenswrapper[5033]: I0226 15:53:44.974447 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-64hl8"] Feb 26 15:53:44 crc kubenswrapper[5033]: E0226 15:53:44.975554 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dcc382cb-23f7-44b0-89a2-43c41292c21e" containerName="registry" Feb 26 15:53:44 crc kubenswrapper[5033]: I0226 15:53:44.975574 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc382cb-23f7-44b0-89a2-43c41292c21e" containerName="registry" Feb 26 15:53:44 crc kubenswrapper[5033]: I0226 15:53:44.975759 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcc382cb-23f7-44b0-89a2-43c41292c21e" containerName="registry" Feb 26 15:53:44 crc kubenswrapper[5033]: I0226 15:53:44.976948 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:44 crc kubenswrapper[5033]: I0226 15:53:44.990968 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64hl8"] Feb 26 15:53:45 crc kubenswrapper[5033]: I0226 15:53:45.130289 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-utilities\") pod \"redhat-marketplace-64hl8\" (UID: \"afebce63-d7da-46cc-8a63-99ae01b30904\") " pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:45 crc kubenswrapper[5033]: I0226 15:53:45.130555 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-catalog-content\") pod \"redhat-marketplace-64hl8\" (UID: \"afebce63-d7da-46cc-8a63-99ae01b30904\") " pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:45 crc kubenswrapper[5033]: I0226 15:53:45.130702 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2pxs\" (UniqueName: \"kubernetes.io/projected/afebce63-d7da-46cc-8a63-99ae01b30904-kube-api-access-c2pxs\") pod \"redhat-marketplace-64hl8\" (UID: 
\"afebce63-d7da-46cc-8a63-99ae01b30904\") " pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:45 crc kubenswrapper[5033]: I0226 15:53:45.232760 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2pxs\" (UniqueName: \"kubernetes.io/projected/afebce63-d7da-46cc-8a63-99ae01b30904-kube-api-access-c2pxs\") pod \"redhat-marketplace-64hl8\" (UID: \"afebce63-d7da-46cc-8a63-99ae01b30904\") " pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:45 crc kubenswrapper[5033]: I0226 15:53:45.232863 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-utilities\") pod \"redhat-marketplace-64hl8\" (UID: \"afebce63-d7da-46cc-8a63-99ae01b30904\") " pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:45 crc kubenswrapper[5033]: I0226 15:53:45.232889 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-catalog-content\") pod \"redhat-marketplace-64hl8\" (UID: \"afebce63-d7da-46cc-8a63-99ae01b30904\") " pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:45 crc kubenswrapper[5033]: I0226 15:53:45.233563 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-catalog-content\") pod \"redhat-marketplace-64hl8\" (UID: \"afebce63-d7da-46cc-8a63-99ae01b30904\") " pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:45 crc kubenswrapper[5033]: I0226 15:53:45.233878 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-utilities\") pod \"redhat-marketplace-64hl8\" (UID: \"afebce63-d7da-46cc-8a63-99ae01b30904\") " 
pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:45 crc kubenswrapper[5033]: I0226 15:53:45.261923 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2pxs\" (UniqueName: \"kubernetes.io/projected/afebce63-d7da-46cc-8a63-99ae01b30904-kube-api-access-c2pxs\") pod \"redhat-marketplace-64hl8\" (UID: \"afebce63-d7da-46cc-8a63-99ae01b30904\") " pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:45 crc kubenswrapper[5033]: I0226 15:53:45.306692 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:45 crc kubenswrapper[5033]: I0226 15:53:45.549020 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64hl8"] Feb 26 15:53:46 crc kubenswrapper[5033]: I0226 15:53:46.229829 5033 generic.go:334] "Generic (PLEG): container finished" podID="afebce63-d7da-46cc-8a63-99ae01b30904" containerID="ab284731b3076f56b7b14adc8e1a536aa49c1bf4f5e18922e3277a6e0be0e984" exitCode=0 Feb 26 15:53:46 crc kubenswrapper[5033]: I0226 15:53:46.229982 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64hl8" event={"ID":"afebce63-d7da-46cc-8a63-99ae01b30904","Type":"ContainerDied","Data":"ab284731b3076f56b7b14adc8e1a536aa49c1bf4f5e18922e3277a6e0be0e984"} Feb 26 15:53:46 crc kubenswrapper[5033]: I0226 15:53:46.230335 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64hl8" event={"ID":"afebce63-d7da-46cc-8a63-99ae01b30904","Type":"ContainerStarted","Data":"c3a7b671e06217d64ef06d3919b030286decf21a94c42f509738a8b3ee3a6356"} Feb 26 15:53:48 crc kubenswrapper[5033]: I0226 15:53:48.244148 5033 generic.go:334] "Generic (PLEG): container finished" podID="afebce63-d7da-46cc-8a63-99ae01b30904" containerID="3c72c44e9fee1da40f44decdf0997aa7e58eadc0f14b47ffd0c6caf4e215506d" exitCode=0 Feb 26 15:53:48 crc 
kubenswrapper[5033]: I0226 15:53:48.244338 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64hl8" event={"ID":"afebce63-d7da-46cc-8a63-99ae01b30904","Type":"ContainerDied","Data":"3c72c44e9fee1da40f44decdf0997aa7e58eadc0f14b47ffd0c6caf4e215506d"} Feb 26 15:53:49 crc kubenswrapper[5033]: I0226 15:53:49.256415 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64hl8" event={"ID":"afebce63-d7da-46cc-8a63-99ae01b30904","Type":"ContainerStarted","Data":"952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f"} Feb 26 15:53:49 crc kubenswrapper[5033]: I0226 15:53:49.284416 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-64hl8" podStartSLOduration=2.6896866189999997 podStartE2EDuration="5.284387518s" podCreationTimestamp="2026-02-26 15:53:44 +0000 UTC" firstStartedPulling="2026-02-26 15:53:46.231977288 +0000 UTC m=+753.444575988" lastFinishedPulling="2026-02-26 15:53:48.826678197 +0000 UTC m=+756.039276887" observedRunningTime="2026-02-26 15:53:49.278617609 +0000 UTC m=+756.491216309" watchObservedRunningTime="2026-02-26 15:53:49.284387518 +0000 UTC m=+756.496986208" Feb 26 15:53:50 crc kubenswrapper[5033]: I0226 15:53:50.189193 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:53:50 crc kubenswrapper[5033]: I0226 15:53:50.189569 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 26 15:53:55 crc kubenswrapper[5033]: I0226 15:53:55.306951 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:55 crc kubenswrapper[5033]: I0226 15:53:55.307385 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:55 crc kubenswrapper[5033]: I0226 15:53:55.380543 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:56 crc kubenswrapper[5033]: I0226 15:53:56.370195 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:56 crc kubenswrapper[5033]: I0226 15:53:56.436626 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64hl8"] Feb 26 15:53:58 crc kubenswrapper[5033]: I0226 15:53:58.312163 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-64hl8" podUID="afebce63-d7da-46cc-8a63-99ae01b30904" containerName="registry-server" containerID="cri-o://952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f" gracePeriod=2 Feb 26 15:53:58 crc kubenswrapper[5033]: I0226 15:53:58.683248 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:58 crc kubenswrapper[5033]: I0226 15:53:58.821577 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2pxs\" (UniqueName: \"kubernetes.io/projected/afebce63-d7da-46cc-8a63-99ae01b30904-kube-api-access-c2pxs\") pod \"afebce63-d7da-46cc-8a63-99ae01b30904\" (UID: \"afebce63-d7da-46cc-8a63-99ae01b30904\") " Feb 26 15:53:58 crc kubenswrapper[5033]: I0226 15:53:58.821693 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-utilities\") pod \"afebce63-d7da-46cc-8a63-99ae01b30904\" (UID: \"afebce63-d7da-46cc-8a63-99ae01b30904\") " Feb 26 15:53:58 crc kubenswrapper[5033]: I0226 15:53:58.821756 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-catalog-content\") pod \"afebce63-d7da-46cc-8a63-99ae01b30904\" (UID: \"afebce63-d7da-46cc-8a63-99ae01b30904\") " Feb 26 15:53:58 crc kubenswrapper[5033]: I0226 15:53:58.823369 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-utilities" (OuterVolumeSpecName: "utilities") pod "afebce63-d7da-46cc-8a63-99ae01b30904" (UID: "afebce63-d7da-46cc-8a63-99ae01b30904"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:53:58 crc kubenswrapper[5033]: I0226 15:53:58.829200 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afebce63-d7da-46cc-8a63-99ae01b30904-kube-api-access-c2pxs" (OuterVolumeSpecName: "kube-api-access-c2pxs") pod "afebce63-d7da-46cc-8a63-99ae01b30904" (UID: "afebce63-d7da-46cc-8a63-99ae01b30904"). InnerVolumeSpecName "kube-api-access-c2pxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:53:58 crc kubenswrapper[5033]: I0226 15:53:58.858241 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afebce63-d7da-46cc-8a63-99ae01b30904" (UID: "afebce63-d7da-46cc-8a63-99ae01b30904"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:53:58 crc kubenswrapper[5033]: I0226 15:53:58.923026 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2pxs\" (UniqueName: \"kubernetes.io/projected/afebce63-d7da-46cc-8a63-99ae01b30904-kube-api-access-c2pxs\") on node \"crc\" DevicePath \"\"" Feb 26 15:53:58 crc kubenswrapper[5033]: I0226 15:53:58.923062 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:53:58 crc kubenswrapper[5033]: I0226 15:53:58.923075 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afebce63-d7da-46cc-8a63-99ae01b30904-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.323378 5033 generic.go:334] "Generic (PLEG): container finished" podID="afebce63-d7da-46cc-8a63-99ae01b30904" containerID="952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f" exitCode=0 Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.323456 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64hl8" event={"ID":"afebce63-d7da-46cc-8a63-99ae01b30904","Type":"ContainerDied","Data":"952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f"} Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.323562 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-64hl8" event={"ID":"afebce63-d7da-46cc-8a63-99ae01b30904","Type":"ContainerDied","Data":"c3a7b671e06217d64ef06d3919b030286decf21a94c42f509738a8b3ee3a6356"} Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.323605 5033 scope.go:117] "RemoveContainer" containerID="952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f" Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.323668 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64hl8" Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.347914 5033 scope.go:117] "RemoveContainer" containerID="3c72c44e9fee1da40f44decdf0997aa7e58eadc0f14b47ffd0c6caf4e215506d" Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.380371 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64hl8"] Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.393106 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-64hl8"] Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.395908 5033 scope.go:117] "RemoveContainer" containerID="ab284731b3076f56b7b14adc8e1a536aa49c1bf4f5e18922e3277a6e0be0e984" Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.419497 5033 scope.go:117] "RemoveContainer" containerID="952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f" Feb 26 15:53:59 crc kubenswrapper[5033]: E0226 15:53:59.420240 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f\": container with ID starting with 952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f not found: ID does not exist" containerID="952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f" Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.420335 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f"} err="failed to get container status \"952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f\": rpc error: code = NotFound desc = could not find container \"952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f\": container with ID starting with 952cc35a910a1400a92b73c2c4a93641cc7698b92c4eea319261de9fdee8218f not found: ID does not exist" Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.420379 5033 scope.go:117] "RemoveContainer" containerID="3c72c44e9fee1da40f44decdf0997aa7e58eadc0f14b47ffd0c6caf4e215506d" Feb 26 15:53:59 crc kubenswrapper[5033]: E0226 15:53:59.421117 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c72c44e9fee1da40f44decdf0997aa7e58eadc0f14b47ffd0c6caf4e215506d\": container with ID starting with 3c72c44e9fee1da40f44decdf0997aa7e58eadc0f14b47ffd0c6caf4e215506d not found: ID does not exist" containerID="3c72c44e9fee1da40f44decdf0997aa7e58eadc0f14b47ffd0c6caf4e215506d" Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.421162 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c72c44e9fee1da40f44decdf0997aa7e58eadc0f14b47ffd0c6caf4e215506d"} err="failed to get container status \"3c72c44e9fee1da40f44decdf0997aa7e58eadc0f14b47ffd0c6caf4e215506d\": rpc error: code = NotFound desc = could not find container \"3c72c44e9fee1da40f44decdf0997aa7e58eadc0f14b47ffd0c6caf4e215506d\": container with ID starting with 3c72c44e9fee1da40f44decdf0997aa7e58eadc0f14b47ffd0c6caf4e215506d not found: ID does not exist" Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.421190 5033 scope.go:117] "RemoveContainer" containerID="ab284731b3076f56b7b14adc8e1a536aa49c1bf4f5e18922e3277a6e0be0e984" Feb 26 15:53:59 crc kubenswrapper[5033]: E0226 
15:53:59.421630 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab284731b3076f56b7b14adc8e1a536aa49c1bf4f5e18922e3277a6e0be0e984\": container with ID starting with ab284731b3076f56b7b14adc8e1a536aa49c1bf4f5e18922e3277a6e0be0e984 not found: ID does not exist" containerID="ab284731b3076f56b7b14adc8e1a536aa49c1bf4f5e18922e3277a6e0be0e984" Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.421695 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab284731b3076f56b7b14adc8e1a536aa49c1bf4f5e18922e3277a6e0be0e984"} err="failed to get container status \"ab284731b3076f56b7b14adc8e1a536aa49c1bf4f5e18922e3277a6e0be0e984\": rpc error: code = NotFound desc = could not find container \"ab284731b3076f56b7b14adc8e1a536aa49c1bf4f5e18922e3277a6e0be0e984\": container with ID starting with ab284731b3076f56b7b14adc8e1a536aa49c1bf4f5e18922e3277a6e0be0e984 not found: ID does not exist" Feb 26 15:53:59 crc kubenswrapper[5033]: I0226 15:53:59.641605 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afebce63-d7da-46cc-8a63-99ae01b30904" path="/var/lib/kubelet/pods/afebce63-d7da-46cc-8a63-99ae01b30904/volumes" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.156870 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535354-4kqtg"] Feb 26 15:54:00 crc kubenswrapper[5033]: E0226 15:54:00.157251 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afebce63-d7da-46cc-8a63-99ae01b30904" containerName="extract-utilities" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.157311 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="afebce63-d7da-46cc-8a63-99ae01b30904" containerName="extract-utilities" Feb 26 15:54:00 crc kubenswrapper[5033]: E0226 15:54:00.157338 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="afebce63-d7da-46cc-8a63-99ae01b30904" containerName="registry-server" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.157354 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="afebce63-d7da-46cc-8a63-99ae01b30904" containerName="registry-server" Feb 26 15:54:00 crc kubenswrapper[5033]: E0226 15:54:00.157385 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afebce63-d7da-46cc-8a63-99ae01b30904" containerName="extract-content" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.157399 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="afebce63-d7da-46cc-8a63-99ae01b30904" containerName="extract-content" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.157587 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="afebce63-d7da-46cc-8a63-99ae01b30904" containerName="registry-server" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.158348 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535354-4kqtg" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.162108 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.162494 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.162655 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.163898 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535354-4kqtg"] Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.244193 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnq54\" (UniqueName: 
\"kubernetes.io/projected/9b93d288-0120-4f53-a82e-10c9e6e1f08b-kube-api-access-tnq54\") pod \"auto-csr-approver-29535354-4kqtg\" (UID: \"9b93d288-0120-4f53-a82e-10c9e6e1f08b\") " pod="openshift-infra/auto-csr-approver-29535354-4kqtg" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.346147 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnq54\" (UniqueName: \"kubernetes.io/projected/9b93d288-0120-4f53-a82e-10c9e6e1f08b-kube-api-access-tnq54\") pod \"auto-csr-approver-29535354-4kqtg\" (UID: \"9b93d288-0120-4f53-a82e-10c9e6e1f08b\") " pod="openshift-infra/auto-csr-approver-29535354-4kqtg" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.378945 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnq54\" (UniqueName: \"kubernetes.io/projected/9b93d288-0120-4f53-a82e-10c9e6e1f08b-kube-api-access-tnq54\") pod \"auto-csr-approver-29535354-4kqtg\" (UID: \"9b93d288-0120-4f53-a82e-10c9e6e1f08b\") " pod="openshift-infra/auto-csr-approver-29535354-4kqtg" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.483917 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535354-4kqtg" Feb 26 15:54:00 crc kubenswrapper[5033]: I0226 15:54:00.716291 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535354-4kqtg"] Feb 26 15:54:00 crc kubenswrapper[5033]: W0226 15:54:00.724808 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b93d288_0120_4f53_a82e_10c9e6e1f08b.slice/crio-6d404f4da6ee36b731c122f0f3351e54afba6ace1bc18121748caaf0096d3dd7 WatchSource:0}: Error finding container 6d404f4da6ee36b731c122f0f3351e54afba6ace1bc18121748caaf0096d3dd7: Status 404 returned error can't find the container with id 6d404f4da6ee36b731c122f0f3351e54afba6ace1bc18121748caaf0096d3dd7 Feb 26 15:54:01 crc kubenswrapper[5033]: I0226 15:54:01.338837 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535354-4kqtg" event={"ID":"9b93d288-0120-4f53-a82e-10c9e6e1f08b","Type":"ContainerStarted","Data":"6d404f4da6ee36b731c122f0f3351e54afba6ace1bc18121748caaf0096d3dd7"} Feb 26 15:54:02 crc kubenswrapper[5033]: I0226 15:54:02.347319 5033 generic.go:334] "Generic (PLEG): container finished" podID="9b93d288-0120-4f53-a82e-10c9e6e1f08b" containerID="5dd4f0e58aa6ca572dc3458182f11b5edb14cace47435266a20ac3c7fb777499" exitCode=0 Feb 26 15:54:02 crc kubenswrapper[5033]: I0226 15:54:02.347359 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535354-4kqtg" event={"ID":"9b93d288-0120-4f53-a82e-10c9e6e1f08b","Type":"ContainerDied","Data":"5dd4f0e58aa6ca572dc3458182f11b5edb14cace47435266a20ac3c7fb777499"} Feb 26 15:54:03 crc kubenswrapper[5033]: I0226 15:54:03.555626 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535354-4kqtg" Feb 26 15:54:03 crc kubenswrapper[5033]: I0226 15:54:03.690210 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnq54\" (UniqueName: \"kubernetes.io/projected/9b93d288-0120-4f53-a82e-10c9e6e1f08b-kube-api-access-tnq54\") pod \"9b93d288-0120-4f53-a82e-10c9e6e1f08b\" (UID: \"9b93d288-0120-4f53-a82e-10c9e6e1f08b\") " Feb 26 15:54:03 crc kubenswrapper[5033]: I0226 15:54:03.698345 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b93d288-0120-4f53-a82e-10c9e6e1f08b-kube-api-access-tnq54" (OuterVolumeSpecName: "kube-api-access-tnq54") pod "9b93d288-0120-4f53-a82e-10c9e6e1f08b" (UID: "9b93d288-0120-4f53-a82e-10c9e6e1f08b"). InnerVolumeSpecName "kube-api-access-tnq54". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:54:03 crc kubenswrapper[5033]: I0226 15:54:03.792547 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnq54\" (UniqueName: \"kubernetes.io/projected/9b93d288-0120-4f53-a82e-10c9e6e1f08b-kube-api-access-tnq54\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:04 crc kubenswrapper[5033]: I0226 15:54:04.365810 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535354-4kqtg" event={"ID":"9b93d288-0120-4f53-a82e-10c9e6e1f08b","Type":"ContainerDied","Data":"6d404f4da6ee36b731c122f0f3351e54afba6ace1bc18121748caaf0096d3dd7"} Feb 26 15:54:04 crc kubenswrapper[5033]: I0226 15:54:04.366738 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d404f4da6ee36b731c122f0f3351e54afba6ace1bc18121748caaf0096d3dd7" Feb 26 15:54:04 crc kubenswrapper[5033]: I0226 15:54:04.365873 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535354-4kqtg" Feb 26 15:54:04 crc kubenswrapper[5033]: I0226 15:54:04.636199 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535348-fmnfx"] Feb 26 15:54:04 crc kubenswrapper[5033]: I0226 15:54:04.642302 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535348-fmnfx"] Feb 26 15:54:05 crc kubenswrapper[5033]: I0226 15:54:05.639308 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="859a38e3-1926-4c07-8ae1-2745516cffa0" path="/var/lib/kubelet/pods/859a38e3-1926-4c07-8ae1-2745516cffa0/volumes" Feb 26 15:54:16 crc kubenswrapper[5033]: I0226 15:54:16.223113 5033 scope.go:117] "RemoveContainer" containerID="ce3e7ed8c304bf24b1ef3e58aa4817fe75e653e695189f4925a354922817be54" Feb 26 15:54:20 crc kubenswrapper[5033]: I0226 15:54:20.189413 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:54:20 crc kubenswrapper[5033]: I0226 15:54:20.189472 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:54:31 crc kubenswrapper[5033]: I0226 15:54:31.698020 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fth8v"] Feb 26 15:54:31 crc kubenswrapper[5033]: I0226 15:54:31.699042 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" 
podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovn-controller" containerID="cri-o://10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f" gracePeriod=30 Feb 26 15:54:31 crc kubenswrapper[5033]: I0226 15:54:31.699442 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="sbdb" containerID="cri-o://d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c" gracePeriod=30 Feb 26 15:54:31 crc kubenswrapper[5033]: I0226 15:54:31.699489 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="nbdb" containerID="cri-o://0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889" gracePeriod=30 Feb 26 15:54:31 crc kubenswrapper[5033]: I0226 15:54:31.699521 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="northd" containerID="cri-o://c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5" gracePeriod=30 Feb 26 15:54:31 crc kubenswrapper[5033]: I0226 15:54:31.699547 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933" gracePeriod=30 Feb 26 15:54:31 crc kubenswrapper[5033]: I0226 15:54:31.699573 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="kube-rbac-proxy-node" containerID="cri-o://77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca" 
gracePeriod=30 Feb 26 15:54:31 crc kubenswrapper[5033]: I0226 15:54:31.699605 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovn-acl-logging" containerID="cri-o://df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05" gracePeriod=30 Feb 26 15:54:31 crc kubenswrapper[5033]: I0226 15:54:31.743242 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovnkube-controller" containerID="cri-o://0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4" gracePeriod=30 Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.034834 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fth8v_6e439a58-bd20-4da1-96e5-7e6a4329fd6b/ovn-acl-logging/0.log" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.035735 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fth8v_6e439a58-bd20-4da1-96e5-7e6a4329fd6b/ovn-controller/0.log" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.036323 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.089980 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gvxkp"] Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.090177 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovn-controller" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090188 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovn-controller" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.090198 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="sbdb" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090205 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="sbdb" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.090213 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b93d288-0120-4f53-a82e-10c9e6e1f08b" containerName="oc" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090220 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b93d288-0120-4f53-a82e-10c9e6e1f08b" containerName="oc" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.090231 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovnkube-controller" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090237 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovnkube-controller" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.090245 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="kube-rbac-proxy-node" Feb 
26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090267 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="kube-rbac-proxy-node" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.090275 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovn-acl-logging" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090281 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovn-acl-logging" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.090290 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="kubecfg-setup" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090295 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="kubecfg-setup" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.090302 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="nbdb" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090307 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="nbdb" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.090315 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="northd" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090320 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="northd" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.090329 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 15:54:32 crc kubenswrapper[5033]: 
I0226 15:54:32.090335 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090419 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090427 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovn-acl-logging" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090435 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="nbdb" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090441 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovnkube-controller" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090448 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="sbdb" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090456 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="northd" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090465 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="ovn-controller" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090473 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerName="kube-rbac-proxy-node" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.090481 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b93d288-0120-4f53-a82e-10c9e6e1f08b" containerName="oc" Feb 26 15:54:32 crc 
kubenswrapper[5033]: I0226 15:54:32.092136 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150168 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-systemd-units\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150280 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-bin\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150305 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-slash\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150299 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150321 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-systemd\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150409 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-var-lib-openvswitch\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150440 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150464 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-openvswitch\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150450 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150490 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-node-log\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150505 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-etc-openvswitch\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150517 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150521 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-slash" (OuterVolumeSpecName: "host-slash") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150547 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-log-socket" (OuterVolumeSpecName: "log-socket") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). 
InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150549 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150569 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150527 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-log-socket\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150568 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-node-log" (OuterVolumeSpecName: "node-log") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150585 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150662 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-netns\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150708 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150730 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-config\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150773 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-script-lib\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150809 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-netd\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150832 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-ovn-kubernetes\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150853 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-env-overrides\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150869 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-m4m9j\" (UniqueName: \"kubernetes.io/projected/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-kube-api-access-m4m9j\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150921 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-ovn\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150940 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-kubelet\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.150962 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovn-node-metrics-cert\") pod \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\" (UID: \"6e439a58-bd20-4da1-96e5-7e6a4329fd6b\") " Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151282 5033 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151284 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151294 5033 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151329 5033 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-slash\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151345 5033 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151361 5033 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151376 5033 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151388 5033 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-node-log\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151399 5033 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc 
kubenswrapper[5033]: I0226 15:54:32.151411 5033 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-log-socket\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151423 5033 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151724 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151837 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151862 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151882 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.151889 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.152009 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.156497 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-kube-api-access-m4m9j" (OuterVolumeSpecName: "kube-api-access-m4m9j") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "kube-api-access-m4m9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.156582 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.165786 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "6e439a58-bd20-4da1-96e5-7e6a4329fd6b" (UID: "6e439a58-bd20-4da1-96e5-7e6a4329fd6b"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.252918 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-var-lib-openvswitch\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.252968 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-slash\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253005 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-run-ovn-kubernetes\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253024 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-run-netns\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253174 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc982\" (UniqueName: \"kubernetes.io/projected/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-kube-api-access-xc982\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253337 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-ovn-node-metrics-cert\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253394 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-log-socket\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253434 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-systemd-units\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253461 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-ovnkube-config\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253490 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-ovnkube-script-lib\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253599 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-env-overrides\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253627 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-run-systemd\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253664 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-etc-openvswitch\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253680 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-run-openvswitch\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253755 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-cni-netd\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253830 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-run-ovn\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253868 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-kubelet\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253897 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-node-log\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253920 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.253991 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-cni-bin\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.254162 5033 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.254181 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4m9j\" (UniqueName: \"kubernetes.io/projected/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-kube-api-access-m4m9j\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.254197 5033 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 
15:54:32.254209 5033 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.254221 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.254235 5033 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.254272 5033 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.254287 5033 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.254298 5033 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.254310 5033 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e439a58-bd20-4da1-96e5-7e6a4329fd6b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.355903 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-run-ovn\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.355970 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-kubelet\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356002 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-node-log\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356031 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356065 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-cni-bin\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356092 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-var-lib-openvswitch\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356124 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-slash\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356171 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-run-ovn-kubernetes\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356202 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-run-netns\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356244 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc982\" (UniqueName: \"kubernetes.io/projected/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-kube-api-access-xc982\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356322 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-ovn-node-metrics-cert\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356354 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-log-socket\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356387 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-systemd-units\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356422 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-ovnkube-config\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356457 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-ovnkube-script-lib\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356503 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-run-systemd\") 
pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356535 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-env-overrides\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356573 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-etc-openvswitch\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356601 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-run-openvswitch\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356629 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-cni-netd\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.356750 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-cni-netd\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357063 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-etc-openvswitch\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357104 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-log-socket\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357070 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-systemd-units\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357143 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-run-openvswitch\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357168 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-slash\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357176 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-run-ovn\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357183 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-run-ovn-kubernetes\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357197 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-cni-bin\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357203 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-kubelet\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357223 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-run-netns\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357284 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-node-log\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357305 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357332 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-var-lib-openvswitch\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.357331 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-run-systemd\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.358192 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-env-overrides\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.358284 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-ovnkube-config\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.358370 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-ovnkube-script-lib\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.362540 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-ovn-node-metrics-cert\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.378735 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc982\" (UniqueName: \"kubernetes.io/projected/7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35-kube-api-access-xc982\") pod \"ovnkube-node-gvxkp\" (UID: \"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.405663 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:32 crc kubenswrapper[5033]: W0226 15:54:32.433199 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f2a45b6_b9b4_4ed8_a290_f04b9e59ee35.slice/crio-3319dfe1562436df8ada86276721473fc22ac07439f6f290fa82db45d2510953 WatchSource:0}: Error finding container 3319dfe1562436df8ada86276721473fc22ac07439f6f290fa82db45d2510953: Status 404 returned error can't find the container with id 3319dfe1562436df8ada86276721473fc22ac07439f6f290fa82db45d2510953 Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.565636 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fth8v_6e439a58-bd20-4da1-96e5-7e6a4329fd6b/ovn-acl-logging/0.log" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.566428 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fth8v_6e439a58-bd20-4da1-96e5-7e6a4329fd6b/ovn-controller/0.log" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.566967 5033 generic.go:334] "Generic (PLEG): container finished" podID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerID="0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4" exitCode=0 Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.566987 5033 generic.go:334] "Generic (PLEG): container finished" podID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerID="d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c" exitCode=0 Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.566994 5033 generic.go:334] "Generic (PLEG): container finished" podID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerID="0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889" exitCode=0 Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567001 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerID="c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5" exitCode=0 Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567008 5033 generic.go:334] "Generic (PLEG): container finished" podID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerID="7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933" exitCode=0 Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567014 5033 generic.go:334] "Generic (PLEG): container finished" podID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerID="77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca" exitCode=0 Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567021 5033 generic.go:334] "Generic (PLEG): container finished" podID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerID="df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05" exitCode=143 Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567029 5033 generic.go:334] "Generic (PLEG): container finished" podID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" containerID="10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f" exitCode=143 Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567052 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567077 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerDied","Data":"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567170 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerDied","Data":"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567194 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerDied","Data":"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567214 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerDied","Data":"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567235 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerDied","Data":"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567253 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerDied","Data":"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca"} Feb 26 15:54:32 crc 
kubenswrapper[5033]: I0226 15:54:32.567342 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567365 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567377 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567392 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerDied","Data":"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567410 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567423 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567434 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567444 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567456 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567467 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567477 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567487 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567498 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567512 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerDied","Data":"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567293 5033 scope.go:117] "RemoveContainer" containerID="0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567528 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567541 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567554 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567565 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567577 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567588 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567600 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567611 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567621 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567635 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fth8v" event={"ID":"6e439a58-bd20-4da1-96e5-7e6a4329fd6b","Type":"ContainerDied","Data":"37123b5dc05c886f85c0200969c75d459f4a40773fdfb69634d0c93a0f232ae2"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567653 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567672 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567687 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567702 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567716 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567727 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567738 5033 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567755 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.567769 5033 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.568871 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" event={"ID":"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35","Type":"ContainerStarted","Data":"9e96834d20646283d96f8c2befcbb51ec4940c1f50575526210440550c14af15"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.568930 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" event={"ID":"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35","Type":"ContainerStarted","Data":"3319dfe1562436df8ada86276721473fc22ac07439f6f290fa82db45d2510953"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.573189 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2h9gn_e974dc06-da1a-49ae-af6e-80b3436df208/kube-multus/0.log" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.573243 5033 generic.go:334] "Generic (PLEG): container finished" podID="e974dc06-da1a-49ae-af6e-80b3436df208" containerID="1cbfbc700e95c4db0e198395ac967657336eadd3ff9df7a4f96fdfbe66689b9c" exitCode=2 Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.573292 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2h9gn" 
event={"ID":"e974dc06-da1a-49ae-af6e-80b3436df208","Type":"ContainerDied","Data":"1cbfbc700e95c4db0e198395ac967657336eadd3ff9df7a4f96fdfbe66689b9c"} Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.574844 5033 scope.go:117] "RemoveContainer" containerID="1cbfbc700e95c4db0e198395ac967657336eadd3ff9df7a4f96fdfbe66689b9c" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.636458 5033 scope.go:117] "RemoveContainer" containerID="d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.665346 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fth8v"] Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.667546 5033 scope.go:117] "RemoveContainer" containerID="0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.669751 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fth8v"] Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.699825 5033 scope.go:117] "RemoveContainer" containerID="c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.716421 5033 scope.go:117] "RemoveContainer" containerID="7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.733432 5033 scope.go:117] "RemoveContainer" containerID="77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.759812 5033 scope.go:117] "RemoveContainer" containerID="df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.776473 5033 scope.go:117] "RemoveContainer" containerID="10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.791606 5033 
scope.go:117] "RemoveContainer" containerID="496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.806700 5033 scope.go:117] "RemoveContainer" containerID="0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.807120 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4\": container with ID starting with 0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4 not found: ID does not exist" containerID="0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.807166 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4"} err="failed to get container status \"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4\": rpc error: code = NotFound desc = could not find container \"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4\": container with ID starting with 0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.807192 5033 scope.go:117] "RemoveContainer" containerID="d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.811641 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c\": container with ID starting with d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c not found: ID does not exist" containerID="d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c" Feb 26 
15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.811676 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c"} err="failed to get container status \"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c\": rpc error: code = NotFound desc = could not find container \"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c\": container with ID starting with d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.811695 5033 scope.go:117] "RemoveContainer" containerID="0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.811995 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889\": container with ID starting with 0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889 not found: ID does not exist" containerID="0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.812019 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889"} err="failed to get container status \"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889\": rpc error: code = NotFound desc = could not find container \"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889\": container with ID starting with 0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.812036 5033 scope.go:117] "RemoveContainer" 
containerID="c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.812629 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5\": container with ID starting with c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5 not found: ID does not exist" containerID="c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.812651 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5"} err="failed to get container status \"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5\": rpc error: code = NotFound desc = could not find container \"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5\": container with ID starting with c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.812672 5033 scope.go:117] "RemoveContainer" containerID="7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.812966 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933\": container with ID starting with 7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933 not found: ID does not exist" containerID="7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.812995 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933"} err="failed to get container status \"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933\": rpc error: code = NotFound desc = could not find container \"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933\": container with ID starting with 7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.813013 5033 scope.go:117] "RemoveContainer" containerID="77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.813381 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca\": container with ID starting with 77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca not found: ID does not exist" containerID="77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.813449 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca"} err="failed to get container status \"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca\": rpc error: code = NotFound desc = could not find container \"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca\": container with ID starting with 77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.813482 5033 scope.go:117] "RemoveContainer" containerID="df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.813906 5033 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05\": container with ID starting with df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05 not found: ID does not exist" containerID="df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.813932 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05"} err="failed to get container status \"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05\": rpc error: code = NotFound desc = could not find container \"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05\": container with ID starting with df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.813951 5033 scope.go:117] "RemoveContainer" containerID="10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.814458 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f\": container with ID starting with 10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f not found: ID does not exist" containerID="10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.814495 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f"} err="failed to get container status \"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f\": rpc error: code = NotFound desc = could not find container 
\"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f\": container with ID starting with 10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.814519 5033 scope.go:117] "RemoveContainer" containerID="496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b" Feb 26 15:54:32 crc kubenswrapper[5033]: E0226 15:54:32.815017 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\": container with ID starting with 496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b not found: ID does not exist" containerID="496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.815051 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b"} err="failed to get container status \"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\": rpc error: code = NotFound desc = could not find container \"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\": container with ID starting with 496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.815072 5033 scope.go:117] "RemoveContainer" containerID="0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.815410 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4"} err="failed to get container status \"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4\": rpc error: code = NotFound desc = could not find 
container \"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4\": container with ID starting with 0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.815435 5033 scope.go:117] "RemoveContainer" containerID="d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.815710 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c"} err="failed to get container status \"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c\": rpc error: code = NotFound desc = could not find container \"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c\": container with ID starting with d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.815735 5033 scope.go:117] "RemoveContainer" containerID="0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.816074 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889"} err="failed to get container status \"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889\": rpc error: code = NotFound desc = could not find container \"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889\": container with ID starting with 0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.816095 5033 scope.go:117] "RemoveContainer" containerID="c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.816357 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5"} err="failed to get container status \"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5\": rpc error: code = NotFound desc = could not find container \"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5\": container with ID starting with c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.816393 5033 scope.go:117] "RemoveContainer" containerID="7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.816658 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933"} err="failed to get container status \"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933\": rpc error: code = NotFound desc = could not find container \"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933\": container with ID starting with 7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.816680 5033 scope.go:117] "RemoveContainer" containerID="77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.816889 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca"} err="failed to get container status \"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca\": rpc error: code = NotFound desc = could not find container \"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca\": container with ID starting with 
77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.816910 5033 scope.go:117] "RemoveContainer" containerID="df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.817126 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05"} err="failed to get container status \"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05\": rpc error: code = NotFound desc = could not find container \"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05\": container with ID starting with df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.817152 5033 scope.go:117] "RemoveContainer" containerID="10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.817403 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f"} err="failed to get container status \"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f\": rpc error: code = NotFound desc = could not find container \"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f\": container with ID starting with 10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.817426 5033 scope.go:117] "RemoveContainer" containerID="496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.817971 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b"} err="failed to get container status \"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\": rpc error: code = NotFound desc = could not find container \"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\": container with ID starting with 496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.818007 5033 scope.go:117] "RemoveContainer" containerID="0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.818393 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4"} err="failed to get container status \"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4\": rpc error: code = NotFound desc = could not find container \"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4\": container with ID starting with 0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.818423 5033 scope.go:117] "RemoveContainer" containerID="d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.818669 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c"} err="failed to get container status \"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c\": rpc error: code = NotFound desc = could not find container \"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c\": container with ID starting with d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c not found: ID does not 
exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.818691 5033 scope.go:117] "RemoveContainer" containerID="0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.818980 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889"} err="failed to get container status \"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889\": rpc error: code = NotFound desc = could not find container \"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889\": container with ID starting with 0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.819002 5033 scope.go:117] "RemoveContainer" containerID="c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.819230 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5"} err="failed to get container status \"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5\": rpc error: code = NotFound desc = could not find container \"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5\": container with ID starting with c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.819267 5033 scope.go:117] "RemoveContainer" containerID="7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.819483 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933"} err="failed to get container status 
\"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933\": rpc error: code = NotFound desc = could not find container \"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933\": container with ID starting with 7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.819505 5033 scope.go:117] "RemoveContainer" containerID="77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.819697 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca"} err="failed to get container status \"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca\": rpc error: code = NotFound desc = could not find container \"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca\": container with ID starting with 77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.819721 5033 scope.go:117] "RemoveContainer" containerID="df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.819959 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05"} err="failed to get container status \"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05\": rpc error: code = NotFound desc = could not find container \"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05\": container with ID starting with df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.819983 5033 scope.go:117] "RemoveContainer" 
containerID="10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.820622 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f"} err="failed to get container status \"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f\": rpc error: code = NotFound desc = could not find container \"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f\": container with ID starting with 10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.820647 5033 scope.go:117] "RemoveContainer" containerID="496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.820847 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b"} err="failed to get container status \"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\": rpc error: code = NotFound desc = could not find container \"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\": container with ID starting with 496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.820869 5033 scope.go:117] "RemoveContainer" containerID="0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.821094 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4"} err="failed to get container status \"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4\": rpc error: code = NotFound desc = could 
not find container \"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4\": container with ID starting with 0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.821116 5033 scope.go:117] "RemoveContainer" containerID="d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.821562 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c"} err="failed to get container status \"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c\": rpc error: code = NotFound desc = could not find container \"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c\": container with ID starting with d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.821600 5033 scope.go:117] "RemoveContainer" containerID="0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.822029 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889"} err="failed to get container status \"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889\": rpc error: code = NotFound desc = could not find container \"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889\": container with ID starting with 0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.822062 5033 scope.go:117] "RemoveContainer" containerID="c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 
15:54:32.822522 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5"} err="failed to get container status \"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5\": rpc error: code = NotFound desc = could not find container \"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5\": container with ID starting with c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.822546 5033 scope.go:117] "RemoveContainer" containerID="7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.822761 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933"} err="failed to get container status \"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933\": rpc error: code = NotFound desc = could not find container \"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933\": container with ID starting with 7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.822780 5033 scope.go:117] "RemoveContainer" containerID="77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.822971 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca"} err="failed to get container status \"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca\": rpc error: code = NotFound desc = could not find container \"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca\": container with ID starting with 
77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.822989 5033 scope.go:117] "RemoveContainer" containerID="df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.823423 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05"} err="failed to get container status \"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05\": rpc error: code = NotFound desc = could not find container \"df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05\": container with ID starting with df8ce19cca11a955d3b9a8a8e5602b30ec3b042b891dd26935648bd0efeecf05 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.823441 5033 scope.go:117] "RemoveContainer" containerID="10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.823632 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f"} err="failed to get container status \"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f\": rpc error: code = NotFound desc = could not find container \"10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f\": container with ID starting with 10e6476db3175fe310df317b05bcfd69687316da0fd25e31015598895c795f6f not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.823645 5033 scope.go:117] "RemoveContainer" containerID="496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.823920 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b"} err="failed to get container status \"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\": rpc error: code = NotFound desc = could not find container \"496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b\": container with ID starting with 496085f007301996518d522b00c76099fcfa6a37881e7f664315440c52e4802b not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.823953 5033 scope.go:117] "RemoveContainer" containerID="0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.824293 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4"} err="failed to get container status \"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4\": rpc error: code = NotFound desc = could not find container \"0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4\": container with ID starting with 0b9a9cffc4d62bdbf5101379f3a6443b9971c9ede64191e2ed7ec654a92b64b4 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.824315 5033 scope.go:117] "RemoveContainer" containerID="d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.824543 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c"} err="failed to get container status \"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c\": rpc error: code = NotFound desc = could not find container \"d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c\": container with ID starting with d46ea2a44b64d5071c88a9a7e41474b8dbe141eaef9ff51a7a4a11a5976ae45c not found: ID does not 
exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.824570 5033 scope.go:117] "RemoveContainer" containerID="0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.824770 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889"} err="failed to get container status \"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889\": rpc error: code = NotFound desc = could not find container \"0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889\": container with ID starting with 0349c7d00055eb156080dcaa996bbc21a77deaaa80513cd72748288194eaf889 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.824792 5033 scope.go:117] "RemoveContainer" containerID="c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.825006 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5"} err="failed to get container status \"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5\": rpc error: code = NotFound desc = could not find container \"c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5\": container with ID starting with c081254837461c2198509cb109d220b8ce1b11fbb7d0392a8aef61fe9ae7f2f5 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.825029 5033 scope.go:117] "RemoveContainer" containerID="7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.825494 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933"} err="failed to get container status 
\"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933\": rpc error: code = NotFound desc = could not find container \"7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933\": container with ID starting with 7d9be822f307426de661f5e4995620142c44a52cf5442e47a5c6b34719694933 not found: ID does not exist" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.825517 5033 scope.go:117] "RemoveContainer" containerID="77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca" Feb 26 15:54:32 crc kubenswrapper[5033]: I0226 15:54:32.825753 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca"} err="failed to get container status \"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca\": rpc error: code = NotFound desc = could not find container \"77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca\": container with ID starting with 77ab2f1a7247ac4bf74a565d0fd9e9875b4786fb8f642925162347d16bb2f4ca not found: ID does not exist" Feb 26 15:54:33 crc kubenswrapper[5033]: I0226 15:54:33.582751 5033 generic.go:334] "Generic (PLEG): container finished" podID="7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35" containerID="9e96834d20646283d96f8c2befcbb51ec4940c1f50575526210440550c14af15" exitCode=0 Feb 26 15:54:33 crc kubenswrapper[5033]: I0226 15:54:33.583137 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" event={"ID":"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35","Type":"ContainerDied","Data":"9e96834d20646283d96f8c2befcbb51ec4940c1f50575526210440550c14af15"} Feb 26 15:54:33 crc kubenswrapper[5033]: I0226 15:54:33.587955 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2h9gn_e974dc06-da1a-49ae-af6e-80b3436df208/kube-multus/0.log" Feb 26 15:54:33 crc kubenswrapper[5033]: I0226 15:54:33.588122 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-2h9gn" event={"ID":"e974dc06-da1a-49ae-af6e-80b3436df208","Type":"ContainerStarted","Data":"73c4910a9ccfc41221d79bd2ced0d1a37464ee59e343dc1a26c929982a3b2555"} Feb 26 15:54:33 crc kubenswrapper[5033]: I0226 15:54:33.638562 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e439a58-bd20-4da1-96e5-7e6a4329fd6b" path="/var/lib/kubelet/pods/6e439a58-bd20-4da1-96e5-7e6a4329fd6b/volumes" Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.466981 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d2xpz"] Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.468214 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.585690 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-utilities\") pod \"certified-operators-d2xpz\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") " pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.585776 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt4x4\" (UniqueName: \"kubernetes.io/projected/04af002b-5b03-4a03-9baf-858770be46d4-kube-api-access-lt4x4\") pod \"certified-operators-d2xpz\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") " pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.585953 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-catalog-content\") pod \"certified-operators-d2xpz\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") " 
pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.601961 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" event={"ID":"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35","Type":"ContainerStarted","Data":"f5d314e59c95ab2c032b2b1e576da0074257c14b9715cab0874f19b8ad49dbc0"} Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.602018 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" event={"ID":"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35","Type":"ContainerStarted","Data":"6708e4384e2e274c6493457083d30527169feb42edf7825d5e6216b09f89b6dc"} Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.602036 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" event={"ID":"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35","Type":"ContainerStarted","Data":"c5d1a862370fe94c0d0d78ec69fd94ce2d2a06262b745876ac9f7f2a51294fb9"} Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.602053 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" event={"ID":"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35","Type":"ContainerStarted","Data":"7969d803fd59d8668f8557057e61a239831eb6495521493f50dab88da78c1aaf"} Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.602071 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" event={"ID":"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35","Type":"ContainerStarted","Data":"57322d443eb4f57509940ec1ca5fbde7c77cd1dfb3d570e282c439cc40988b4e"} Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.602085 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" event={"ID":"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35","Type":"ContainerStarted","Data":"1df8713a3a8bf3d6baabe7aa5a6b719e2c87ad45ba779b17949290c60e3740e5"} Feb 26 15:54:34 crc 
kubenswrapper[5033]: I0226 15:54:34.687612 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-catalog-content\") pod \"certified-operators-d2xpz\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") " pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.687686 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-utilities\") pod \"certified-operators-d2xpz\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") " pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.687735 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt4x4\" (UniqueName: \"kubernetes.io/projected/04af002b-5b03-4a03-9baf-858770be46d4-kube-api-access-lt4x4\") pod \"certified-operators-d2xpz\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") " pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.688169 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-utilities\") pod \"certified-operators-d2xpz\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") " pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.688203 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-catalog-content\") pod \"certified-operators-d2xpz\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") " pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.713965 
5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt4x4\" (UniqueName: \"kubernetes.io/projected/04af002b-5b03-4a03-9baf-858770be46d4-kube-api-access-lt4x4\") pod \"certified-operators-d2xpz\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") " pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: I0226 15:54:34.781854 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: E0226 15:54:34.801105 5033 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-d2xpz_openshift-marketplace_04af002b-5b03-4a03-9baf-858770be46d4_0(2309c6866aa3f6d3fff95b35ca18e5a1506d85d88a34e2eb3475a9665209b775): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:54:34 crc kubenswrapper[5033]: E0226 15:54:34.801181 5033 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-d2xpz_openshift-marketplace_04af002b-5b03-4a03-9baf-858770be46d4_0(2309c6866aa3f6d3fff95b35ca18e5a1506d85d88a34e2eb3475a9665209b775): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: E0226 15:54:34.801209 5033 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-d2xpz_openshift-marketplace_04af002b-5b03-4a03-9baf-858770be46d4_0(2309c6866aa3f6d3fff95b35ca18e5a1506d85d88a34e2eb3475a9665209b775): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:34 crc kubenswrapper[5033]: E0226 15:54:34.801293 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-d2xpz_openshift-marketplace(04af002b-5b03-4a03-9baf-858770be46d4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-d2xpz_openshift-marketplace(04af002b-5b03-4a03-9baf-858770be46d4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-d2xpz_openshift-marketplace_04af002b-5b03-4a03-9baf-858770be46d4_0(2309c6866aa3f6d3fff95b35ca18e5a1506d85d88a34e2eb3475a9665209b775): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/certified-operators-d2xpz" podUID="04af002b-5b03-4a03-9baf-858770be46d4" Feb 26 15:54:36 crc kubenswrapper[5033]: I0226 15:54:36.616725 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" event={"ID":"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35","Type":"ContainerStarted","Data":"0c205b7ea67373a509db9be64d4c754c6ff3d072b8492ce94c10ba5b084e042a"} Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.348198 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-92b7r"] Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.349218 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.352071 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.352398 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.352728 5033 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xlhkr" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.355129 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.422021 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-node-mnt\") pod \"crc-storage-crc-92b7r\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.422090 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-crc-storage\") pod \"crc-storage-crc-92b7r\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.422125 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvgvc\" (UniqueName: \"kubernetes.io/projected/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-kube-api-access-rvgvc\") pod \"crc-storage-crc-92b7r\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.523805 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvgvc\" (UniqueName: \"kubernetes.io/projected/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-kube-api-access-rvgvc\") pod \"crc-storage-crc-92b7r\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.524190 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-node-mnt\") pod \"crc-storage-crc-92b7r\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.524340 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-crc-storage\") pod \"crc-storage-crc-92b7r\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.524675 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-node-mnt\") pod \"crc-storage-crc-92b7r\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.525134 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-crc-storage\") pod \"crc-storage-crc-92b7r\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.551676 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvgvc\" (UniqueName: 
\"kubernetes.io/projected/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-kube-api-access-rvgvc\") pod \"crc-storage-crc-92b7r\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: I0226 15:54:37.669382 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: E0226 15:54:37.698742 5033 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-92b7r_crc-storage_72c3abd5-9b43-4dc5-afbc-ba8b0433c512_0(ee0a5ca489e52a2c52ba6204f24bdd51d37ece88734cc05565dd271734edd9b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:54:37 crc kubenswrapper[5033]: E0226 15:54:37.698822 5033 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-92b7r_crc-storage_72c3abd5-9b43-4dc5-afbc-ba8b0433c512_0(ee0a5ca489e52a2c52ba6204f24bdd51d37ece88734cc05565dd271734edd9b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: E0226 15:54:37.698857 5033 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-92b7r_crc-storage_72c3abd5-9b43-4dc5-afbc-ba8b0433c512_0(ee0a5ca489e52a2c52ba6204f24bdd51d37ece88734cc05565dd271734edd9b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:37 crc kubenswrapper[5033]: E0226 15:54:37.698931 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-92b7r_crc-storage(72c3abd5-9b43-4dc5-afbc-ba8b0433c512)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-92b7r_crc-storage(72c3abd5-9b43-4dc5-afbc-ba8b0433c512)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-92b7r_crc-storage_72c3abd5-9b43-4dc5-afbc-ba8b0433c512_0(ee0a5ca489e52a2c52ba6204f24bdd51d37ece88734cc05565dd271734edd9b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-92b7r" podUID="72c3abd5-9b43-4dc5-afbc-ba8b0433c512" Feb 26 15:54:39 crc kubenswrapper[5033]: I0226 15:54:39.497735 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-92b7r"] Feb 26 15:54:39 crc kubenswrapper[5033]: I0226 15:54:39.498153 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:39 crc kubenswrapper[5033]: I0226 15:54:39.498670 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:39 crc kubenswrapper[5033]: I0226 15:54:39.502878 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2xpz"] Feb 26 15:54:39 crc kubenswrapper[5033]: I0226 15:54:39.503015 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:39 crc kubenswrapper[5033]: I0226 15:54:39.503561 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:39 crc kubenswrapper[5033]: E0226 15:54:39.541508 5033 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-92b7r_crc-storage_72c3abd5-9b43-4dc5-afbc-ba8b0433c512_0(af0e94106d5ee4367a0e6c24b77760ea7d8964b062efaf0160b1beeb99fe9ab8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:54:39 crc kubenswrapper[5033]: E0226 15:54:39.541789 5033 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-92b7r_crc-storage_72c3abd5-9b43-4dc5-afbc-ba8b0433c512_0(af0e94106d5ee4367a0e6c24b77760ea7d8964b062efaf0160b1beeb99fe9ab8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:39 crc kubenswrapper[5033]: E0226 15:54:39.541809 5033 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-92b7r_crc-storage_72c3abd5-9b43-4dc5-afbc-ba8b0433c512_0(af0e94106d5ee4367a0e6c24b77760ea7d8964b062efaf0160b1beeb99fe9ab8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:39 crc kubenswrapper[5033]: E0226 15:54:39.541856 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-92b7r_crc-storage(72c3abd5-9b43-4dc5-afbc-ba8b0433c512)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-92b7r_crc-storage(72c3abd5-9b43-4dc5-afbc-ba8b0433c512)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-92b7r_crc-storage_72c3abd5-9b43-4dc5-afbc-ba8b0433c512_0(af0e94106d5ee4367a0e6c24b77760ea7d8964b062efaf0160b1beeb99fe9ab8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-92b7r" podUID="72c3abd5-9b43-4dc5-afbc-ba8b0433c512" Feb 26 15:54:39 crc kubenswrapper[5033]: E0226 15:54:39.552523 5033 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-d2xpz_openshift-marketplace_04af002b-5b03-4a03-9baf-858770be46d4_0(1b1ee9b51634bf09278d2c7e3faccd75aaba5a5464431c7d03f416eb4451d84f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:54:39 crc kubenswrapper[5033]: E0226 15:54:39.552556 5033 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-d2xpz_openshift-marketplace_04af002b-5b03-4a03-9baf-858770be46d4_0(1b1ee9b51634bf09278d2c7e3faccd75aaba5a5464431c7d03f416eb4451d84f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:39 crc kubenswrapper[5033]: E0226 15:54:39.552578 5033 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-d2xpz_openshift-marketplace_04af002b-5b03-4a03-9baf-858770be46d4_0(1b1ee9b51634bf09278d2c7e3faccd75aaba5a5464431c7d03f416eb4451d84f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:39 crc kubenswrapper[5033]: E0226 15:54:39.552632 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-d2xpz_openshift-marketplace(04af002b-5b03-4a03-9baf-858770be46d4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-d2xpz_openshift-marketplace(04af002b-5b03-4a03-9baf-858770be46d4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-d2xpz_openshift-marketplace_04af002b-5b03-4a03-9baf-858770be46d4_0(1b1ee9b51634bf09278d2c7e3faccd75aaba5a5464431c7d03f416eb4451d84f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/certified-operators-d2xpz" podUID="04af002b-5b03-4a03-9baf-858770be46d4" Feb 26 15:54:39 crc kubenswrapper[5033]: I0226 15:54:39.639158 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" event={"ID":"7f2a45b6-b9b4-4ed8-a290-f04b9e59ee35","Type":"ContainerStarted","Data":"30cf97c47565f7c4d1c204c5e798f2ace2be7e0c9943686bfcb3f64c5e5c1fcc"} Feb 26 15:54:39 crc kubenswrapper[5033]: I0226 15:54:39.639685 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:39 crc kubenswrapper[5033]: I0226 15:54:39.639836 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:39 crc kubenswrapper[5033]: I0226 15:54:39.672229 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" podStartSLOduration=7.672211075 podStartE2EDuration="7.672211075s" podCreationTimestamp="2026-02-26 15:54:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:54:39.672195565 +0000 UTC m=+806.884794215" watchObservedRunningTime="2026-02-26 15:54:39.672211075 +0000 UTC m=+806.884809725" Feb 26 15:54:39 crc kubenswrapper[5033]: I0226 15:54:39.677603 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:40 crc kubenswrapper[5033]: I0226 15:54:40.644787 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:40 crc kubenswrapper[5033]: I0226 15:54:40.684625 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp" Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 
15:54:50.188962 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 15:54:50.189521 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 15:54:50.189564 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 15:54:50.190193 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ea382d5339c87a85284b67b2964d5c78dfbd09c05b9d81cf3fc9df0642c2fcc"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 15:54:50.190250 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://8ea382d5339c87a85284b67b2964d5c78dfbd09c05b9d81cf3fc9df0642c2fcc" gracePeriod=600 Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 15:54:50.627576 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 15:54:50.628692 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2xpz" Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 15:54:50.715239 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="8ea382d5339c87a85284b67b2964d5c78dfbd09c05b9d81cf3fc9df0642c2fcc" exitCode=0 Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 15:54:50.715301 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"8ea382d5339c87a85284b67b2964d5c78dfbd09c05b9d81cf3fc9df0642c2fcc"} Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 15:54:50.715331 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"0364f13db4a9c5dfb04881c280d818ef2c52b6cfbe5f9db1cd51cf7111f504ed"} Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 15:54:50.715352 5033 scope.go:117] "RemoveContainer" containerID="70e42a035ba4f49db38bd22aeb347145c8b4c34b93000efaa8e848e164bec4f1" Feb 26 15:54:50 crc kubenswrapper[5033]: I0226 15:54:50.892220 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2xpz"] Feb 26 15:54:50 crc kubenswrapper[5033]: W0226 15:54:50.899401 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04af002b_5b03_4a03_9baf_858770be46d4.slice/crio-bee0abb2b44f7e02fe7ccc4beeaa9be3dc6cffae811cacc5946f01bbb43163ab WatchSource:0}: Error finding container bee0abb2b44f7e02fe7ccc4beeaa9be3dc6cffae811cacc5946f01bbb43163ab: Status 404 returned error can't 
find the container with id bee0abb2b44f7e02fe7ccc4beeaa9be3dc6cffae811cacc5946f01bbb43163ab Feb 26 15:54:51 crc kubenswrapper[5033]: I0226 15:54:51.725104 5033 generic.go:334] "Generic (PLEG): container finished" podID="04af002b-5b03-4a03-9baf-858770be46d4" containerID="aa50b95991a47a66bf1a0a781647357ce5bdd3e155d851f84fa4b4c2e368e4a8" exitCode=0 Feb 26 15:54:51 crc kubenswrapper[5033]: I0226 15:54:51.725303 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2xpz" event={"ID":"04af002b-5b03-4a03-9baf-858770be46d4","Type":"ContainerDied","Data":"aa50b95991a47a66bf1a0a781647357ce5bdd3e155d851f84fa4b4c2e368e4a8"} Feb 26 15:54:51 crc kubenswrapper[5033]: I0226 15:54:51.725706 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2xpz" event={"ID":"04af002b-5b03-4a03-9baf-858770be46d4","Type":"ContainerStarted","Data":"bee0abb2b44f7e02fe7ccc4beeaa9be3dc6cffae811cacc5946f01bbb43163ab"} Feb 26 15:54:53 crc kubenswrapper[5033]: I0226 15:54:53.628046 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:53 crc kubenswrapper[5033]: I0226 15:54:53.631591 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:53 crc kubenswrapper[5033]: I0226 15:54:53.840586 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-92b7r"] Feb 26 15:54:53 crc kubenswrapper[5033]: W0226 15:54:53.847457 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72c3abd5_9b43_4dc5_afbc_ba8b0433c512.slice/crio-2469696417bbcd72e5c1a1f777a85c75f5de2c74fe3679b8ce2baf95364e8835 WatchSource:0}: Error finding container 2469696417bbcd72e5c1a1f777a85c75f5de2c74fe3679b8ce2baf95364e8835: Status 404 returned error can't find the container with id 2469696417bbcd72e5c1a1f777a85c75f5de2c74fe3679b8ce2baf95364e8835 Feb 26 15:54:54 crc kubenswrapper[5033]: I0226 15:54:54.755642 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-92b7r" event={"ID":"72c3abd5-9b43-4dc5-afbc-ba8b0433c512","Type":"ContainerStarted","Data":"2469696417bbcd72e5c1a1f777a85c75f5de2c74fe3679b8ce2baf95364e8835"} Feb 26 15:54:54 crc kubenswrapper[5033]: I0226 15:54:54.757482 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2xpz" event={"ID":"04af002b-5b03-4a03-9baf-858770be46d4","Type":"ContainerStarted","Data":"85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c"} Feb 26 15:54:55 crc kubenswrapper[5033]: I0226 15:54:55.764755 5033 generic.go:334] "Generic (PLEG): container finished" podID="72c3abd5-9b43-4dc5-afbc-ba8b0433c512" containerID="a47c29de880901af71fb92193bb4ce5792e9f58369fcaa326aa5a05f90ca5b12" exitCode=0 Feb 26 15:54:55 crc kubenswrapper[5033]: I0226 15:54:55.764984 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-92b7r" event={"ID":"72c3abd5-9b43-4dc5-afbc-ba8b0433c512","Type":"ContainerDied","Data":"a47c29de880901af71fb92193bb4ce5792e9f58369fcaa326aa5a05f90ca5b12"} Feb 26 15:54:55 crc 
kubenswrapper[5033]: I0226 15:54:55.767541 5033 generic.go:334] "Generic (PLEG): container finished" podID="04af002b-5b03-4a03-9baf-858770be46d4" containerID="85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c" exitCode=0 Feb 26 15:54:55 crc kubenswrapper[5033]: I0226 15:54:55.767605 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2xpz" event={"ID":"04af002b-5b03-4a03-9baf-858770be46d4","Type":"ContainerDied","Data":"85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c"} Feb 26 15:54:56 crc kubenswrapper[5033]: I0226 15:54:56.778185 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2xpz" event={"ID":"04af002b-5b03-4a03-9baf-858770be46d4","Type":"ContainerStarted","Data":"de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165"} Feb 26 15:54:56 crc kubenswrapper[5033]: I0226 15:54:56.802528 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d2xpz" podStartSLOduration=17.954849302 podStartE2EDuration="22.802500834s" podCreationTimestamp="2026-02-26 15:54:34 +0000 UTC" firstStartedPulling="2026-02-26 15:54:51.728726212 +0000 UTC m=+818.941324902" lastFinishedPulling="2026-02-26 15:54:56.576377784 +0000 UTC m=+823.788976434" observedRunningTime="2026-02-26 15:54:56.796953831 +0000 UTC m=+824.009552491" watchObservedRunningTime="2026-02-26 15:54:56.802500834 +0000 UTC m=+824.015099494" Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.056395 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-92b7r" Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.204785 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-node-mnt\") pod \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.204848 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvgvc\" (UniqueName: \"kubernetes.io/projected/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-kube-api-access-rvgvc\") pod \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.204894 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-crc-storage\") pod \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\" (UID: \"72c3abd5-9b43-4dc5-afbc-ba8b0433c512\") " Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.204892 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "72c3abd5-9b43-4dc5-afbc-ba8b0433c512" (UID: "72c3abd5-9b43-4dc5-afbc-ba8b0433c512"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.205185 5033 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.211550 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-kube-api-access-rvgvc" (OuterVolumeSpecName: "kube-api-access-rvgvc") pod "72c3abd5-9b43-4dc5-afbc-ba8b0433c512" (UID: "72c3abd5-9b43-4dc5-afbc-ba8b0433c512"). InnerVolumeSpecName "kube-api-access-rvgvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.218295 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "72c3abd5-9b43-4dc5-afbc-ba8b0433c512" (UID: "72c3abd5-9b43-4dc5-afbc-ba8b0433c512"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.306040 5033 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.306073 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvgvc\" (UniqueName: \"kubernetes.io/projected/72c3abd5-9b43-4dc5-afbc-ba8b0433c512-kube-api-access-rvgvc\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.786173 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-92b7r" event={"ID":"72c3abd5-9b43-4dc5-afbc-ba8b0433c512","Type":"ContainerDied","Data":"2469696417bbcd72e5c1a1f777a85c75f5de2c74fe3679b8ce2baf95364e8835"} Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.786228 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2469696417bbcd72e5c1a1f777a85c75f5de2c74fe3679b8ce2baf95364e8835" Feb 26 15:54:57 crc kubenswrapper[5033]: I0226 15:54:57.786185 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-92b7r"
Feb 26 15:55:02 crc kubenswrapper[5033]: I0226 15:55:02.429235 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gvxkp"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.472790 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"]
Feb 26 15:55:04 crc kubenswrapper[5033]: E0226 15:55:04.473005 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c3abd5-9b43-4dc5-afbc-ba8b0433c512" containerName="storage"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.473019 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c3abd5-9b43-4dc5-afbc-ba8b0433c512" containerName="storage"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.473142 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="72c3abd5-9b43-4dc5-afbc-ba8b0433c512" containerName="storage"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.473888 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.475798 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.482325 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"]
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.603029 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.603152 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwr8d\" (UniqueName: \"kubernetes.io/projected/fed827dc-359a-4184-bd34-e6c223ab63e2-kube-api-access-lwr8d\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.603225 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.704978 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.705375 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.705469 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwr8d\" (UniqueName: \"kubernetes.io/projected/fed827dc-359a-4184-bd34-e6c223ab63e2-kube-api-access-lwr8d\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.705805 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.706198 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.728718 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwr8d\" (UniqueName: \"kubernetes.io/projected/fed827dc-359a-4184-bd34-e6c223ab63e2-kube-api-access-lwr8d\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.782803 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d2xpz"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.782867 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d2xpz"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.788520 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.831039 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d2xpz"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.871992 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d2xpz"
Feb 26 15:55:04 crc kubenswrapper[5033]: I0226 15:55:04.972305 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"]
Feb 26 15:55:05 crc kubenswrapper[5033]: I0226 15:55:05.832717 5033 generic.go:334] "Generic (PLEG): container finished" podID="fed827dc-359a-4184-bd34-e6c223ab63e2" containerID="943753588ba18016c7b93cf5e58d718f5ad5e4105d381488ab5378d462214e0f" exitCode=0
Feb 26 15:55:05 crc kubenswrapper[5033]: I0226 15:55:05.832845 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk" event={"ID":"fed827dc-359a-4184-bd34-e6c223ab63e2","Type":"ContainerDied","Data":"943753588ba18016c7b93cf5e58d718f5ad5e4105d381488ab5378d462214e0f"}
Feb 26 15:55:05 crc kubenswrapper[5033]: I0226 15:55:05.833018 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk" event={"ID":"fed827dc-359a-4184-bd34-e6c223ab63e2","Type":"ContainerStarted","Data":"8b5bf2b50c9b4763681378cc84d1ccd4e1908422c44330e73ecf72e9a792541f"}
Feb 26 15:55:05 crc kubenswrapper[5033]: I0226 15:55:05.835448 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.397149 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nr2f5"]
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.399569 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.414278 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nr2f5"]
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.545070 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hrb8\" (UniqueName: \"kubernetes.io/projected/3c4352e1-549e-47bc-82a4-189f7d219414-kube-api-access-8hrb8\") pod \"redhat-operators-nr2f5\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") " pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.545503 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-catalog-content\") pod \"redhat-operators-nr2f5\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") " pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.545577 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-utilities\") pod \"redhat-operators-nr2f5\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") " pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.646808 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-catalog-content\") pod \"redhat-operators-nr2f5\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") " pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.646846 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-utilities\") pod \"redhat-operators-nr2f5\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") " pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.646922 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hrb8\" (UniqueName: \"kubernetes.io/projected/3c4352e1-549e-47bc-82a4-189f7d219414-kube-api-access-8hrb8\") pod \"redhat-operators-nr2f5\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") " pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.647618 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-utilities\") pod \"redhat-operators-nr2f5\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") " pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.647966 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-catalog-content\") pod \"redhat-operators-nr2f5\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") " pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.667063 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hrb8\" (UniqueName: \"kubernetes.io/projected/3c4352e1-549e-47bc-82a4-189f7d219414-kube-api-access-8hrb8\") pod \"redhat-operators-nr2f5\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") " pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.727184 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:07 crc kubenswrapper[5033]: I0226 15:55:07.925794 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nr2f5"]
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.381499 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2xpz"]
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.382093 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d2xpz" podUID="04af002b-5b03-4a03-9baf-858770be46d4" containerName="registry-server" containerID="cri-o://de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165" gracePeriod=2
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.726956 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2xpz"
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.856318 5033 generic.go:334] "Generic (PLEG): container finished" podID="04af002b-5b03-4a03-9baf-858770be46d4" containerID="de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165" exitCode=0
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.856399 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2xpz" event={"ID":"04af002b-5b03-4a03-9baf-858770be46d4","Type":"ContainerDied","Data":"de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165"}
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.856422 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2xpz"
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.856784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2xpz" event={"ID":"04af002b-5b03-4a03-9baf-858770be46d4","Type":"ContainerDied","Data":"bee0abb2b44f7e02fe7ccc4beeaa9be3dc6cffae811cacc5946f01bbb43163ab"}
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.856862 5033 scope.go:117] "RemoveContainer" containerID="de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165"
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.858658 5033 generic.go:334] "Generic (PLEG): container finished" podID="3c4352e1-549e-47bc-82a4-189f7d219414" containerID="0c58478ce61909f82614df73ac715aaa0e5ccfda68a5fbe781b15f1c729ee479" exitCode=0
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.858723 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nr2f5" event={"ID":"3c4352e1-549e-47bc-82a4-189f7d219414","Type":"ContainerDied","Data":"0c58478ce61909f82614df73ac715aaa0e5ccfda68a5fbe781b15f1c729ee479"}
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.858822 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nr2f5" event={"ID":"3c4352e1-549e-47bc-82a4-189f7d219414","Type":"ContainerStarted","Data":"c153922cc37ec76bae3e8cf3bcb95bf5e675eccb49aa979a969bc34548cd3357"}
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.863859 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-utilities\") pod \"04af002b-5b03-4a03-9baf-858770be46d4\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") "
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.863914 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-catalog-content\") pod \"04af002b-5b03-4a03-9baf-858770be46d4\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") "
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.863938 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt4x4\" (UniqueName: \"kubernetes.io/projected/04af002b-5b03-4a03-9baf-858770be46d4-kube-api-access-lt4x4\") pod \"04af002b-5b03-4a03-9baf-858770be46d4\" (UID: \"04af002b-5b03-4a03-9baf-858770be46d4\") "
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.864848 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-utilities" (OuterVolumeSpecName: "utilities") pod "04af002b-5b03-4a03-9baf-858770be46d4" (UID: "04af002b-5b03-4a03-9baf-858770be46d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.875794 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04af002b-5b03-4a03-9baf-858770be46d4-kube-api-access-lt4x4" (OuterVolumeSpecName: "kube-api-access-lt4x4") pod "04af002b-5b03-4a03-9baf-858770be46d4" (UID: "04af002b-5b03-4a03-9baf-858770be46d4"). InnerVolumeSpecName "kube-api-access-lt4x4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.884384 5033 scope.go:117] "RemoveContainer" containerID="85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c"
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.908312 5033 scope.go:117] "RemoveContainer" containerID="aa50b95991a47a66bf1a0a781647357ce5bdd3e155d851f84fa4b4c2e368e4a8"
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.922578 5033 scope.go:117] "RemoveContainer" containerID="de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165"
Feb 26 15:55:08 crc kubenswrapper[5033]: E0226 15:55:08.923570 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165\": container with ID starting with de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165 not found: ID does not exist" containerID="de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165"
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.923662 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165"} err="failed to get container status \"de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165\": rpc error: code = NotFound desc = could not find container \"de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165\": container with ID starting with de3b775540b1927a55cd7dd41801c38eb85b615d7d945b48cdc3101614edb165 not found: ID does not exist"
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.923695 5033 scope.go:117] "RemoveContainer" containerID="85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c"
Feb 26 15:55:08 crc kubenswrapper[5033]: E0226 15:55:08.924518 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c\": container with ID starting with 85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c not found: ID does not exist" containerID="85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c"
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.924563 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c"} err="failed to get container status \"85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c\": rpc error: code = NotFound desc = could not find container \"85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c\": container with ID starting with 85ac8c312b31a8c2bef5bab0777d814b3ad0730a68039d330622b167df57f44c not found: ID does not exist"
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.924588 5033 scope.go:117] "RemoveContainer" containerID="aa50b95991a47a66bf1a0a781647357ce5bdd3e155d851f84fa4b4c2e368e4a8"
Feb 26 15:55:08 crc kubenswrapper[5033]: E0226 15:55:08.924880 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa50b95991a47a66bf1a0a781647357ce5bdd3e155d851f84fa4b4c2e368e4a8\": container with ID starting with aa50b95991a47a66bf1a0a781647357ce5bdd3e155d851f84fa4b4c2e368e4a8 not found: ID does not exist" containerID="aa50b95991a47a66bf1a0a781647357ce5bdd3e155d851f84fa4b4c2e368e4a8"
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.924916 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa50b95991a47a66bf1a0a781647357ce5bdd3e155d851f84fa4b4c2e368e4a8"} err="failed to get container status \"aa50b95991a47a66bf1a0a781647357ce5bdd3e155d851f84fa4b4c2e368e4a8\": rpc error: code = NotFound desc = could not find container \"aa50b95991a47a66bf1a0a781647357ce5bdd3e155d851f84fa4b4c2e368e4a8\": container with ID starting with aa50b95991a47a66bf1a0a781647357ce5bdd3e155d851f84fa4b4c2e368e4a8 not found: ID does not exist"
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.947475 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04af002b-5b03-4a03-9baf-858770be46d4" (UID: "04af002b-5b03-4a03-9baf-858770be46d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.965021 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.965045 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04af002b-5b03-4a03-9baf-858770be46d4-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 15:55:08 crc kubenswrapper[5033]: I0226 15:55:08.965058 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt4x4\" (UniqueName: \"kubernetes.io/projected/04af002b-5b03-4a03-9baf-858770be46d4-kube-api-access-lt4x4\") on node \"crc\" DevicePath \"\""
Feb 26 15:55:09 crc kubenswrapper[5033]: I0226 15:55:09.212137 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2xpz"]
Feb 26 15:55:09 crc kubenswrapper[5033]: I0226 15:55:09.216666 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d2xpz"]
Feb 26 15:55:09 crc kubenswrapper[5033]: I0226 15:55:09.637963 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04af002b-5b03-4a03-9baf-858770be46d4" path="/var/lib/kubelet/pods/04af002b-5b03-4a03-9baf-858770be46d4/volumes"
Feb 26 15:55:11 crc kubenswrapper[5033]: I0226 15:55:11.879436 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nr2f5" event={"ID":"3c4352e1-549e-47bc-82a4-189f7d219414","Type":"ContainerStarted","Data":"66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2"}
Feb 26 15:55:11 crc kubenswrapper[5033]: I0226 15:55:11.883618 5033 generic.go:334] "Generic (PLEG): container finished" podID="fed827dc-359a-4184-bd34-e6c223ab63e2" containerID="cd0b8424e8f300f6a62b3a53b8672b60105228286b6feceb8d927e5cc178639f" exitCode=0
Feb 26 15:55:11 crc kubenswrapper[5033]: I0226 15:55:11.883665 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk" event={"ID":"fed827dc-359a-4184-bd34-e6c223ab63e2","Type":"ContainerDied","Data":"cd0b8424e8f300f6a62b3a53b8672b60105228286b6feceb8d927e5cc178639f"}
Feb 26 15:55:12 crc kubenswrapper[5033]: I0226 15:55:12.892087 5033 generic.go:334] "Generic (PLEG): container finished" podID="3c4352e1-549e-47bc-82a4-189f7d219414" containerID="66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2" exitCode=0
Feb 26 15:55:12 crc kubenswrapper[5033]: I0226 15:55:12.892188 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nr2f5" event={"ID":"3c4352e1-549e-47bc-82a4-189f7d219414","Type":"ContainerDied","Data":"66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2"}
Feb 26 15:55:12 crc kubenswrapper[5033]: I0226 15:55:12.897610 5033 generic.go:334] "Generic (PLEG): container finished" podID="fed827dc-359a-4184-bd34-e6c223ab63e2" containerID="dbe631e923a649350aa8b5617458470b9f1214e184226da3d8c484c9a198ef73" exitCode=0
Feb 26 15:55:12 crc kubenswrapper[5033]: I0226 15:55:12.897651 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk" event={"ID":"fed827dc-359a-4184-bd34-e6c223ab63e2","Type":"ContainerDied","Data":"dbe631e923a649350aa8b5617458470b9f1214e184226da3d8c484c9a198ef73"}
Feb 26 15:55:13 crc kubenswrapper[5033]: I0226 15:55:13.905810 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nr2f5" event={"ID":"3c4352e1-549e-47bc-82a4-189f7d219414","Type":"ContainerStarted","Data":"54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be"}
Feb 26 15:55:13 crc kubenswrapper[5033]: I0226 15:55:13.928693 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nr2f5" podStartSLOduration=2.519017266 podStartE2EDuration="6.928676399s" podCreationTimestamp="2026-02-26 15:55:07 +0000 UTC" firstStartedPulling="2026-02-26 15:55:08.860668062 +0000 UTC m=+836.073266712" lastFinishedPulling="2026-02-26 15:55:13.270327155 +0000 UTC m=+840.482925845" observedRunningTime="2026-02-26 15:55:13.928270198 +0000 UTC m=+841.140868848" watchObservedRunningTime="2026-02-26 15:55:13.928676399 +0000 UTC m=+841.141275049"
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.104633 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.204361 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-util\") pod \"fed827dc-359a-4184-bd34-e6c223ab63e2\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") "
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.204426 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwr8d\" (UniqueName: \"kubernetes.io/projected/fed827dc-359a-4184-bd34-e6c223ab63e2-kube-api-access-lwr8d\") pod \"fed827dc-359a-4184-bd34-e6c223ab63e2\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") "
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.204454 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-bundle\") pod \"fed827dc-359a-4184-bd34-e6c223ab63e2\" (UID: \"fed827dc-359a-4184-bd34-e6c223ab63e2\") "
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.205007 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-bundle" (OuterVolumeSpecName: "bundle") pod "fed827dc-359a-4184-bd34-e6c223ab63e2" (UID: "fed827dc-359a-4184-bd34-e6c223ab63e2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.212477 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed827dc-359a-4184-bd34-e6c223ab63e2-kube-api-access-lwr8d" (OuterVolumeSpecName: "kube-api-access-lwr8d") pod "fed827dc-359a-4184-bd34-e6c223ab63e2" (UID: "fed827dc-359a-4184-bd34-e6c223ab63e2"). InnerVolumeSpecName "kube-api-access-lwr8d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.220140 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-util" (OuterVolumeSpecName: "util") pod "fed827dc-359a-4184-bd34-e6c223ab63e2" (UID: "fed827dc-359a-4184-bd34-e6c223ab63e2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.306048 5033 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.306093 5033 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fed827dc-359a-4184-bd34-e6c223ab63e2-util\") on node \"crc\" DevicePath \"\""
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.306105 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwr8d\" (UniqueName: \"kubernetes.io/projected/fed827dc-359a-4184-bd34-e6c223ab63e2-kube-api-access-lwr8d\") on node \"crc\" DevicePath \"\""
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.914598 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk" event={"ID":"fed827dc-359a-4184-bd34-e6c223ab63e2","Type":"ContainerDied","Data":"8b5bf2b50c9b4763681378cc84d1ccd4e1908422c44330e73ecf72e9a792541f"}
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.914665 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b5bf2b50c9b4763681378cc84d1ccd4e1908422c44330e73ecf72e9a792541f"
Feb 26 15:55:14 crc kubenswrapper[5033]: I0226 15:55:14.914921 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk"
Feb 26 15:55:17 crc kubenswrapper[5033]: I0226 15:55:17.727948 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:17 crc kubenswrapper[5033]: I0226 15:55:17.728026 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.297627 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-px2gk"]
Feb 26 15:55:18 crc kubenswrapper[5033]: E0226 15:55:18.297851 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed827dc-359a-4184-bd34-e6c223ab63e2" containerName="pull"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.297866 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed827dc-359a-4184-bd34-e6c223ab63e2" containerName="pull"
Feb 26 15:55:18 crc kubenswrapper[5033]: E0226 15:55:18.297879 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04af002b-5b03-4a03-9baf-858770be46d4" containerName="extract-content"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.297887 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="04af002b-5b03-4a03-9baf-858770be46d4" containerName="extract-content"
Feb 26 15:55:18 crc kubenswrapper[5033]: E0226 15:55:18.297904 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed827dc-359a-4184-bd34-e6c223ab63e2" containerName="util"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.297913 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed827dc-359a-4184-bd34-e6c223ab63e2" containerName="util"
Feb 26 15:55:18 crc kubenswrapper[5033]: E0226 15:55:18.297925 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04af002b-5b03-4a03-9baf-858770be46d4" containerName="extract-utilities"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.297932 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="04af002b-5b03-4a03-9baf-858770be46d4" containerName="extract-utilities"
Feb 26 15:55:18 crc kubenswrapper[5033]: E0226 15:55:18.297943 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed827dc-359a-4184-bd34-e6c223ab63e2" containerName="extract"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.297950 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed827dc-359a-4184-bd34-e6c223ab63e2" containerName="extract"
Feb 26 15:55:18 crc kubenswrapper[5033]: E0226 15:55:18.297961 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04af002b-5b03-4a03-9baf-858770be46d4" containerName="registry-server"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.297968 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="04af002b-5b03-4a03-9baf-858770be46d4" containerName="registry-server"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.298085 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed827dc-359a-4184-bd34-e6c223ab63e2" containerName="extract"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.298099 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="04af002b-5b03-4a03-9baf-858770be46d4" containerName="registry-server"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.298530 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-px2gk"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.300305 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bg9rg"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.300608 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.302934 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.312803 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-px2gk"]
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.364087 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-869s5\" (UniqueName: \"kubernetes.io/projected/c31544e8-ba06-46d8-880f-cf5fdd7eb899-kube-api-access-869s5\") pod \"nmstate-operator-75c5dccd6c-px2gk\" (UID: \"c31544e8-ba06-46d8-880f-cf5fdd7eb899\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-px2gk"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.464936 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-869s5\" (UniqueName: \"kubernetes.io/projected/c31544e8-ba06-46d8-880f-cf5fdd7eb899-kube-api-access-869s5\") pod \"nmstate-operator-75c5dccd6c-px2gk\" (UID: \"c31544e8-ba06-46d8-880f-cf5fdd7eb899\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-px2gk"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.484870 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-869s5\" (UniqueName: \"kubernetes.io/projected/c31544e8-ba06-46d8-880f-cf5fdd7eb899-kube-api-access-869s5\") pod \"nmstate-operator-75c5dccd6c-px2gk\" (UID: \"c31544e8-ba06-46d8-880f-cf5fdd7eb899\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-px2gk"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.648652 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-px2gk"
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.787630 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nr2f5" podUID="3c4352e1-549e-47bc-82a4-189f7d219414" containerName="registry-server" probeResult="failure" output=<
Feb 26 15:55:18 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s
Feb 26 15:55:18 crc kubenswrapper[5033]: >
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.874746 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-px2gk"]
Feb 26 15:55:18 crc kubenswrapper[5033]: I0226 15:55:18.940382 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-px2gk" event={"ID":"c31544e8-ba06-46d8-880f-cf5fdd7eb899","Type":"ContainerStarted","Data":"d64a5a5111dcdbe607cbed77773ba8e08bf23237bf7e11c7d4e6a05e685d2f38"}
Feb 26 15:55:21 crc kubenswrapper[5033]: I0226 15:55:21.973083 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-px2gk" event={"ID":"c31544e8-ba06-46d8-880f-cf5fdd7eb899","Type":"ContainerStarted","Data":"9ca5be503d7f6a72aa9d5bdb191cb9c90672d06329779169198e3de6e0fbe970"}
Feb 26 15:55:21 crc kubenswrapper[5033]: I0226 15:55:21.999202 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-px2gk" podStartSLOduration=1.846162321 podStartE2EDuration="3.999175603s" podCreationTimestamp="2026-02-26 15:55:18 +0000 UTC" firstStartedPulling="2026-02-26 15:55:18.893713852 +0000 UTC m=+846.106312512" lastFinishedPulling="2026-02-26 15:55:21.046727144 +0000 UTC m=+848.259325794" observedRunningTime="2026-02-26 15:55:21.99379019 +0000 UTC m=+849.206388880" watchObservedRunningTime="2026-02-26 15:55:21.999175603 +0000 UTC m=+849.211774283"
Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.647282 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-pntj4"]
Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.649378 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-pntj4"
Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.651819 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wk4bm"
Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.658826 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-pntj4"]
Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.667999 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl"]
Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.668873 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl"
Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.672072 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.681215 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl"]
Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.699562 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-sxftv"]
Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.700892 5033 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.783346 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x"] Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.784166 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.786356 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.786419 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mmqmc" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.787654 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.827556 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x"] Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.845997 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5dadc038-c65b-4800-9ca4-e3a879aeae2d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-2hrjl\" (UID: \"5dadc038-c65b-4800-9ca4-e3a879aeae2d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.846048 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqrlb\" (UniqueName: \"kubernetes.io/projected/5dadc038-c65b-4800-9ca4-e3a879aeae2d-kube-api-access-cqrlb\") pod \"nmstate-webhook-786f45cff4-2hrjl\" (UID: \"5dadc038-c65b-4800-9ca4-e3a879aeae2d\") " 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.846474 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-nmstate-lock\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.846509 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-dbus-socket\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.846591 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l2v8\" (UniqueName: \"kubernetes.io/projected/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-kube-api-access-2l2v8\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.846625 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-ovs-socket\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.846678 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t54g\" (UniqueName: \"kubernetes.io/projected/46ef3bc4-83e8-4ec0-a5ba-af3f2a567726-kube-api-access-4t54g\") pod \"nmstate-metrics-69594cc75-pntj4\" (UID: 
\"46ef3bc4-83e8-4ec0-a5ba-af3f2a567726\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-pntj4" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948133 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5dadc038-c65b-4800-9ca4-e3a879aeae2d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-2hrjl\" (UID: \"5dadc038-c65b-4800-9ca4-e3a879aeae2d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948184 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5ed0fdc7-c54e-4cc9-bb77-249396ebea11-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-fhq9x\" (UID: \"5ed0fdc7-c54e-4cc9-bb77-249396ebea11\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948223 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqrlb\" (UniqueName: \"kubernetes.io/projected/5dadc038-c65b-4800-9ca4-e3a879aeae2d-kube-api-access-cqrlb\") pod \"nmstate-webhook-786f45cff4-2hrjl\" (UID: \"5dadc038-c65b-4800-9ca4-e3a879aeae2d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948282 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed0fdc7-c54e-4cc9-bb77-249396ebea11-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-fhq9x\" (UID: \"5ed0fdc7-c54e-4cc9-bb77-249396ebea11\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948312 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-nmstate-lock\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: E0226 15:55:24.948355 5033 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948431 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-dbus-socket\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: E0226 15:55:24.948469 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dadc038-c65b-4800-9ca4-e3a879aeae2d-tls-key-pair podName:5dadc038-c65b-4800-9ca4-e3a879aeae2d nodeName:}" failed. No retries permitted until 2026-02-26 15:55:25.448442375 +0000 UTC m=+852.661041025 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/5dadc038-c65b-4800-9ca4-e3a879aeae2d-tls-key-pair") pod "nmstate-webhook-786f45cff4-2hrjl" (UID: "5dadc038-c65b-4800-9ca4-e3a879aeae2d") : secret "openshift-nmstate-webhook" not found Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948495 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l2v8\" (UniqueName: \"kubernetes.io/projected/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-kube-api-access-2l2v8\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948523 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-ovs-socket\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948474 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-nmstate-lock\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948571 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqrd7\" (UniqueName: \"kubernetes.io/projected/5ed0fdc7-c54e-4cc9-bb77-249396ebea11-kube-api-access-wqrd7\") pod \"nmstate-console-plugin-5dcbbd79cf-fhq9x\" (UID: \"5ed0fdc7-c54e-4cc9-bb77-249396ebea11\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948602 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-4t54g\" (UniqueName: \"kubernetes.io/projected/46ef3bc4-83e8-4ec0-a5ba-af3f2a567726-kube-api-access-4t54g\") pod \"nmstate-metrics-69594cc75-pntj4\" (UID: \"46ef3bc4-83e8-4ec0-a5ba-af3f2a567726\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-pntj4" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948639 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-ovs-socket\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.948716 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-dbus-socket\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.976842 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-586b779c96-gzltf"] Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.977667 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.981860 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqrlb\" (UniqueName: \"kubernetes.io/projected/5dadc038-c65b-4800-9ca4-e3a879aeae2d-kube-api-access-cqrlb\") pod \"nmstate-webhook-786f45cff4-2hrjl\" (UID: \"5dadc038-c65b-4800-9ca4-e3a879aeae2d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.987074 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t54g\" (UniqueName: \"kubernetes.io/projected/46ef3bc4-83e8-4ec0-a5ba-af3f2a567726-kube-api-access-4t54g\") pod \"nmstate-metrics-69594cc75-pntj4\" (UID: \"46ef3bc4-83e8-4ec0-a5ba-af3f2a567726\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-pntj4" Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.991636 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-586b779c96-gzltf"] Feb 26 15:55:24 crc kubenswrapper[5033]: I0226 15:55:24.997319 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l2v8\" (UniqueName: \"kubernetes.io/projected/d8a7b715-5bb0-4163-ab06-3aeff0badbd6-kube-api-access-2l2v8\") pod \"nmstate-handler-sxftv\" (UID: \"d8a7b715-5bb0-4163-ab06-3aeff0badbd6\") " pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.026101 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-sxftv" Feb 26 15:55:25 crc kubenswrapper[5033]: W0226 15:55:25.045390 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8a7b715_5bb0_4163_ab06_3aeff0badbd6.slice/crio-0234ec2a1bef0ed27a12fe17b6023bb7c3b41691e9df0c2122a667ad6d2000af WatchSource:0}: Error finding container 0234ec2a1bef0ed27a12fe17b6023bb7c3b41691e9df0c2122a667ad6d2000af: Status 404 returned error can't find the container with id 0234ec2a1bef0ed27a12fe17b6023bb7c3b41691e9df0c2122a667ad6d2000af Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.049207 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5db7f67c-1e4e-479f-8889-de7491c851fb-console-serving-cert\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.050534 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-service-ca\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.050611 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5ed0fdc7-c54e-4cc9-bb77-249396ebea11-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-fhq9x\" (UID: \"5ed0fdc7-c54e-4cc9-bb77-249396ebea11\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.050669 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5db7f67c-1e4e-479f-8889-de7491c851fb-console-oauth-config\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.050724 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed0fdc7-c54e-4cc9-bb77-249396ebea11-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-fhq9x\" (UID: \"5ed0fdc7-c54e-4cc9-bb77-249396ebea11\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.050779 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-trusted-ca-bundle\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.050813 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k6dz\" (UniqueName: \"kubernetes.io/projected/5db7f67c-1e4e-479f-8889-de7491c851fb-kube-api-access-4k6dz\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.050840 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-oauth-serving-cert\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 
15:55:25.050865 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-console-config\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.050897 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqrd7\" (UniqueName: \"kubernetes.io/projected/5ed0fdc7-c54e-4cc9-bb77-249396ebea11-kube-api-access-wqrd7\") pod \"nmstate-console-plugin-5dcbbd79cf-fhq9x\" (UID: \"5ed0fdc7-c54e-4cc9-bb77-249396ebea11\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.053289 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5ed0fdc7-c54e-4cc9-bb77-249396ebea11-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-fhq9x\" (UID: \"5ed0fdc7-c54e-4cc9-bb77-249396ebea11\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.055622 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed0fdc7-c54e-4cc9-bb77-249396ebea11-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-fhq9x\" (UID: \"5ed0fdc7-c54e-4cc9-bb77-249396ebea11\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.069623 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqrd7\" (UniqueName: \"kubernetes.io/projected/5ed0fdc7-c54e-4cc9-bb77-249396ebea11-kube-api-access-wqrd7\") pod \"nmstate-console-plugin-5dcbbd79cf-fhq9x\" (UID: \"5ed0fdc7-c54e-4cc9-bb77-249396ebea11\") " 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.098618 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.151598 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5db7f67c-1e4e-479f-8889-de7491c851fb-console-oauth-config\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.151679 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-trusted-ca-bundle\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.151703 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k6dz\" (UniqueName: \"kubernetes.io/projected/5db7f67c-1e4e-479f-8889-de7491c851fb-kube-api-access-4k6dz\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.151724 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-console-config\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.151739 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-oauth-serving-cert\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.151766 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5db7f67c-1e4e-479f-8889-de7491c851fb-console-serving-cert\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.151784 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-service-ca\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.152684 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-service-ca\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.152763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-console-config\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.152841 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-oauth-serving-cert\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.153551 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db7f67c-1e4e-479f-8889-de7491c851fb-trusted-ca-bundle\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.155358 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5db7f67c-1e4e-479f-8889-de7491c851fb-console-oauth-config\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.155510 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5db7f67c-1e4e-479f-8889-de7491c851fb-console-serving-cert\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.174434 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k6dz\" (UniqueName: \"kubernetes.io/projected/5db7f67c-1e4e-479f-8889-de7491c851fb-kube-api-access-4k6dz\") pod \"console-586b779c96-gzltf\" (UID: \"5db7f67c-1e4e-479f-8889-de7491c851fb\") " pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.262900 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x"] Feb 26 15:55:25 crc 
kubenswrapper[5033]: I0226 15:55:25.266916 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-pntj4" Feb 26 15:55:25 crc kubenswrapper[5033]: W0226 15:55:25.269211 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ed0fdc7_c54e_4cc9_bb77_249396ebea11.slice/crio-e6771483fb5c8d88bd8ceacecba9732d6f039fc841c4ab3cfe02920ac41d03c5 WatchSource:0}: Error finding container e6771483fb5c8d88bd8ceacecba9732d6f039fc841c4ab3cfe02920ac41d03c5: Status 404 returned error can't find the container with id e6771483fb5c8d88bd8ceacecba9732d6f039fc841c4ab3cfe02920ac41d03c5 Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.338250 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-586b779c96-gzltf" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.455186 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5dadc038-c65b-4800-9ca4-e3a879aeae2d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-2hrjl\" (UID: \"5dadc038-c65b-4800-9ca4-e3a879aeae2d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.459400 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5dadc038-c65b-4800-9ca4-e3a879aeae2d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-2hrjl\" (UID: \"5dadc038-c65b-4800-9ca4-e3a879aeae2d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl" Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.476471 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-pntj4"] Feb 26 15:55:25 crc kubenswrapper[5033]: W0226 15:55:25.486049 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ef3bc4_83e8_4ec0_a5ba_af3f2a567726.slice/crio-2e632d8c301b7a404401bca4673bc4941b20279be6e22e5332612b6fdad5a752 WatchSource:0}: Error finding container 2e632d8c301b7a404401bca4673bc4941b20279be6e22e5332612b6fdad5a752: Status 404 returned error can't find the container with id 2e632d8c301b7a404401bca4673bc4941b20279be6e22e5332612b6fdad5a752
Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.525520 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-586b779c96-gzltf"]
Feb 26 15:55:25 crc kubenswrapper[5033]: W0226 15:55:25.527810 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5db7f67c_1e4e_479f_8889_de7491c851fb.slice/crio-eccf05b73637868d4651d230412ddbff27d4c0c34d3fd87481731bc681fbed11 WatchSource:0}: Error finding container eccf05b73637868d4651d230412ddbff27d4c0c34d3fd87481731bc681fbed11: Status 404 returned error can't find the container with id eccf05b73637868d4651d230412ddbff27d4c0c34d3fd87481731bc681fbed11
Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.584857 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl"
Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.769106 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl"]
Feb 26 15:55:25 crc kubenswrapper[5033]: W0226 15:55:25.772714 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dadc038_c65b_4800_9ca4_e3a879aeae2d.slice/crio-e7f6da6ad93d4ff5bc46e4005a96dba0f9ab6e492d2f7f95aff91cd946ff0c7e WatchSource:0}: Error finding container e7f6da6ad93d4ff5bc46e4005a96dba0f9ab6e492d2f7f95aff91cd946ff0c7e: Status 404 returned error can't find the container with id e7f6da6ad93d4ff5bc46e4005a96dba0f9ab6e492d2f7f95aff91cd946ff0c7e
Feb 26 15:55:25 crc kubenswrapper[5033]: I0226 15:55:25.998460 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sxftv" event={"ID":"d8a7b715-5bb0-4163-ab06-3aeff0badbd6","Type":"ContainerStarted","Data":"0234ec2a1bef0ed27a12fe17b6023bb7c3b41691e9df0c2122a667ad6d2000af"}
Feb 26 15:55:26 crc kubenswrapper[5033]: I0226 15:55:26.002387 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" event={"ID":"5ed0fdc7-c54e-4cc9-bb77-249396ebea11","Type":"ContainerStarted","Data":"e6771483fb5c8d88bd8ceacecba9732d6f039fc841c4ab3cfe02920ac41d03c5"}
Feb 26 15:55:26 crc kubenswrapper[5033]: I0226 15:55:26.003981 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-pntj4" event={"ID":"46ef3bc4-83e8-4ec0-a5ba-af3f2a567726","Type":"ContainerStarted","Data":"2e632d8c301b7a404401bca4673bc4941b20279be6e22e5332612b6fdad5a752"}
Feb 26 15:55:26 crc kubenswrapper[5033]: I0226 15:55:26.006007 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl" event={"ID":"5dadc038-c65b-4800-9ca4-e3a879aeae2d","Type":"ContainerStarted","Data":"e7f6da6ad93d4ff5bc46e4005a96dba0f9ab6e492d2f7f95aff91cd946ff0c7e"}
Feb 26 15:55:26 crc kubenswrapper[5033]: I0226 15:55:26.007779 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-586b779c96-gzltf" event={"ID":"5db7f67c-1e4e-479f-8889-de7491c851fb","Type":"ContainerStarted","Data":"07aa3f430054d358c9ba504fb62d1d9873c69e908a6d4034ccc2d8d4c941c21c"}
Feb 26 15:55:26 crc kubenswrapper[5033]: I0226 15:55:26.008449 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-586b779c96-gzltf" event={"ID":"5db7f67c-1e4e-479f-8889-de7491c851fb","Type":"ContainerStarted","Data":"eccf05b73637868d4651d230412ddbff27d4c0c34d3fd87481731bc681fbed11"}
Feb 26 15:55:26 crc kubenswrapper[5033]: I0226 15:55:26.040697 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-586b779c96-gzltf" podStartSLOduration=2.040623165 podStartE2EDuration="2.040623165s" podCreationTimestamp="2026-02-26 15:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:55:26.035325594 +0000 UTC m=+853.247924314" watchObservedRunningTime="2026-02-26 15:55:26.040623165 +0000 UTC m=+853.253221865"
Feb 26 15:55:27 crc kubenswrapper[5033]: I0226 15:55:27.776923 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:27 crc kubenswrapper[5033]: I0226 15:55:27.817879 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:28 crc kubenswrapper[5033]: I0226 15:55:28.011481 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nr2f5"]
Feb 26 15:55:28 crc kubenswrapper[5033]: I0226 15:55:28.021575 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" event={"ID":"5ed0fdc7-c54e-4cc9-bb77-249396ebea11","Type":"ContainerStarted","Data":"d25c3789a068cb1fe26ff1c2bf0fa348c636b8b500510e63fd3b88d83a75cccc"}
Feb 26 15:55:28 crc kubenswrapper[5033]: I0226 15:55:28.032759 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-fhq9x" podStartSLOduration=1.8665077399999999 podStartE2EDuration="4.032745151s" podCreationTimestamp="2026-02-26 15:55:24 +0000 UTC" firstStartedPulling="2026-02-26 15:55:25.274244378 +0000 UTC m=+852.486843028" lastFinishedPulling="2026-02-26 15:55:27.440481799 +0000 UTC m=+854.653080439" observedRunningTime="2026-02-26 15:55:28.032503746 +0000 UTC m=+855.245102426" watchObservedRunningTime="2026-02-26 15:55:28.032745151 +0000 UTC m=+855.245343801"
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.029841 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sxftv" event={"ID":"d8a7b715-5bb0-4163-ab06-3aeff0badbd6","Type":"ContainerStarted","Data":"870ba13b1d7442a2b37e43d62ce876b6b38aafb7355e761e34d1c56fd1f3fb1f"}
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.030191 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-sxftv"
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.031960 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-pntj4" event={"ID":"46ef3bc4-83e8-4ec0-a5ba-af3f2a567726","Type":"ContainerStarted","Data":"e49412239c4462a0264cb478b604248120418c392b6197c6b16209fc32c89ce3"}
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.034237 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl" event={"ID":"5dadc038-c65b-4800-9ca4-e3a879aeae2d","Type":"ContainerStarted","Data":"56658ea4eff93b5f107f04c2d653fc36132dd07dda0a161cc59d2eb138b9d489"}
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.034550 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl"
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.034791 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nr2f5" podUID="3c4352e1-549e-47bc-82a4-189f7d219414" containerName="registry-server" containerID="cri-o://54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be" gracePeriod=2
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.057359 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-sxftv" podStartSLOduration=1.437216688 podStartE2EDuration="5.057335563s" podCreationTimestamp="2026-02-26 15:55:24 +0000 UTC" firstStartedPulling="2026-02-26 15:55:25.048818879 +0000 UTC m=+852.261417529" lastFinishedPulling="2026-02-26 15:55:28.668937674 +0000 UTC m=+855.881536404" observedRunningTime="2026-02-26 15:55:29.054349569 +0000 UTC m=+856.266948239" watchObservedRunningTime="2026-02-26 15:55:29.057335563 +0000 UTC m=+856.269934213"
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.075192 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl" podStartSLOduration=2.504911979 podStartE2EDuration="5.075156486s" podCreationTimestamp="2026-02-26 15:55:24 +0000 UTC" firstStartedPulling="2026-02-26 15:55:25.774525625 +0000 UTC m=+852.987124275" lastFinishedPulling="2026-02-26 15:55:28.344770122 +0000 UTC m=+855.557368782" observedRunningTime="2026-02-26 15:55:29.072758456 +0000 UTC m=+856.285357106" watchObservedRunningTime="2026-02-26 15:55:29.075156486 +0000 UTC m=+856.287755136"
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.475268 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.518822 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-utilities\") pod \"3c4352e1-549e-47bc-82a4-189f7d219414\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") "
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.518906 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-catalog-content\") pod \"3c4352e1-549e-47bc-82a4-189f7d219414\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") "
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.518939 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hrb8\" (UniqueName: \"kubernetes.io/projected/3c4352e1-549e-47bc-82a4-189f7d219414-kube-api-access-8hrb8\") pod \"3c4352e1-549e-47bc-82a4-189f7d219414\" (UID: \"3c4352e1-549e-47bc-82a4-189f7d219414\") "
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.536453 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c4352e1-549e-47bc-82a4-189f7d219414-kube-api-access-8hrb8" (OuterVolumeSpecName: "kube-api-access-8hrb8") pod "3c4352e1-549e-47bc-82a4-189f7d219414" (UID: "3c4352e1-549e-47bc-82a4-189f7d219414"). InnerVolumeSpecName "kube-api-access-8hrb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.537849 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-utilities" (OuterVolumeSpecName: "utilities") pod "3c4352e1-549e-47bc-82a4-189f7d219414" (UID: "3c4352e1-549e-47bc-82a4-189f7d219414"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.620204 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.620235 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hrb8\" (UniqueName: \"kubernetes.io/projected/3c4352e1-549e-47bc-82a4-189f7d219414-kube-api-access-8hrb8\") on node \"crc\" DevicePath \"\""
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.650294 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c4352e1-549e-47bc-82a4-189f7d219414" (UID: "3c4352e1-549e-47bc-82a4-189f7d219414"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:55:29 crc kubenswrapper[5033]: I0226 15:55:29.721172 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c4352e1-549e-47bc-82a4-189f7d219414-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.043133 5033 generic.go:334] "Generic (PLEG): container finished" podID="3c4352e1-549e-47bc-82a4-189f7d219414" containerID="54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be" exitCode=0
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.043845 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nr2f5"
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.043882 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nr2f5" event={"ID":"3c4352e1-549e-47bc-82a4-189f7d219414","Type":"ContainerDied","Data":"54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be"}
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.043961 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nr2f5" event={"ID":"3c4352e1-549e-47bc-82a4-189f7d219414","Type":"ContainerDied","Data":"c153922cc37ec76bae3e8cf3bcb95bf5e675eccb49aa979a969bc34548cd3357"}
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.044013 5033 scope.go:117] "RemoveContainer" containerID="54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be"
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.075654 5033 scope.go:117] "RemoveContainer" containerID="66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2"
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.096934 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nr2f5"]
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.104080 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nr2f5"]
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.119208 5033 scope.go:117] "RemoveContainer" containerID="0c58478ce61909f82614df73ac715aaa0e5ccfda68a5fbe781b15f1c729ee479"
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.146811 5033 scope.go:117] "RemoveContainer" containerID="54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be"
Feb 26 15:55:30 crc kubenswrapper[5033]: E0226 15:55:30.148800 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be\": container with ID starting with 54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be not found: ID does not exist" containerID="54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be"
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.148843 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be"} err="failed to get container status \"54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be\": rpc error: code = NotFound desc = could not find container \"54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be\": container with ID starting with 54ae8afae6b24115a8577769026a7c01c06bf1d575274e302f9a2c02020f06be not found: ID does not exist"
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.148877 5033 scope.go:117] "RemoveContainer" containerID="66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2"
Feb 26 15:55:30 crc kubenswrapper[5033]: E0226 15:55:30.149677 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2\": container with ID starting with 66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2 not found: ID does not exist" containerID="66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2"
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.149727 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2"} err="failed to get container status \"66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2\": rpc error: code = NotFound desc = could not find container \"66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2\": container with ID starting with 66c2fcd4c1525e560598c559095c16b1be45a25fdcffc1cc2560af4015e6e5c2 not found: ID does not exist"
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.149764 5033 scope.go:117] "RemoveContainer" containerID="0c58478ce61909f82614df73ac715aaa0e5ccfda68a5fbe781b15f1c729ee479"
Feb 26 15:55:30 crc kubenswrapper[5033]: E0226 15:55:30.150661 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c58478ce61909f82614df73ac715aaa0e5ccfda68a5fbe781b15f1c729ee479\": container with ID starting with 0c58478ce61909f82614df73ac715aaa0e5ccfda68a5fbe781b15f1c729ee479 not found: ID does not exist" containerID="0c58478ce61909f82614df73ac715aaa0e5ccfda68a5fbe781b15f1c729ee479"
Feb 26 15:55:30 crc kubenswrapper[5033]: I0226 15:55:30.150822 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c58478ce61909f82614df73ac715aaa0e5ccfda68a5fbe781b15f1c729ee479"} err="failed to get container status \"0c58478ce61909f82614df73ac715aaa0e5ccfda68a5fbe781b15f1c729ee479\": rpc error: code = NotFound desc = could not find container \"0c58478ce61909f82614df73ac715aaa0e5ccfda68a5fbe781b15f1c729ee479\": container with ID starting with 0c58478ce61909f82614df73ac715aaa0e5ccfda68a5fbe781b15f1c729ee479 not found: ID does not exist"
Feb 26 15:55:31 crc kubenswrapper[5033]: I0226 15:55:31.642419 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c4352e1-549e-47bc-82a4-189f7d219414" path="/var/lib/kubelet/pods/3c4352e1-549e-47bc-82a4-189f7d219414/volumes"
Feb 26 15:55:32 crc kubenswrapper[5033]: I0226 15:55:32.065581 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-pntj4" event={"ID":"46ef3bc4-83e8-4ec0-a5ba-af3f2a567726","Type":"ContainerStarted","Data":"cf84ca8c39d7a07b5eae88c5637a5d337f9f7871b90e81f8c296025768303cd2"}
Feb 26 15:55:32 crc kubenswrapper[5033]: I0226 15:55:32.090414 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-pntj4" podStartSLOduration=2.569901534 podStartE2EDuration="8.090399236s" podCreationTimestamp="2026-02-26 15:55:24 +0000 UTC" firstStartedPulling="2026-02-26 15:55:25.489311621 +0000 UTC m=+852.701910271" lastFinishedPulling="2026-02-26 15:55:31.009809283 +0000 UTC m=+858.222407973" observedRunningTime="2026-02-26 15:55:32.087410362 +0000 UTC m=+859.300009012" watchObservedRunningTime="2026-02-26 15:55:32.090399236 +0000 UTC m=+859.302997886"
Feb 26 15:55:35 crc kubenswrapper[5033]: I0226 15:55:35.060595 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-sxftv"
Feb 26 15:55:35 crc kubenswrapper[5033]: I0226 15:55:35.338517 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-586b779c96-gzltf"
Feb 26 15:55:35 crc kubenswrapper[5033]: I0226 15:55:35.338611 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-586b779c96-gzltf"
Feb 26 15:55:35 crc kubenswrapper[5033]: I0226 15:55:35.347082 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-586b779c96-gzltf"
Feb 26 15:55:36 crc kubenswrapper[5033]: I0226 15:55:36.111784 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-586b779c96-gzltf"
Feb 26 15:55:36 crc kubenswrapper[5033]: I0226 15:55:36.197241 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nq49v"]
Feb 26 15:55:45 crc kubenswrapper[5033]: I0226 15:55:45.589791 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-2hrjl"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.149465 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9kxz7"]
Feb 26 15:55:55 crc kubenswrapper[5033]: E0226 15:55:55.152284 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4352e1-549e-47bc-82a4-189f7d219414" containerName="extract-utilities"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.152309 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4352e1-549e-47bc-82a4-189f7d219414" containerName="extract-utilities"
Feb 26 15:55:55 crc kubenswrapper[5033]: E0226 15:55:55.152321 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4352e1-549e-47bc-82a4-189f7d219414" containerName="extract-content"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.152329 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4352e1-549e-47bc-82a4-189f7d219414" containerName="extract-content"
Feb 26 15:55:55 crc kubenswrapper[5033]: E0226 15:55:55.152338 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4352e1-549e-47bc-82a4-189f7d219414" containerName="registry-server"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.152346 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4352e1-549e-47bc-82a4-189f7d219414" containerName="registry-server"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.152509 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4352e1-549e-47bc-82a4-189f7d219414" containerName="registry-server"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.157618 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9kxz7"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.160022 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9kxz7"]
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.334687 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-catalog-content\") pod \"community-operators-9kxz7\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " pod="openshift-marketplace/community-operators-9kxz7"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.334821 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4cnl\" (UniqueName: \"kubernetes.io/projected/fd8e8fa7-36da-4870-8990-a2937072b281-kube-api-access-d4cnl\") pod \"community-operators-9kxz7\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " pod="openshift-marketplace/community-operators-9kxz7"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.334845 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-utilities\") pod \"community-operators-9kxz7\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " pod="openshift-marketplace/community-operators-9kxz7"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.435456 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-catalog-content\") pod \"community-operators-9kxz7\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " pod="openshift-marketplace/community-operators-9kxz7"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.435800 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4cnl\" (UniqueName: \"kubernetes.io/projected/fd8e8fa7-36da-4870-8990-a2937072b281-kube-api-access-d4cnl\") pod \"community-operators-9kxz7\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " pod="openshift-marketplace/community-operators-9kxz7"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.435823 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-utilities\") pod \"community-operators-9kxz7\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " pod="openshift-marketplace/community-operators-9kxz7"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.436351 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-catalog-content\") pod \"community-operators-9kxz7\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " pod="openshift-marketplace/community-operators-9kxz7"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.436487 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-utilities\") pod \"community-operators-9kxz7\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " pod="openshift-marketplace/community-operators-9kxz7"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.455822 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4cnl\" (UniqueName: \"kubernetes.io/projected/fd8e8fa7-36da-4870-8990-a2937072b281-kube-api-access-d4cnl\") pod \"community-operators-9kxz7\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " pod="openshift-marketplace/community-operators-9kxz7"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.520181 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9kxz7"
Feb 26 15:55:55 crc kubenswrapper[5033]: I0226 15:55:55.991220 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9kxz7"]
Feb 26 15:55:56 crc kubenswrapper[5033]: I0226 15:55:56.239044 5033 generic.go:334] "Generic (PLEG): container finished" podID="fd8e8fa7-36da-4870-8990-a2937072b281" containerID="b8ce707e644a2ec82c44fdc8126adb79222adc9095ad9532684bbbdc965c04cc" exitCode=0
Feb 26 15:55:56 crc kubenswrapper[5033]: I0226 15:55:56.239238 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kxz7" event={"ID":"fd8e8fa7-36da-4870-8990-a2937072b281","Type":"ContainerDied","Data":"b8ce707e644a2ec82c44fdc8126adb79222adc9095ad9532684bbbdc965c04cc"}
Feb 26 15:55:56 crc kubenswrapper[5033]: I0226 15:55:56.239354 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kxz7" event={"ID":"fd8e8fa7-36da-4870-8990-a2937072b281","Type":"ContainerStarted","Data":"3bc58214c8f54bb013c843c0295531d82ed08f17581942364ce5ae8709a386ff"}
Feb 26 15:55:57 crc kubenswrapper[5033]: I0226 15:55:57.248847 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kxz7" event={"ID":"fd8e8fa7-36da-4870-8990-a2937072b281","Type":"ContainerStarted","Data":"34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae"}
Feb 26 15:55:58 crc kubenswrapper[5033]: I0226 15:55:58.258304 5033 generic.go:334] "Generic (PLEG): container finished" podID="fd8e8fa7-36da-4870-8990-a2937072b281" containerID="34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae" exitCode=0
Feb 26 15:55:58 crc kubenswrapper[5033]: I0226 15:55:58.258946 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kxz7" event={"ID":"fd8e8fa7-36da-4870-8990-a2937072b281","Type":"ContainerDied","Data":"34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae"}
Feb 26 15:55:59 crc kubenswrapper[5033]: I0226 15:55:59.267243 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kxz7" event={"ID":"fd8e8fa7-36da-4870-8990-a2937072b281","Type":"ContainerStarted","Data":"126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0"}
Feb 26 15:55:59 crc kubenswrapper[5033]: I0226 15:55:59.286211 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9kxz7" podStartSLOduration=1.885055206 podStartE2EDuration="4.286182992s" podCreationTimestamp="2026-02-26 15:55:55 +0000 UTC" firstStartedPulling="2026-02-26 15:55:56.241978901 +0000 UTC m=+883.454577551" lastFinishedPulling="2026-02-26 15:55:58.643106687 +0000 UTC m=+885.855705337" observedRunningTime="2026-02-26 15:55:59.283075454 +0000 UTC m=+886.495674144" watchObservedRunningTime="2026-02-26 15:55:59.286182992 +0000 UTC m=+886.498781682"
Feb 26 15:56:00 crc kubenswrapper[5033]: I0226 15:56:00.142392 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535356-cjwln"]
Feb 26 15:56:00 crc kubenswrapper[5033]: I0226 15:56:00.143284 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535356-cjwln"
Feb 26 15:56:00 crc kubenswrapper[5033]: I0226 15:56:00.145549 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 15:56:00 crc kubenswrapper[5033]: I0226 15:56:00.145616 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 15:56:00 crc kubenswrapper[5033]: I0226 15:56:00.145637 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 15:56:00 crc kubenswrapper[5033]: I0226 15:56:00.147858 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535356-cjwln"]
Feb 26 15:56:00 crc kubenswrapper[5033]: I0226 15:56:00.301346 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l99zp\" (UniqueName: \"kubernetes.io/projected/012bed3c-4f40-4873-86fd-1a97de8df980-kube-api-access-l99zp\") pod \"auto-csr-approver-29535356-cjwln\" (UID: \"012bed3c-4f40-4873-86fd-1a97de8df980\") " pod="openshift-infra/auto-csr-approver-29535356-cjwln"
Feb 26 15:56:00 crc kubenswrapper[5033]: I0226 15:56:00.402961 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l99zp\" (UniqueName: \"kubernetes.io/projected/012bed3c-4f40-4873-86fd-1a97de8df980-kube-api-access-l99zp\") pod \"auto-csr-approver-29535356-cjwln\" (UID: \"012bed3c-4f40-4873-86fd-1a97de8df980\") " pod="openshift-infra/auto-csr-approver-29535356-cjwln"
Feb 26 15:56:00 crc kubenswrapper[5033]: I0226 15:56:00.433240 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l99zp\" (UniqueName: \"kubernetes.io/projected/012bed3c-4f40-4873-86fd-1a97de8df980-kube-api-access-l99zp\") pod \"auto-csr-approver-29535356-cjwln\" (UID: \"012bed3c-4f40-4873-86fd-1a97de8df980\") " pod="openshift-infra/auto-csr-approver-29535356-cjwln"
Feb 26 15:56:00 crc kubenswrapper[5033]: I0226 15:56:00.459705 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535356-cjwln"
Feb 26 15:56:00 crc kubenswrapper[5033]: I0226 15:56:00.680098 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535356-cjwln"]
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.261212 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nq49v" podUID="d6d762dc-b9e3-44fd-a835-78df146dc35a" containerName="console" containerID="cri-o://db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c" gracePeriod=15
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.282924 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535356-cjwln" event={"ID":"012bed3c-4f40-4873-86fd-1a97de8df980","Type":"ContainerStarted","Data":"2cd912cae4527b462d8a7bedaf3760b34d6728940aa44f63b5f27621c142be88"}
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.675466 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nq49v_d6d762dc-b9e3-44fd-a835-78df146dc35a/console/0.log"
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.675773 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nq49v"
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.823862 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-service-ca\") pod \"d6d762dc-b9e3-44fd-a835-78df146dc35a\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") "
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.823988 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-oauth-serving-cert\") pod \"d6d762dc-b9e3-44fd-a835-78df146dc35a\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") "
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.824079 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpqd7\" (UniqueName: \"kubernetes.io/projected/d6d762dc-b9e3-44fd-a835-78df146dc35a-kube-api-access-mpqd7\") pod \"d6d762dc-b9e3-44fd-a835-78df146dc35a\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") "
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.824127 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-serving-cert\") pod \"d6d762dc-b9e3-44fd-a835-78df146dc35a\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") "
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.824193 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-config\") pod \"d6d762dc-b9e3-44fd-a835-78df146dc35a\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") "
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.824227 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-trusted-ca-bundle\") pod \"d6d762dc-b9e3-44fd-a835-78df146dc35a\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") "
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.824298 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-oauth-config\") pod \"d6d762dc-b9e3-44fd-a835-78df146dc35a\" (UID: \"d6d762dc-b9e3-44fd-a835-78df146dc35a\") "
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.826780 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d6d762dc-b9e3-44fd-a835-78df146dc35a" (UID: "d6d762dc-b9e3-44fd-a835-78df146dc35a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.826789 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d6d762dc-b9e3-44fd-a835-78df146dc35a" (UID: "d6d762dc-b9e3-44fd-a835-78df146dc35a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.826937 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-service-ca" (OuterVolumeSpecName: "service-ca") pod "d6d762dc-b9e3-44fd-a835-78df146dc35a" (UID: "d6d762dc-b9e3-44fd-a835-78df146dc35a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.827606 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-config" (OuterVolumeSpecName: "console-config") pod "d6d762dc-b9e3-44fd-a835-78df146dc35a" (UID: "d6d762dc-b9e3-44fd-a835-78df146dc35a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.832441 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d6d762dc-b9e3-44fd-a835-78df146dc35a" (UID: "d6d762dc-b9e3-44fd-a835-78df146dc35a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.835849 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d6d762dc-b9e3-44fd-a835-78df146dc35a" (UID: "d6d762dc-b9e3-44fd-a835-78df146dc35a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.836392 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d762dc-b9e3-44fd-a835-78df146dc35a-kube-api-access-mpqd7" (OuterVolumeSpecName: "kube-api-access-mpqd7") pod "d6d762dc-b9e3-44fd-a835-78df146dc35a" (UID: "d6d762dc-b9e3-44fd-a835-78df146dc35a"). InnerVolumeSpecName "kube-api-access-mpqd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.925131 5033 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-config\") on node \"crc\" DevicePath \"\""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.925188 5033 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.925199 5033 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.925208 5033 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-service-ca\") on node \"crc\" DevicePath \"\""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.925217 5033 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6d762dc-b9e3-44fd-a835-78df146dc35a-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.925225 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpqd7\" (UniqueName: \"kubernetes.io/projected/d6d762dc-b9e3-44fd-a835-78df146dc35a-kube-api-access-mpqd7\") on node \"crc\" DevicePath \"\""
Feb 26 15:56:01 crc kubenswrapper[5033]: I0226 15:56:01.925234 5033 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d762dc-b9e3-44fd-a835-78df146dc35a-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 15:56:02 crc
kubenswrapper[5033]: I0226 15:56:02.294321 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nq49v_d6d762dc-b9e3-44fd-a835-78df146dc35a/console/0.log" Feb 26 15:56:02 crc kubenswrapper[5033]: I0226 15:56:02.294644 5033 generic.go:334] "Generic (PLEG): container finished" podID="d6d762dc-b9e3-44fd-a835-78df146dc35a" containerID="db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c" exitCode=2 Feb 26 15:56:02 crc kubenswrapper[5033]: I0226 15:56:02.294791 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nq49v" Feb 26 15:56:02 crc kubenswrapper[5033]: I0226 15:56:02.294722 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nq49v" event={"ID":"d6d762dc-b9e3-44fd-a835-78df146dc35a","Type":"ContainerDied","Data":"db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c"} Feb 26 15:56:02 crc kubenswrapper[5033]: I0226 15:56:02.294922 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nq49v" event={"ID":"d6d762dc-b9e3-44fd-a835-78df146dc35a","Type":"ContainerDied","Data":"6503095f95f4ec7e815ae94bdd698f6c1fc9811a91dcd2ab3b9587b8edf5a28f"} Feb 26 15:56:02 crc kubenswrapper[5033]: I0226 15:56:02.294948 5033 scope.go:117] "RemoveContainer" containerID="db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c" Feb 26 15:56:02 crc kubenswrapper[5033]: I0226 15:56:02.298300 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535356-cjwln" event={"ID":"012bed3c-4f40-4873-86fd-1a97de8df980","Type":"ContainerDied","Data":"708a5969434e503948662e31899db8e9de9238ce44ad3d2abada45c3bb90752c"} Feb 26 15:56:02 crc kubenswrapper[5033]: I0226 15:56:02.298246 5033 generic.go:334] "Generic (PLEG): container finished" podID="012bed3c-4f40-4873-86fd-1a97de8df980" 
containerID="708a5969434e503948662e31899db8e9de9238ce44ad3d2abada45c3bb90752c" exitCode=0 Feb 26 15:56:02 crc kubenswrapper[5033]: I0226 15:56:02.320925 5033 scope.go:117] "RemoveContainer" containerID="db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c" Feb 26 15:56:02 crc kubenswrapper[5033]: E0226 15:56:02.325011 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c\": container with ID starting with db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c not found: ID does not exist" containerID="db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c" Feb 26 15:56:02 crc kubenswrapper[5033]: I0226 15:56:02.325071 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c"} err="failed to get container status \"db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c\": rpc error: code = NotFound desc = could not find container \"db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c\": container with ID starting with db870d7bce369df7646ae59c590dee6c36c6fae771548d5ed01006684d0dae0c not found: ID does not exist" Feb 26 15:56:02 crc kubenswrapper[5033]: I0226 15:56:02.337779 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nq49v"] Feb 26 15:56:02 crc kubenswrapper[5033]: I0226 15:56:02.341512 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nq49v"] Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.635767 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6d762dc-b9e3-44fd-a835-78df146dc35a" path="/var/lib/kubelet/pods/d6d762dc-b9e3-44fd-a835-78df146dc35a/volumes" Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.647485 5033 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535356-cjwln" Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.848128 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l99zp\" (UniqueName: \"kubernetes.io/projected/012bed3c-4f40-4873-86fd-1a97de8df980-kube-api-access-l99zp\") pod \"012bed3c-4f40-4873-86fd-1a97de8df980\" (UID: \"012bed3c-4f40-4873-86fd-1a97de8df980\") " Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.872915 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/012bed3c-4f40-4873-86fd-1a97de8df980-kube-api-access-l99zp" (OuterVolumeSpecName: "kube-api-access-l99zp") pod "012bed3c-4f40-4873-86fd-1a97de8df980" (UID: "012bed3c-4f40-4873-86fd-1a97de8df980"). InnerVolumeSpecName "kube-api-access-l99zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.949756 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l99zp\" (UniqueName: \"kubernetes.io/projected/012bed3c-4f40-4873-86fd-1a97de8df980-kube-api-access-l99zp\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.992933 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv"] Feb 26 15:56:03 crc kubenswrapper[5033]: E0226 15:56:03.993289 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6d762dc-b9e3-44fd-a835-78df146dc35a" containerName="console" Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.993318 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d762dc-b9e3-44fd-a835-78df146dc35a" containerName="console" Feb 26 15:56:03 crc kubenswrapper[5033]: E0226 15:56:03.993363 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="012bed3c-4f40-4873-86fd-1a97de8df980" 
containerName="oc" Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.993376 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="012bed3c-4f40-4873-86fd-1a97de8df980" containerName="oc" Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.993539 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="012bed3c-4f40-4873-86fd-1a97de8df980" containerName="oc" Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.993568 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6d762dc-b9e3-44fd-a835-78df146dc35a" containerName="console" Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.995008 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:03 crc kubenswrapper[5033]: I0226 15:56:03.998572 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.003161 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv"] Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.051054 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rrj2\" (UniqueName: \"kubernetes.io/projected/08c209a8-a926-4b83-a443-6959c51f2d6c-kube-api-access-9rrj2\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.051531 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-bundle\") pod 
\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.051703 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.152575 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.152694 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rrj2\" (UniqueName: \"kubernetes.io/projected/08c209a8-a926-4b83-a443-6959c51f2d6c-kube-api-access-9rrj2\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.152768 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.153394 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.154544 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.176012 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rrj2\" (UniqueName: \"kubernetes.io/projected/08c209a8-a926-4b83-a443-6959c51f2d6c-kube-api-access-9rrj2\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.317772 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535356-cjwln" event={"ID":"012bed3c-4f40-4873-86fd-1a97de8df980","Type":"ContainerDied","Data":"2cd912cae4527b462d8a7bedaf3760b34d6728940aa44f63b5f27621c142be88"} Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.317817 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd912cae4527b462d8a7bedaf3760b34d6728940aa44f63b5f27621c142be88" 
Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.317833 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535356-cjwln" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.321906 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.626588 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv"] Feb 26 15:56:04 crc kubenswrapper[5033]: W0226 15:56:04.631307 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08c209a8_a926_4b83_a443_6959c51f2d6c.slice/crio-840e86559779399ce580f50dfab521201b7300ae5102350330f9649d80bc32eb WatchSource:0}: Error finding container 840e86559779399ce580f50dfab521201b7300ae5102350330f9649d80bc32eb: Status 404 returned error can't find the container with id 840e86559779399ce580f50dfab521201b7300ae5102350330f9649d80bc32eb Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.699210 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535350-45mph"] Feb 26 15:56:04 crc kubenswrapper[5033]: I0226 15:56:04.702312 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535350-45mph"] Feb 26 15:56:05 crc kubenswrapper[5033]: I0226 15:56:05.327978 5033 generic.go:334] "Generic (PLEG): container finished" podID="08c209a8-a926-4b83-a443-6959c51f2d6c" containerID="e989779e6941128f9a895a5b676aec7ad4258c585bb038bc92266d0143bdee20" exitCode=0 Feb 26 15:56:05 crc kubenswrapper[5033]: I0226 15:56:05.328228 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" 
event={"ID":"08c209a8-a926-4b83-a443-6959c51f2d6c","Type":"ContainerDied","Data":"e989779e6941128f9a895a5b676aec7ad4258c585bb038bc92266d0143bdee20"} Feb 26 15:56:05 crc kubenswrapper[5033]: I0226 15:56:05.329340 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" event={"ID":"08c209a8-a926-4b83-a443-6959c51f2d6c","Type":"ContainerStarted","Data":"840e86559779399ce580f50dfab521201b7300ae5102350330f9649d80bc32eb"} Feb 26 15:56:05 crc kubenswrapper[5033]: I0226 15:56:05.520830 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9kxz7" Feb 26 15:56:05 crc kubenswrapper[5033]: I0226 15:56:05.521017 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9kxz7" Feb 26 15:56:05 crc kubenswrapper[5033]: I0226 15:56:05.565909 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9kxz7" Feb 26 15:56:05 crc kubenswrapper[5033]: I0226 15:56:05.640357 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa22708b-29dc-4833-ae29-d89fa7465f05" path="/var/lib/kubelet/pods/aa22708b-29dc-4833-ae29-d89fa7465f05/volumes" Feb 26 15:56:06 crc kubenswrapper[5033]: I0226 15:56:06.412509 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9kxz7" Feb 26 15:56:07 crc kubenswrapper[5033]: I0226 15:56:07.356617 5033 generic.go:334] "Generic (PLEG): container finished" podID="08c209a8-a926-4b83-a443-6959c51f2d6c" containerID="512e822cd91f385bfaf659603476298e1482af082377729ffdb65d80248b663e" exitCode=0 Feb 26 15:56:07 crc kubenswrapper[5033]: I0226 15:56:07.356705 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" 
event={"ID":"08c209a8-a926-4b83-a443-6959c51f2d6c","Type":"ContainerDied","Data":"512e822cd91f385bfaf659603476298e1482af082377729ffdb65d80248b663e"} Feb 26 15:56:08 crc kubenswrapper[5033]: I0226 15:56:08.328523 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9kxz7"] Feb 26 15:56:08 crc kubenswrapper[5033]: I0226 15:56:08.366599 5033 generic.go:334] "Generic (PLEG): container finished" podID="08c209a8-a926-4b83-a443-6959c51f2d6c" containerID="e054918d7d537e88a306fea34c4781ffaafa520ec260fdefcfc6c36c896822fa" exitCode=0 Feb 26 15:56:08 crc kubenswrapper[5033]: I0226 15:56:08.366689 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" event={"ID":"08c209a8-a926-4b83-a443-6959c51f2d6c","Type":"ContainerDied","Data":"e054918d7d537e88a306fea34c4781ffaafa520ec260fdefcfc6c36c896822fa"} Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.374492 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9kxz7" podUID="fd8e8fa7-36da-4870-8990-a2937072b281" containerName="registry-server" containerID="cri-o://126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0" gracePeriod=2 Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.710857 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.739658 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rrj2\" (UniqueName: \"kubernetes.io/projected/08c209a8-a926-4b83-a443-6959c51f2d6c-kube-api-access-9rrj2\") pod \"08c209a8-a926-4b83-a443-6959c51f2d6c\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.739713 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-util\") pod \"08c209a8-a926-4b83-a443-6959c51f2d6c\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.739750 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-bundle\") pod \"08c209a8-a926-4b83-a443-6959c51f2d6c\" (UID: \"08c209a8-a926-4b83-a443-6959c51f2d6c\") " Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.740695 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-bundle" (OuterVolumeSpecName: "bundle") pod "08c209a8-a926-4b83-a443-6959c51f2d6c" (UID: "08c209a8-a926-4b83-a443-6959c51f2d6c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.752181 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c209a8-a926-4b83-a443-6959c51f2d6c-kube-api-access-9rrj2" (OuterVolumeSpecName: "kube-api-access-9rrj2") pod "08c209a8-a926-4b83-a443-6959c51f2d6c" (UID: "08c209a8-a926-4b83-a443-6959c51f2d6c"). InnerVolumeSpecName "kube-api-access-9rrj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.755722 5033 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.755767 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rrj2\" (UniqueName: \"kubernetes.io/projected/08c209a8-a926-4b83-a443-6959c51f2d6c-kube-api-access-9rrj2\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.763568 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-util" (OuterVolumeSpecName: "util") pod "08c209a8-a926-4b83-a443-6959c51f2d6c" (UID: "08c209a8-a926-4b83-a443-6959c51f2d6c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.764928 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9kxz7" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.856629 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-utilities\") pod \"fd8e8fa7-36da-4870-8990-a2937072b281\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.856678 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4cnl\" (UniqueName: \"kubernetes.io/projected/fd8e8fa7-36da-4870-8990-a2937072b281-kube-api-access-d4cnl\") pod \"fd8e8fa7-36da-4870-8990-a2937072b281\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.856803 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-catalog-content\") pod \"fd8e8fa7-36da-4870-8990-a2937072b281\" (UID: \"fd8e8fa7-36da-4870-8990-a2937072b281\") " Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.857387 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-utilities" (OuterVolumeSpecName: "utilities") pod "fd8e8fa7-36da-4870-8990-a2937072b281" (UID: "fd8e8fa7-36da-4870-8990-a2937072b281"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.857845 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.857879 5033 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08c209a8-a926-4b83-a443-6959c51f2d6c-util\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.862289 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd8e8fa7-36da-4870-8990-a2937072b281-kube-api-access-d4cnl" (OuterVolumeSpecName: "kube-api-access-d4cnl") pod "fd8e8fa7-36da-4870-8990-a2937072b281" (UID: "fd8e8fa7-36da-4870-8990-a2937072b281"). InnerVolumeSpecName "kube-api-access-d4cnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.906509 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd8e8fa7-36da-4870-8990-a2937072b281" (UID: "fd8e8fa7-36da-4870-8990-a2937072b281"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.958905 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd8e8fa7-36da-4870-8990-a2937072b281-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:09 crc kubenswrapper[5033]: I0226 15:56:09.958937 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4cnl\" (UniqueName: \"kubernetes.io/projected/fd8e8fa7-36da-4870-8990-a2937072b281-kube-api-access-d4cnl\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.394312 5033 generic.go:334] "Generic (PLEG): container finished" podID="fd8e8fa7-36da-4870-8990-a2937072b281" containerID="126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0" exitCode=0 Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.394401 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kxz7" event={"ID":"fd8e8fa7-36da-4870-8990-a2937072b281","Type":"ContainerDied","Data":"126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0"} Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.394780 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kxz7" event={"ID":"fd8e8fa7-36da-4870-8990-a2937072b281","Type":"ContainerDied","Data":"3bc58214c8f54bb013c843c0295531d82ed08f17581942364ce5ae8709a386ff"} Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.394529 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9kxz7" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.394802 5033 scope.go:117] "RemoveContainer" containerID="126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.403151 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" event={"ID":"08c209a8-a926-4b83-a443-6959c51f2d6c","Type":"ContainerDied","Data":"840e86559779399ce580f50dfab521201b7300ae5102350330f9649d80bc32eb"} Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.403206 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="840e86559779399ce580f50dfab521201b7300ae5102350330f9649d80bc32eb" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.403235 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.431975 5033 scope.go:117] "RemoveContainer" containerID="34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.465182 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9kxz7"] Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.465956 5033 scope.go:117] "RemoveContainer" containerID="b8ce707e644a2ec82c44fdc8126adb79222adc9095ad9532684bbbdc965c04cc" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.469315 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9kxz7"] Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.495457 5033 scope.go:117] "RemoveContainer" containerID="126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0" Feb 26 15:56:10 crc kubenswrapper[5033]: 
E0226 15:56:10.495964 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0\": container with ID starting with 126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0 not found: ID does not exist" containerID="126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.496014 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0"} err="failed to get container status \"126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0\": rpc error: code = NotFound desc = could not find container \"126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0\": container with ID starting with 126e886a45b3f96102f0123bbbefd4f24e9a5c91a0934f378c9ea80b97cf8fa0 not found: ID does not exist" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.496044 5033 scope.go:117] "RemoveContainer" containerID="34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae" Feb 26 15:56:10 crc kubenswrapper[5033]: E0226 15:56:10.496471 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae\": container with ID starting with 34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae not found: ID does not exist" containerID="34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.496514 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae"} err="failed to get container status \"34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae\": 
rpc error: code = NotFound desc = could not find container \"34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae\": container with ID starting with 34203b91e2826e3e3338b7093b721aa55691b9e7041171ce516db2bfe2e51bae not found: ID does not exist" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.496547 5033 scope.go:117] "RemoveContainer" containerID="b8ce707e644a2ec82c44fdc8126adb79222adc9095ad9532684bbbdc965c04cc" Feb 26 15:56:10 crc kubenswrapper[5033]: E0226 15:56:10.496839 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8ce707e644a2ec82c44fdc8126adb79222adc9095ad9532684bbbdc965c04cc\": container with ID starting with b8ce707e644a2ec82c44fdc8126adb79222adc9095ad9532684bbbdc965c04cc not found: ID does not exist" containerID="b8ce707e644a2ec82c44fdc8126adb79222adc9095ad9532684bbbdc965c04cc" Feb 26 15:56:10 crc kubenswrapper[5033]: I0226 15:56:10.496867 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ce707e644a2ec82c44fdc8126adb79222adc9095ad9532684bbbdc965c04cc"} err="failed to get container status \"b8ce707e644a2ec82c44fdc8126adb79222adc9095ad9532684bbbdc965c04cc\": rpc error: code = NotFound desc = could not find container \"b8ce707e644a2ec82c44fdc8126adb79222adc9095ad9532684bbbdc965c04cc\": container with ID starting with b8ce707e644a2ec82c44fdc8126adb79222adc9095ad9532684bbbdc965c04cc not found: ID does not exist" Feb 26 15:56:11 crc kubenswrapper[5033]: I0226 15:56:11.636941 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd8e8fa7-36da-4870-8990-a2937072b281" path="/var/lib/kubelet/pods/fd8e8fa7-36da-4870-8990-a2937072b281/volumes" Feb 26 15:56:16 crc kubenswrapper[5033]: I0226 15:56:16.309517 5033 scope.go:117] "RemoveContainer" containerID="b7d858f662b824e7399e6a2140f83edd1fd4b58e685f668288fab1993e10c90d" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.243153 5033 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn"] Feb 26 15:56:17 crc kubenswrapper[5033]: E0226 15:56:17.243655 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8e8fa7-36da-4870-8990-a2937072b281" containerName="extract-utilities" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.243670 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8e8fa7-36da-4870-8990-a2937072b281" containerName="extract-utilities" Feb 26 15:56:17 crc kubenswrapper[5033]: E0226 15:56:17.243683 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c209a8-a926-4b83-a443-6959c51f2d6c" containerName="pull" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.243691 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c209a8-a926-4b83-a443-6959c51f2d6c" containerName="pull" Feb 26 15:56:17 crc kubenswrapper[5033]: E0226 15:56:17.243705 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c209a8-a926-4b83-a443-6959c51f2d6c" containerName="extract" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.243713 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c209a8-a926-4b83-a443-6959c51f2d6c" containerName="extract" Feb 26 15:56:17 crc kubenswrapper[5033]: E0226 15:56:17.243721 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8e8fa7-36da-4870-8990-a2937072b281" containerName="extract-content" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.243727 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8e8fa7-36da-4870-8990-a2937072b281" containerName="extract-content" Feb 26 15:56:17 crc kubenswrapper[5033]: E0226 15:56:17.243736 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c209a8-a926-4b83-a443-6959c51f2d6c" containerName="util" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.243741 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="08c209a8-a926-4b83-a443-6959c51f2d6c" containerName="util" Feb 26 15:56:17 crc kubenswrapper[5033]: E0226 15:56:17.243749 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8e8fa7-36da-4870-8990-a2937072b281" containerName="registry-server" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.243756 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8e8fa7-36da-4870-8990-a2937072b281" containerName="registry-server" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.243852 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd8e8fa7-36da-4870-8990-a2937072b281" containerName="registry-server" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.243866 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c209a8-a926-4b83-a443-6959c51f2d6c" containerName="extract" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.244229 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.247169 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-mfwmt" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.247597 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.247950 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.248353 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.249244 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 26 
15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.266489 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn"] Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.347905 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41f5e1df-b55f-4861-a814-a111201ae275-apiservice-cert\") pod \"metallb-operator-controller-manager-7c4f4fbd6-mcbpn\" (UID: \"41f5e1df-b55f-4861-a814-a111201ae275\") " pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.347977 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41f5e1df-b55f-4861-a814-a111201ae275-webhook-cert\") pod \"metallb-operator-controller-manager-7c4f4fbd6-mcbpn\" (UID: \"41f5e1df-b55f-4861-a814-a111201ae275\") " pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.348017 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmgbx\" (UniqueName: \"kubernetes.io/projected/41f5e1df-b55f-4861-a814-a111201ae275-kube-api-access-cmgbx\") pod \"metallb-operator-controller-manager-7c4f4fbd6-mcbpn\" (UID: \"41f5e1df-b55f-4861-a814-a111201ae275\") " pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.449573 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41f5e1df-b55f-4861-a814-a111201ae275-apiservice-cert\") pod \"metallb-operator-controller-manager-7c4f4fbd6-mcbpn\" (UID: \"41f5e1df-b55f-4861-a814-a111201ae275\") " 
pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.449643 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41f5e1df-b55f-4861-a814-a111201ae275-webhook-cert\") pod \"metallb-operator-controller-manager-7c4f4fbd6-mcbpn\" (UID: \"41f5e1df-b55f-4861-a814-a111201ae275\") " pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.449683 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmgbx\" (UniqueName: \"kubernetes.io/projected/41f5e1df-b55f-4861-a814-a111201ae275-kube-api-access-cmgbx\") pod \"metallb-operator-controller-manager-7c4f4fbd6-mcbpn\" (UID: \"41f5e1df-b55f-4861-a814-a111201ae275\") " pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.456739 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41f5e1df-b55f-4861-a814-a111201ae275-apiservice-cert\") pod \"metallb-operator-controller-manager-7c4f4fbd6-mcbpn\" (UID: \"41f5e1df-b55f-4861-a814-a111201ae275\") " pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.457053 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41f5e1df-b55f-4861-a814-a111201ae275-webhook-cert\") pod \"metallb-operator-controller-manager-7c4f4fbd6-mcbpn\" (UID: \"41f5e1df-b55f-4861-a814-a111201ae275\") " pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.475965 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmgbx\" (UniqueName: 
\"kubernetes.io/projected/41f5e1df-b55f-4861-a814-a111201ae275-kube-api-access-cmgbx\") pod \"metallb-operator-controller-manager-7c4f4fbd6-mcbpn\" (UID: \"41f5e1df-b55f-4861-a814-a111201ae275\") " pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.488104 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv"] Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.488920 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.491114 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.491721 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.493281 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-85qgp" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.511938 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv"] Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.553132 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pslwz\" (UniqueName: \"kubernetes.io/projected/d1a05743-e247-444a-bbda-5f411c391cde-kube-api-access-pslwz\") pod \"metallb-operator-webhook-server-55cd77d8fc-wn4jv\" (UID: \"d1a05743-e247-444a-bbda-5f411c391cde\") " pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.553214 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1a05743-e247-444a-bbda-5f411c391cde-webhook-cert\") pod \"metallb-operator-webhook-server-55cd77d8fc-wn4jv\" (UID: \"d1a05743-e247-444a-bbda-5f411c391cde\") " pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.553280 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1a05743-e247-444a-bbda-5f411c391cde-apiservice-cert\") pod \"metallb-operator-webhook-server-55cd77d8fc-wn4jv\" (UID: \"d1a05743-e247-444a-bbda-5f411c391cde\") " pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.559655 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.654846 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pslwz\" (UniqueName: \"kubernetes.io/projected/d1a05743-e247-444a-bbda-5f411c391cde-kube-api-access-pslwz\") pod \"metallb-operator-webhook-server-55cd77d8fc-wn4jv\" (UID: \"d1a05743-e247-444a-bbda-5f411c391cde\") " pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.655170 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1a05743-e247-444a-bbda-5f411c391cde-webhook-cert\") pod \"metallb-operator-webhook-server-55cd77d8fc-wn4jv\" (UID: \"d1a05743-e247-444a-bbda-5f411c391cde\") " pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.655669 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1a05743-e247-444a-bbda-5f411c391cde-apiservice-cert\") pod \"metallb-operator-webhook-server-55cd77d8fc-wn4jv\" (UID: \"d1a05743-e247-444a-bbda-5f411c391cde\") " pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.658943 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1a05743-e247-444a-bbda-5f411c391cde-apiservice-cert\") pod \"metallb-operator-webhook-server-55cd77d8fc-wn4jv\" (UID: \"d1a05743-e247-444a-bbda-5f411c391cde\") " pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.659785 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1a05743-e247-444a-bbda-5f411c391cde-webhook-cert\") pod \"metallb-operator-webhook-server-55cd77d8fc-wn4jv\" (UID: \"d1a05743-e247-444a-bbda-5f411c391cde\") " pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.675806 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pslwz\" (UniqueName: \"kubernetes.io/projected/d1a05743-e247-444a-bbda-5f411c391cde-kube-api-access-pslwz\") pod \"metallb-operator-webhook-server-55cd77d8fc-wn4jv\" (UID: \"d1a05743-e247-444a-bbda-5f411c391cde\") " pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.821318 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:17 crc kubenswrapper[5033]: I0226 15:56:17.841490 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn"] Feb 26 15:56:17 crc kubenswrapper[5033]: W0226 15:56:17.853075 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41f5e1df_b55f_4861_a814_a111201ae275.slice/crio-db506b69a65e90a2ae82d9aabb637824f3cc8c1458614cf14114ea1f91e3e1a9 WatchSource:0}: Error finding container db506b69a65e90a2ae82d9aabb637824f3cc8c1458614cf14114ea1f91e3e1a9: Status 404 returned error can't find the container with id db506b69a65e90a2ae82d9aabb637824f3cc8c1458614cf14114ea1f91e3e1a9 Feb 26 15:56:18 crc kubenswrapper[5033]: I0226 15:56:18.021201 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv"] Feb 26 15:56:18 crc kubenswrapper[5033]: W0226 15:56:18.028907 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a05743_e247_444a_bbda_5f411c391cde.slice/crio-222afd5ee504bbf35882bacb373ddec2b05b5f04584b8679b3937e64448c8897 WatchSource:0}: Error finding container 222afd5ee504bbf35882bacb373ddec2b05b5f04584b8679b3937e64448c8897: Status 404 returned error can't find the container with id 222afd5ee504bbf35882bacb373ddec2b05b5f04584b8679b3937e64448c8897 Feb 26 15:56:18 crc kubenswrapper[5033]: I0226 15:56:18.450299 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" event={"ID":"d1a05743-e247-444a-bbda-5f411c391cde","Type":"ContainerStarted","Data":"222afd5ee504bbf35882bacb373ddec2b05b5f04584b8679b3937e64448c8897"} Feb 26 15:56:18 crc kubenswrapper[5033]: I0226 15:56:18.453063 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" event={"ID":"41f5e1df-b55f-4861-a814-a111201ae275","Type":"ContainerStarted","Data":"db506b69a65e90a2ae82d9aabb637824f3cc8c1458614cf14114ea1f91e3e1a9"} Feb 26 15:56:21 crc kubenswrapper[5033]: I0226 15:56:21.475345 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" event={"ID":"41f5e1df-b55f-4861-a814-a111201ae275","Type":"ContainerStarted","Data":"5dbab56a7229f25f75965d7360a5d01bd2c35b6d873f0eb811efe64247805bff"} Feb 26 15:56:21 crc kubenswrapper[5033]: I0226 15:56:21.475498 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:21 crc kubenswrapper[5033]: I0226 15:56:21.504743 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" podStartSLOduration=1.548040171 podStartE2EDuration="4.504716017s" podCreationTimestamp="2026-02-26 15:56:17 +0000 UTC" firstStartedPulling="2026-02-26 15:56:17.856446152 +0000 UTC m=+905.069044802" lastFinishedPulling="2026-02-26 15:56:20.813121998 +0000 UTC m=+908.025720648" observedRunningTime="2026-02-26 15:56:21.496629626 +0000 UTC m=+908.709228286" watchObservedRunningTime="2026-02-26 15:56:21.504716017 +0000 UTC m=+908.717314687" Feb 26 15:56:23 crc kubenswrapper[5033]: I0226 15:56:23.494807 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" event={"ID":"d1a05743-e247-444a-bbda-5f411c391cde","Type":"ContainerStarted","Data":"d4e90d228ca84d6f93237a64d42181c3cde099bdf279074eb22a1bbfed0f6083"} Feb 26 15:56:23 crc kubenswrapper[5033]: I0226 15:56:23.495183 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:23 crc 
kubenswrapper[5033]: I0226 15:56:23.514113 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" podStartSLOduration=1.849269124 podStartE2EDuration="6.514092262s" podCreationTimestamp="2026-02-26 15:56:17 +0000 UTC" firstStartedPulling="2026-02-26 15:56:18.032242908 +0000 UTC m=+905.244841558" lastFinishedPulling="2026-02-26 15:56:22.697066046 +0000 UTC m=+909.909664696" observedRunningTime="2026-02-26 15:56:23.509972369 +0000 UTC m=+910.722571039" watchObservedRunningTime="2026-02-26 15:56:23.514092262 +0000 UTC m=+910.726690912" Feb 26 15:56:37 crc kubenswrapper[5033]: I0226 15:56:37.825967 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-55cd77d8fc-wn4jv" Feb 26 15:56:50 crc kubenswrapper[5033]: I0226 15:56:50.188659 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:56:50 crc kubenswrapper[5033]: I0226 15:56:50.189099 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:56:57 crc kubenswrapper[5033]: I0226 15:56:57.561524 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7c4f4fbd6-mcbpn" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.173798 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f"] Feb 26 15:56:58 crc 
kubenswrapper[5033]: I0226 15:56:58.174691 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.177850 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.178110 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4zc6l" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.188487 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-bz2qd"] Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.190964 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bz2qd" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.192301 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.192915 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.200238 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f"] Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.272233 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5tc8n"] Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.273066 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-5tc8n" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.277849 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.277854 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.277970 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-mnp25" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.277979 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.283394 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zvbp\" (UniqueName: \"kubernetes.io/projected/0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055-kube-api-access-7zvbp\") pod \"frr-k8s-webhook-server-7f989f654f-ps55f\" (UID: \"0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.283466 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055-cert\") pod \"frr-k8s-webhook-server-7f989f654f-ps55f\" (UID: \"0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.293796 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-l876q"] Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.294592 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-l876q" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.295935 5033 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.311717 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-l876q"] Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.384661 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-metrics\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.384724 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/94d772d9-7831-40d7-9298-8fe371818c29-frr-startup\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.384999 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjfjv\" (UniqueName: \"kubernetes.io/projected/94d772d9-7831-40d7-9298-8fe371818c29-kube-api-access-cjfjv\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.385038 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b9tp\" (UniqueName: \"kubernetes.io/projected/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-kube-api-access-9b9tp\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n" Feb 26 15:56:58 crc kubenswrapper[5033]: 
I0226 15:56:58.385064 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-frr-conf\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.385096 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zvbp\" (UniqueName: \"kubernetes.io/projected/0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055-kube-api-access-7zvbp\") pod \"frr-k8s-webhook-server-7f989f654f-ps55f\" (UID: \"0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.385126 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94d772d9-7831-40d7-9298-8fe371818c29-metrics-certs\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.385156 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055-cert\") pod \"frr-k8s-webhook-server-7f989f654f-ps55f\" (UID: \"0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.385186 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-reloader\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd" Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.385211 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-memberlist\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.385287 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-metallb-excludel2\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.385313 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-frr-sockets\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.385335 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-metrics-certs\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: E0226 15:56:58.385708 5033 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Feb 26 15:56:58 crc kubenswrapper[5033]: E0226 15:56:58.385764 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055-cert podName:0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055 nodeName:}" failed. No retries permitted until 2026-02-26 15:56:58.885747138 +0000 UTC m=+946.098345788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055-cert") pod "frr-k8s-webhook-server-7f989f654f-ps55f" (UID: "0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055") : secret "frr-k8s-webhook-server-cert" not found
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.408199 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zvbp\" (UniqueName: \"kubernetes.io/projected/0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055-kube-api-access-7zvbp\") pod \"frr-k8s-webhook-server-7f989f654f-ps55f\" (UID: \"0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.486547 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-reloader\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.486912 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-memberlist\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.486945 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ace028d-9434-478e-b6f9-2302d3506e3d-cert\") pod \"controller-86ddb6bd46-l876q\" (UID: \"0ace028d-9434-478e-b6f9-2302d3506e3d\") " pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.486976 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-metallb-excludel2\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487005 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-frr-sockets\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487026 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-metrics-certs\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487070 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-metrics\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487122 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ace028d-9434-478e-b6f9-2302d3506e3d-metrics-certs\") pod \"controller-86ddb6bd46-l876q\" (UID: \"0ace028d-9434-478e-b6f9-2302d3506e3d\") " pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487151 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/94d772d9-7831-40d7-9298-8fe371818c29-frr-startup\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487175 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjfjv\" (UniqueName: \"kubernetes.io/projected/94d772d9-7831-40d7-9298-8fe371818c29-kube-api-access-cjfjv\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487200 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2tnf\" (UniqueName: \"kubernetes.io/projected/0ace028d-9434-478e-b6f9-2302d3506e3d-kube-api-access-h2tnf\") pod \"controller-86ddb6bd46-l876q\" (UID: \"0ace028d-9434-478e-b6f9-2302d3506e3d\") " pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487226 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b9tp\" (UniqueName: \"kubernetes.io/projected/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-kube-api-access-9b9tp\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487235 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-reloader\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487249 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-frr-conf\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487365 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94d772d9-7831-40d7-9298-8fe371818c29-metrics-certs\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: E0226 15:56:58.487480 5033 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Feb 26 15:56:58 crc kubenswrapper[5033]: E0226 15:56:58.487534 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94d772d9-7831-40d7-9298-8fe371818c29-metrics-certs podName:94d772d9-7831-40d7-9298-8fe371818c29 nodeName:}" failed. No retries permitted until 2026-02-26 15:56:58.987514276 +0000 UTC m=+946.200112936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94d772d9-7831-40d7-9298-8fe371818c29-metrics-certs") pod "frr-k8s-bz2qd" (UID: "94d772d9-7831-40d7-9298-8fe371818c29") : secret "frr-k8s-certs-secret" not found
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.487567 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-frr-conf\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: E0226 15:56:58.487662 5033 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Feb 26 15:56:58 crc kubenswrapper[5033]: E0226 15:56:58.487706 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-memberlist podName:8dbfe85d-1a34-476d-8f00-c02b6b1aeea7 nodeName:}" failed. No retries permitted until 2026-02-26 15:56:58.98769216 +0000 UTC m=+946.200290810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-memberlist") pod "speaker-5tc8n" (UID: "8dbfe85d-1a34-476d-8f00-c02b6b1aeea7") : secret "metallb-memberlist" not found
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.488008 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-frr-sockets\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.488272 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-metallb-excludel2\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.488772 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/94d772d9-7831-40d7-9298-8fe371818c29-metrics\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.489055 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/94d772d9-7831-40d7-9298-8fe371818c29-frr-startup\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.491549 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-metrics-certs\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.506076 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b9tp\" (UniqueName: \"kubernetes.io/projected/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-kube-api-access-9b9tp\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.511673 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjfjv\" (UniqueName: \"kubernetes.io/projected/94d772d9-7831-40d7-9298-8fe371818c29-kube-api-access-cjfjv\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.589172 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ace028d-9434-478e-b6f9-2302d3506e3d-cert\") pod \"controller-86ddb6bd46-l876q\" (UID: \"0ace028d-9434-478e-b6f9-2302d3506e3d\") " pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.589264 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ace028d-9434-478e-b6f9-2302d3506e3d-metrics-certs\") pod \"controller-86ddb6bd46-l876q\" (UID: \"0ace028d-9434-478e-b6f9-2302d3506e3d\") " pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.589290 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2tnf\" (UniqueName: \"kubernetes.io/projected/0ace028d-9434-478e-b6f9-2302d3506e3d-kube-api-access-h2tnf\") pod \"controller-86ddb6bd46-l876q\" (UID: \"0ace028d-9434-478e-b6f9-2302d3506e3d\") " pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.594191 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ace028d-9434-478e-b6f9-2302d3506e3d-metrics-certs\") pod \"controller-86ddb6bd46-l876q\" (UID: \"0ace028d-9434-478e-b6f9-2302d3506e3d\") " pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.594654 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ace028d-9434-478e-b6f9-2302d3506e3d-cert\") pod \"controller-86ddb6bd46-l876q\" (UID: \"0ace028d-9434-478e-b6f9-2302d3506e3d\") " pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.606284 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tnf\" (UniqueName: \"kubernetes.io/projected/0ace028d-9434-478e-b6f9-2302d3506e3d-kube-api-access-h2tnf\") pod \"controller-86ddb6bd46-l876q\" (UID: \"0ace028d-9434-478e-b6f9-2302d3506e3d\") " pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.612900 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.798064 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-l876q"]
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.892207 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055-cert\") pod \"frr-k8s-webhook-server-7f989f654f-ps55f\" (UID: \"0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.897953 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055-cert\") pod \"frr-k8s-webhook-server-7f989f654f-ps55f\" (UID: \"0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.993646 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94d772d9-7831-40d7-9298-8fe371818c29-metrics-certs\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.993744 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-memberlist\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.997093 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8dbfe85d-1a34-476d-8f00-c02b6b1aeea7-memberlist\") pod \"speaker-5tc8n\" (UID: \"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7\") " pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:58 crc kubenswrapper[5033]: I0226 15:56:58.997191 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94d772d9-7831-40d7-9298-8fe371818c29-metrics-certs\") pod \"frr-k8s-bz2qd\" (UID: \"94d772d9-7831-40d7-9298-8fe371818c29\") " pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.091050 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f"
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.106701 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.215148 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5tc8n"
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.564676 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f"]
Feb 26 15:56:59 crc kubenswrapper[5033]: W0226 15:56:59.583711 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d49f5f7_f9f1_48cb_a9e2_7e5ecdaf8055.slice/crio-da5d0d112c36de7be8efe8e387fcd7624d10c7e935e3eb7cbf53772045fec80f WatchSource:0}: Error finding container da5d0d112c36de7be8efe8e387fcd7624d10c7e935e3eb7cbf53772045fec80f: Status 404 returned error can't find the container with id da5d0d112c36de7be8efe8e387fcd7624d10c7e935e3eb7cbf53772045fec80f
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.710445 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5tc8n" event={"ID":"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7","Type":"ContainerStarted","Data":"ad4831ac3f4f2dd0e5fd03adf50e54f9448fb5d5465c4a92ed3f4369c2239f93"}
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.710765 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5tc8n" event={"ID":"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7","Type":"ContainerStarted","Data":"f6f0b3ce264c27d7ed5ff68df833ee8cab4712559a59cc62b1c7750ef125a1f5"}
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.711651 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bz2qd" event={"ID":"94d772d9-7831-40d7-9298-8fe371818c29","Type":"ContainerStarted","Data":"8076cc60c7ea2d750ae589cffba2c154caacd6756ed12e5353e0f606565edcb5"}
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.713104 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f" event={"ID":"0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055","Type":"ContainerStarted","Data":"da5d0d112c36de7be8efe8e387fcd7624d10c7e935e3eb7cbf53772045fec80f"}
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.714867 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-l876q" event={"ID":"0ace028d-9434-478e-b6f9-2302d3506e3d","Type":"ContainerStarted","Data":"017fb63ddbc93b787972a79af37e70ef6695bc3ae0005c5286a01e68278cf99b"}
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.714895 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-l876q" event={"ID":"0ace028d-9434-478e-b6f9-2302d3506e3d","Type":"ContainerStarted","Data":"d2b27682ad1fca0d9de3d4e89e5b6651f148cc0018ff2b9733a909f03e2f57d6"}
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.714904 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-l876q" event={"ID":"0ace028d-9434-478e-b6f9-2302d3506e3d","Type":"ContainerStarted","Data":"951b3e4c73afbf71843ed6a0728b632cc756b00fe81a32d22038b8cc8e8688ce"}
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.715022 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:56:59 crc kubenswrapper[5033]: I0226 15:56:59.736028 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-l876q" podStartSLOduration=1.73600843 podStartE2EDuration="1.73600843s" podCreationTimestamp="2026-02-26 15:56:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:56:59.731001645 +0000 UTC m=+946.943600295" watchObservedRunningTime="2026-02-26 15:56:59.73600843 +0000 UTC m=+946.948607080"
Feb 26 15:57:00 crc kubenswrapper[5033]: I0226 15:57:00.745329 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5tc8n" event={"ID":"8dbfe85d-1a34-476d-8f00-c02b6b1aeea7","Type":"ContainerStarted","Data":"35703283aa0cbadd68bc7d865966eb17005ee6ff8d617de9301b5f368bed376c"}
Feb 26 15:57:00 crc kubenswrapper[5033]: I0226 15:57:00.745380 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5tc8n"
Feb 26 15:57:03 crc kubenswrapper[5033]: I0226 15:57:03.651036 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5tc8n" podStartSLOduration=5.651014152 podStartE2EDuration="5.651014152s" podCreationTimestamp="2026-02-26 15:56:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:57:00.779080311 +0000 UTC m=+947.991678971" watchObservedRunningTime="2026-02-26 15:57:03.651014152 +0000 UTC m=+950.863612812"
Feb 26 15:57:06 crc kubenswrapper[5033]: I0226 15:57:06.780702 5033 generic.go:334] "Generic (PLEG): container finished" podID="94d772d9-7831-40d7-9298-8fe371818c29" containerID="118b448e5c528fd6ba108035fcae7c8a2e0b8040cbe1674378f52c4a197826d6" exitCode=0
Feb 26 15:57:06 crc kubenswrapper[5033]: I0226 15:57:06.780781 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bz2qd" event={"ID":"94d772d9-7831-40d7-9298-8fe371818c29","Type":"ContainerDied","Data":"118b448e5c528fd6ba108035fcae7c8a2e0b8040cbe1674378f52c4a197826d6"}
Feb 26 15:57:06 crc kubenswrapper[5033]: I0226 15:57:06.783421 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f" event={"ID":"0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055","Type":"ContainerStarted","Data":"12aa7613a536eaab9122748b3078748ada5f3b21f0e9e762951f760ce06c9eee"}
Feb 26 15:57:06 crc kubenswrapper[5033]: I0226 15:57:06.783537 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f"
Feb 26 15:57:06 crc kubenswrapper[5033]: I0226 15:57:06.840850 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f" podStartSLOduration=2.138719984 podStartE2EDuration="8.840794568s" podCreationTimestamp="2026-02-26 15:56:58 +0000 UTC" firstStartedPulling="2026-02-26 15:56:59.587221325 +0000 UTC m=+946.799819975" lastFinishedPulling="2026-02-26 15:57:06.289295909 +0000 UTC m=+953.501894559" observedRunningTime="2026-02-26 15:57:06.824229137 +0000 UTC m=+954.036827787" watchObservedRunningTime="2026-02-26 15:57:06.840794568 +0000 UTC m=+954.053393228"
Feb 26 15:57:07 crc kubenswrapper[5033]: I0226 15:57:07.792558 5033 generic.go:334] "Generic (PLEG): container finished" podID="94d772d9-7831-40d7-9298-8fe371818c29" containerID="746c774b152aaa06cdc90d68baba12bea0f07eb30f275e55fddd95f4f06f75c7" exitCode=0
Feb 26 15:57:07 crc kubenswrapper[5033]: I0226 15:57:07.793641 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bz2qd" event={"ID":"94d772d9-7831-40d7-9298-8fe371818c29","Type":"ContainerDied","Data":"746c774b152aaa06cdc90d68baba12bea0f07eb30f275e55fddd95f4f06f75c7"}
Feb 26 15:57:08 crc kubenswrapper[5033]: I0226 15:57:08.617458 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-l876q"
Feb 26 15:57:08 crc kubenswrapper[5033]: I0226 15:57:08.799926 5033 generic.go:334] "Generic (PLEG): container finished" podID="94d772d9-7831-40d7-9298-8fe371818c29" containerID="bbbcc2116ea46e15526d34a9219756b64945990796ce3d263d2b3d9fc1eb9a83" exitCode=0
Feb 26 15:57:08 crc kubenswrapper[5033]: I0226 15:57:08.799974 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bz2qd" event={"ID":"94d772d9-7831-40d7-9298-8fe371818c29","Type":"ContainerDied","Data":"bbbcc2116ea46e15526d34a9219756b64945990796ce3d263d2b3d9fc1eb9a83"}
Feb 26 15:57:09 crc kubenswrapper[5033]: I0226 15:57:09.219286 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5tc8n"
Feb 26 15:57:09 crc kubenswrapper[5033]: I0226 15:57:09.809726 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bz2qd" event={"ID":"94d772d9-7831-40d7-9298-8fe371818c29","Type":"ContainerStarted","Data":"b68e9d4a09dc1c22bfc1c9397b6f2f8fc27c262652634ae88fcadba811f7ffe1"}
Feb 26 15:57:09 crc kubenswrapper[5033]: I0226 15:57:09.809766 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bz2qd" event={"ID":"94d772d9-7831-40d7-9298-8fe371818c29","Type":"ContainerStarted","Data":"91c475bd0d4cad781dbd9e45a14e7007dff30d1361699a56cbd02e815ff6d149"}
Feb 26 15:57:09 crc kubenswrapper[5033]: I0226 15:57:09.809775 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bz2qd" event={"ID":"94d772d9-7831-40d7-9298-8fe371818c29","Type":"ContainerStarted","Data":"e54b86f1be84cd13fe3087810f03ae30b7b623d576613b9ed0c95aff2ad6999d"}
Feb 26 15:57:09 crc kubenswrapper[5033]: I0226 15:57:09.809784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bz2qd" event={"ID":"94d772d9-7831-40d7-9298-8fe371818c29","Type":"ContainerStarted","Data":"e1f2be070f44b9f1cc61eb78b9ab949e4932ac044321557f54d28cb1ad0ea1fa"}
Feb 26 15:57:09 crc kubenswrapper[5033]: I0226 15:57:09.809792 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bz2qd" event={"ID":"94d772d9-7831-40d7-9298-8fe371818c29","Type":"ContainerStarted","Data":"f08779561f7f9d6b78466e3638cd660083f80a51c2101318c0a357cfcf461423"}
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.408275 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"]
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.409369 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.410719 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.433116 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"]
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.583181 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxw48\" (UniqueName: \"kubernetes.io/projected/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-kube-api-access-pxw48\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.583220 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.583304 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.684759 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxw48\" (UniqueName: \"kubernetes.io/projected/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-kube-api-access-pxw48\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.684844 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.684920 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.685485 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.685580 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.710973 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxw48\" (UniqueName: \"kubernetes.io/projected/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-kube-api-access-pxw48\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.723392 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.891722 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bz2qd" event={"ID":"94d772d9-7831-40d7-9298-8fe371818c29","Type":"ContainerStarted","Data":"4a39a040c0eae12ffaa88103181c31c7f4faa95bedd28112000a8a63b557ee23"}
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.893521 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:57:10 crc kubenswrapper[5033]: I0226 15:57:10.922202 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-bz2qd" podStartSLOduration=5.984058404 podStartE2EDuration="12.922160492s" podCreationTimestamp="2026-02-26 15:56:58 +0000 UTC" firstStartedPulling="2026-02-26 15:56:59.375122365 +0000 UTC m=+946.587721015" lastFinishedPulling="2026-02-26 15:57:06.313224453 +0000 UTC m=+953.525823103" observedRunningTime="2026-02-26 15:57:10.920504671 +0000 UTC m=+958.133103321" watchObservedRunningTime="2026-02-26 15:57:10.922160492 +0000 UTC m=+958.134759142"
Feb 26 15:57:11 crc kubenswrapper[5033]: I0226 15:57:11.216736 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"]
Feb 26 15:57:11 crc kubenswrapper[5033]: W0226 15:57:11.221208 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19f363c4_0c5d_4e7a_9532_41e9ae4c3fa3.slice/crio-9b3a1816840e9fbfddfaa0cc5bd2bc468557916da7d66fc063606224c09d321f WatchSource:0}: Error finding container 9b3a1816840e9fbfddfaa0cc5bd2bc468557916da7d66fc063606224c09d321f: Status 404 returned error can't find the container with id 9b3a1816840e9fbfddfaa0cc5bd2bc468557916da7d66fc063606224c09d321f
Feb 26 15:57:11 crc kubenswrapper[5033]: I0226 15:57:11.900026 5033 generic.go:334] "Generic (PLEG): container finished" podID="19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" containerID="d77f8dc1eb8f67da709f8c319025bb8abb9976ef0184b5ed2bbf6253750cc705" exitCode=0
Feb 26 15:57:11 crc kubenswrapper[5033]: I0226 15:57:11.900408 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp" event={"ID":"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3","Type":"ContainerDied","Data":"d77f8dc1eb8f67da709f8c319025bb8abb9976ef0184b5ed2bbf6253750cc705"}
Feb 26 15:57:11 crc kubenswrapper[5033]: I0226 15:57:11.900440 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp" event={"ID":"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3","Type":"ContainerStarted","Data":"9b3a1816840e9fbfddfaa0cc5bd2bc468557916da7d66fc063606224c09d321f"}
Feb 26 15:57:14 crc kubenswrapper[5033]: I0226 15:57:14.108092 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:57:14 crc kubenswrapper[5033]: I0226 15:57:14.145962 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:57:17 crc kubenswrapper[5033]: I0226 15:57:17.933958 5033 generic.go:334] "Generic (PLEG): container finished" podID="19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" containerID="5b8019e600dc845af8520c03ebff48935d9126bd72f16e77dfbdc0b67cca1c2c" exitCode=0
Feb 26 15:57:17 crc kubenswrapper[5033]: I0226 15:57:17.934061 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp" event={"ID":"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3","Type":"ContainerDied","Data":"5b8019e600dc845af8520c03ebff48935d9126bd72f16e77dfbdc0b67cca1c2c"}
Feb 26 15:57:18 crc kubenswrapper[5033]: I0226 15:57:18.947354 5033 generic.go:334] "Generic (PLEG): container finished" podID="19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" containerID="ff1aa02f7ba50c567c9fb03a682d991d0df4ac389351dfd62a62a1bca77e2ca0" exitCode=0
Feb 26 15:57:18 crc kubenswrapper[5033]: I0226 15:57:18.947426 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp" event={"ID":"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3","Type":"ContainerDied","Data":"ff1aa02f7ba50c567c9fb03a682d991d0df4ac389351dfd62a62a1bca77e2ca0"}
Feb 26 15:57:19 crc kubenswrapper[5033]: I0226 15:57:19.095374 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-ps55f"
Feb 26 15:57:19 crc kubenswrapper[5033]: I0226 15:57:19.111937 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-bz2qd"
Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.189639 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.190165 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.210457 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp"
Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.255470 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-util\") pod \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") "
Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.255523 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-bundle\") pod \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") "
Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.255572 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxw48\" (UniqueName: \"kubernetes.io/projected/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-kube-api-access-pxw48\") pod \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\" (UID: \"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3\") "
Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.257672 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-bundle" (OuterVolumeSpecName: "bundle") pod "19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" (UID: "19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.266855 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-kube-api-access-pxw48" (OuterVolumeSpecName: "kube-api-access-pxw48") pod "19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" (UID: "19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3"). InnerVolumeSpecName "kube-api-access-pxw48". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.266925 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-util" (OuterVolumeSpecName: "util") pod "19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" (UID: "19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.358433 5033 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-util\") on node \"crc\" DevicePath \"\"" Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.358481 5033 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.358491 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxw48\" (UniqueName: \"kubernetes.io/projected/19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3-kube-api-access-pxw48\") on node \"crc\" DevicePath \"\"" Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.966054 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp" event={"ID":"19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3","Type":"ContainerDied","Data":"9b3a1816840e9fbfddfaa0cc5bd2bc468557916da7d66fc063606224c09d321f"} Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.966109 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b3a1816840e9fbfddfaa0cc5bd2bc468557916da7d66fc063606224c09d321f" Feb 26 15:57:20 crc kubenswrapper[5033]: I0226 15:57:20.966151 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.387464 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5"] Feb 26 15:57:28 crc kubenswrapper[5033]: E0226 15:57:28.388122 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" containerName="extract" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.388142 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" containerName="extract" Feb 26 15:57:28 crc kubenswrapper[5033]: E0226 15:57:28.388155 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" containerName="util" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.388249 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" containerName="util" Feb 26 15:57:28 crc kubenswrapper[5033]: E0226 15:57:28.388288 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" containerName="pull" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.388296 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" containerName="pull" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.388447 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3" containerName="extract" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.388934 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.391275 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.391339 5033 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-4l6z4" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.391529 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.412644 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5"] Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.474487 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c3ad6319-0119-4a87-99f6-63c80d1efd69-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k94j5\" (UID: \"c3ad6319-0119-4a87-99f6-63c80d1efd69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.474643 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhr6g\" (UniqueName: \"kubernetes.io/projected/c3ad6319-0119-4a87-99f6-63c80d1efd69-kube-api-access-hhr6g\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k94j5\" (UID: \"c3ad6319-0119-4a87-99f6-63c80d1efd69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.575755 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/c3ad6319-0119-4a87-99f6-63c80d1efd69-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k94j5\" (UID: \"c3ad6319-0119-4a87-99f6-63c80d1efd69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.575872 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhr6g\" (UniqueName: \"kubernetes.io/projected/c3ad6319-0119-4a87-99f6-63c80d1efd69-kube-api-access-hhr6g\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k94j5\" (UID: \"c3ad6319-0119-4a87-99f6-63c80d1efd69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.576375 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c3ad6319-0119-4a87-99f6-63c80d1efd69-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k94j5\" (UID: \"c3ad6319-0119-4a87-99f6-63c80d1efd69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.598099 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhr6g\" (UniqueName: \"kubernetes.io/projected/c3ad6319-0119-4a87-99f6-63c80d1efd69-kube-api-access-hhr6g\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k94j5\" (UID: \"c3ad6319-0119-4a87-99f6-63c80d1efd69\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5" Feb 26 15:57:28 crc kubenswrapper[5033]: I0226 15:57:28.705532 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5" Feb 26 15:57:29 crc kubenswrapper[5033]: I0226 15:57:29.115603 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5"] Feb 26 15:57:29 crc kubenswrapper[5033]: W0226 15:57:29.123129 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3ad6319_0119_4a87_99f6_63c80d1efd69.slice/crio-5f1a428239e67e76a109e62bd8b920417e0a86ac67b5fe6294023494b0f0e459 WatchSource:0}: Error finding container 5f1a428239e67e76a109e62bd8b920417e0a86ac67b5fe6294023494b0f0e459: Status 404 returned error can't find the container with id 5f1a428239e67e76a109e62bd8b920417e0a86ac67b5fe6294023494b0f0e459 Feb 26 15:57:30 crc kubenswrapper[5033]: I0226 15:57:30.033017 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5" event={"ID":"c3ad6319-0119-4a87-99f6-63c80d1efd69","Type":"ContainerStarted","Data":"5f1a428239e67e76a109e62bd8b920417e0a86ac67b5fe6294023494b0f0e459"} Feb 26 15:57:32 crc kubenswrapper[5033]: I0226 15:57:32.047322 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5" event={"ID":"c3ad6319-0119-4a87-99f6-63c80d1efd69","Type":"ContainerStarted","Data":"87615b86f0ccc1137bb6f388140d1ecdd5ee601be2eda101bc4957c07fb56fc8"} Feb 26 15:57:32 crc kubenswrapper[5033]: I0226 15:57:32.078071 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k94j5" podStartSLOduration=1.4393396520000001 podStartE2EDuration="4.0780526s" podCreationTimestamp="2026-02-26 15:57:28 +0000 UTC" firstStartedPulling="2026-02-26 15:57:29.12562302 +0000 UTC m=+976.338221670" 
lastFinishedPulling="2026-02-26 15:57:31.764335968 +0000 UTC m=+978.976934618" observedRunningTime="2026-02-26 15:57:32.073729683 +0000 UTC m=+979.286328333" watchObservedRunningTime="2026-02-26 15:57:32.0780526 +0000 UTC m=+979.290651250" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.150638 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n5ljg"] Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.153248 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-n5ljg" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.155207 5033 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-smpnn" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.155718 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.156547 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.157596 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbzsv\" (UniqueName: \"kubernetes.io/projected/8f3ed32f-586d-46b9-a6e3-5c61abde9a03-kube-api-access-sbzsv\") pod \"cert-manager-cainjector-5545bd876-n5ljg\" (UID: \"8f3ed32f-586d-46b9-a6e3-5c61abde9a03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n5ljg" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.157750 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f3ed32f-586d-46b9-a6e3-5c61abde9a03-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n5ljg\" (UID: \"8f3ed32f-586d-46b9-a6e3-5c61abde9a03\") " 
pod="cert-manager/cert-manager-cainjector-5545bd876-n5ljg" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.162109 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n5ljg"] Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.259412 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbzsv\" (UniqueName: \"kubernetes.io/projected/8f3ed32f-586d-46b9-a6e3-5c61abde9a03-kube-api-access-sbzsv\") pod \"cert-manager-cainjector-5545bd876-n5ljg\" (UID: \"8f3ed32f-586d-46b9-a6e3-5c61abde9a03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n5ljg" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.259496 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f3ed32f-586d-46b9-a6e3-5c61abde9a03-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n5ljg\" (UID: \"8f3ed32f-586d-46b9-a6e3-5c61abde9a03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n5ljg" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.281847 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f3ed32f-586d-46b9-a6e3-5c61abde9a03-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n5ljg\" (UID: \"8f3ed32f-586d-46b9-a6e3-5c61abde9a03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n5ljg" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.295163 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbzsv\" (UniqueName: \"kubernetes.io/projected/8f3ed32f-586d-46b9-a6e3-5c61abde9a03-kube-api-access-sbzsv\") pod \"cert-manager-cainjector-5545bd876-n5ljg\" (UID: \"8f3ed32f-586d-46b9-a6e3-5c61abde9a03\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n5ljg" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.480597 5033 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-n5ljg" Feb 26 15:57:40 crc kubenswrapper[5033]: I0226 15:57:40.883620 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n5ljg"] Feb 26 15:57:41 crc kubenswrapper[5033]: I0226 15:57:41.256780 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-n5ljg" event={"ID":"8f3ed32f-586d-46b9-a6e3-5c61abde9a03","Type":"ContainerStarted","Data":"cd94e300c792890198bc1937777ec9f9eedee1aaafbfedaf11d430015f0aa192"} Feb 26 15:57:41 crc kubenswrapper[5033]: I0226 15:57:41.775392 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-5vkhb"] Feb 26 15:57:41 crc kubenswrapper[5033]: I0226 15:57:41.776231 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" Feb 26 15:57:41 crc kubenswrapper[5033]: I0226 15:57:41.780002 5033 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-44wth" Feb 26 15:57:41 crc kubenswrapper[5033]: I0226 15:57:41.788618 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-5vkhb"] Feb 26 15:57:41 crc kubenswrapper[5033]: I0226 15:57:41.975872 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24dc0187-e3d9-41af-950e-49c544beeed6-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-5vkhb\" (UID: \"24dc0187-e3d9-41af-950e-49c544beeed6\") " pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" Feb 26 15:57:41 crc kubenswrapper[5033]: I0226 15:57:41.976001 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9blgm\" (UniqueName: 
\"kubernetes.io/projected/24dc0187-e3d9-41af-950e-49c544beeed6-kube-api-access-9blgm\") pod \"cert-manager-webhook-6888856db4-5vkhb\" (UID: \"24dc0187-e3d9-41af-950e-49c544beeed6\") " pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" Feb 26 15:57:42 crc kubenswrapper[5033]: I0226 15:57:42.077297 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9blgm\" (UniqueName: \"kubernetes.io/projected/24dc0187-e3d9-41af-950e-49c544beeed6-kube-api-access-9blgm\") pod \"cert-manager-webhook-6888856db4-5vkhb\" (UID: \"24dc0187-e3d9-41af-950e-49c544beeed6\") " pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" Feb 26 15:57:42 crc kubenswrapper[5033]: I0226 15:57:42.077382 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24dc0187-e3d9-41af-950e-49c544beeed6-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-5vkhb\" (UID: \"24dc0187-e3d9-41af-950e-49c544beeed6\") " pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" Feb 26 15:57:42 crc kubenswrapper[5033]: I0226 15:57:42.097183 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blgm\" (UniqueName: \"kubernetes.io/projected/24dc0187-e3d9-41af-950e-49c544beeed6-kube-api-access-9blgm\") pod \"cert-manager-webhook-6888856db4-5vkhb\" (UID: \"24dc0187-e3d9-41af-950e-49c544beeed6\") " pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" Feb 26 15:57:42 crc kubenswrapper[5033]: I0226 15:57:42.097572 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24dc0187-e3d9-41af-950e-49c544beeed6-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-5vkhb\" (UID: \"24dc0187-e3d9-41af-950e-49c544beeed6\") " pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" Feb 26 15:57:42 crc kubenswrapper[5033]: I0226 15:57:42.104182 5033 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" Feb 26 15:57:42 crc kubenswrapper[5033]: I0226 15:57:42.306510 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-5vkhb"] Feb 26 15:57:42 crc kubenswrapper[5033]: W0226 15:57:42.315185 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24dc0187_e3d9_41af_950e_49c544beeed6.slice/crio-7887013faf9e02a5aee366f2b44dc02a802c1b2f1a0f3aba6a1ed58fcf48cdab WatchSource:0}: Error finding container 7887013faf9e02a5aee366f2b44dc02a802c1b2f1a0f3aba6a1ed58fcf48cdab: Status 404 returned error can't find the container with id 7887013faf9e02a5aee366f2b44dc02a802c1b2f1a0f3aba6a1ed58fcf48cdab Feb 26 15:57:43 crc kubenswrapper[5033]: I0226 15:57:43.275214 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" event={"ID":"24dc0187-e3d9-41af-950e-49c544beeed6","Type":"ContainerStarted","Data":"7887013faf9e02a5aee366f2b44dc02a802c1b2f1a0f3aba6a1ed58fcf48cdab"} Feb 26 15:57:48 crc kubenswrapper[5033]: I0226 15:57:48.316059 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" event={"ID":"24dc0187-e3d9-41af-950e-49c544beeed6","Type":"ContainerStarted","Data":"94a2b2fd674d91dd342e9b5437a47b9c85ad7b8f07f5e314b9fe881afd960304"} Feb 26 15:57:48 crc kubenswrapper[5033]: I0226 15:57:48.316977 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" Feb 26 15:57:48 crc kubenswrapper[5033]: I0226 15:57:48.319720 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-n5ljg" event={"ID":"8f3ed32f-586d-46b9-a6e3-5c61abde9a03","Type":"ContainerStarted","Data":"97c52738a6b6cb295ddb4aea66ef9da12d9b97bcb03b92c08305c81e408f24a1"} Feb 26 15:57:48 crc 
kubenswrapper[5033]: I0226 15:57:48.341538 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" podStartSLOduration=2.573579627 podStartE2EDuration="7.341506157s" podCreationTimestamp="2026-02-26 15:57:41 +0000 UTC" firstStartedPulling="2026-02-26 15:57:42.317914167 +0000 UTC m=+989.530512827" lastFinishedPulling="2026-02-26 15:57:47.085840707 +0000 UTC m=+994.298439357" observedRunningTime="2026-02-26 15:57:48.334873463 +0000 UTC m=+995.547472123" watchObservedRunningTime="2026-02-26 15:57:48.341506157 +0000 UTC m=+995.554104847" Feb 26 15:57:48 crc kubenswrapper[5033]: I0226 15:57:48.390245 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-n5ljg" podStartSLOduration=2.21861282 podStartE2EDuration="8.390224498s" podCreationTimestamp="2026-02-26 15:57:40 +0000 UTC" firstStartedPulling="2026-02-26 15:57:40.896922008 +0000 UTC m=+988.109520668" lastFinishedPulling="2026-02-26 15:57:47.068533696 +0000 UTC m=+994.281132346" observedRunningTime="2026-02-26 15:57:48.385718006 +0000 UTC m=+995.598316676" watchObservedRunningTime="2026-02-26 15:57:48.390224498 +0000 UTC m=+995.602823158" Feb 26 15:57:50 crc kubenswrapper[5033]: I0226 15:57:50.188541 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:57:50 crc kubenswrapper[5033]: I0226 15:57:50.188618 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 
26 15:57:50 crc kubenswrapper[5033]: I0226 15:57:50.188669 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 15:57:50 crc kubenswrapper[5033]: I0226 15:57:50.189525 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0364f13db4a9c5dfb04881c280d818ef2c52b6cfbe5f9db1cd51cf7111f504ed"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 15:57:50 crc kubenswrapper[5033]: I0226 15:57:50.189620 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://0364f13db4a9c5dfb04881c280d818ef2c52b6cfbe5f9db1cd51cf7111f504ed" gracePeriod=600 Feb 26 15:57:50 crc kubenswrapper[5033]: I0226 15:57:50.335818 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"0364f13db4a9c5dfb04881c280d818ef2c52b6cfbe5f9db1cd51cf7111f504ed"} Feb 26 15:57:50 crc kubenswrapper[5033]: I0226 15:57:50.335900 5033 scope.go:117] "RemoveContainer" containerID="8ea382d5339c87a85284b67b2964d5c78dfbd09c05b9d81cf3fc9df0642c2fcc" Feb 26 15:57:50 crc kubenswrapper[5033]: I0226 15:57:50.336946 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="0364f13db4a9c5dfb04881c280d818ef2c52b6cfbe5f9db1cd51cf7111f504ed" exitCode=0 Feb 26 15:57:52 crc kubenswrapper[5033]: I0226 15:57:52.108288 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-5vkhb" Feb 
26 15:57:52 crc kubenswrapper[5033]: I0226 15:57:52.350959 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"1db84da97ee76a847298329e56a91cd397d799819a3948a986ed6808b2508c55"} Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.128237 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-glxpk"] Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.129724 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-glxpk" Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.132945 5033 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6zk9g" Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.145803 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-glxpk"] Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.177774 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a36f0b4-122a-453f-aa82-c5b8ca6862fa-bound-sa-token\") pod \"cert-manager-545d4d4674-glxpk\" (UID: \"3a36f0b4-122a-453f-aa82-c5b8ca6862fa\") " pod="cert-manager/cert-manager-545d4d4674-glxpk" Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.177927 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nr4m\" (UniqueName: \"kubernetes.io/projected/3a36f0b4-122a-453f-aa82-c5b8ca6862fa-kube-api-access-7nr4m\") pod \"cert-manager-545d4d4674-glxpk\" (UID: \"3a36f0b4-122a-453f-aa82-c5b8ca6862fa\") " pod="cert-manager/cert-manager-545d4d4674-glxpk" Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.279757 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-7nr4m\" (UniqueName: \"kubernetes.io/projected/3a36f0b4-122a-453f-aa82-c5b8ca6862fa-kube-api-access-7nr4m\") pod \"cert-manager-545d4d4674-glxpk\" (UID: \"3a36f0b4-122a-453f-aa82-c5b8ca6862fa\") " pod="cert-manager/cert-manager-545d4d4674-glxpk" Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.279911 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a36f0b4-122a-453f-aa82-c5b8ca6862fa-bound-sa-token\") pod \"cert-manager-545d4d4674-glxpk\" (UID: \"3a36f0b4-122a-453f-aa82-c5b8ca6862fa\") " pod="cert-manager/cert-manager-545d4d4674-glxpk" Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.309547 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a36f0b4-122a-453f-aa82-c5b8ca6862fa-bound-sa-token\") pod \"cert-manager-545d4d4674-glxpk\" (UID: \"3a36f0b4-122a-453f-aa82-c5b8ca6862fa\") " pod="cert-manager/cert-manager-545d4d4674-glxpk" Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.311395 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nr4m\" (UniqueName: \"kubernetes.io/projected/3a36f0b4-122a-453f-aa82-c5b8ca6862fa-kube-api-access-7nr4m\") pod \"cert-manager-545d4d4674-glxpk\" (UID: \"3a36f0b4-122a-453f-aa82-c5b8ca6862fa\") " pod="cert-manager/cert-manager-545d4d4674-glxpk" Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.460005 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-glxpk" Feb 26 15:57:55 crc kubenswrapper[5033]: I0226 15:57:55.739712 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-glxpk"] Feb 26 15:57:56 crc kubenswrapper[5033]: I0226 15:57:56.379487 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-glxpk" event={"ID":"3a36f0b4-122a-453f-aa82-c5b8ca6862fa","Type":"ContainerStarted","Data":"95c163ce9b9d2bbe7c98456f9de95728198b511aca7ae204d310b84acb7e4c9b"} Feb 26 15:57:56 crc kubenswrapper[5033]: I0226 15:57:56.379536 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-glxpk" event={"ID":"3a36f0b4-122a-453f-aa82-c5b8ca6862fa","Type":"ContainerStarted","Data":"7f3379193a2ea1dc8110706edad1a0152a9f7a192f4c3f99cf0dbf871eb01b9c"} Feb 26 15:57:56 crc kubenswrapper[5033]: I0226 15:57:56.412412 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-glxpk" podStartSLOduration=1.412391175 podStartE2EDuration="1.412391175s" podCreationTimestamp="2026-02-26 15:57:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:57:56.406487069 +0000 UTC m=+1003.619085759" watchObservedRunningTime="2026-02-26 15:57:56.412391175 +0000 UTC m=+1003.624989815" Feb 26 15:58:00 crc kubenswrapper[5033]: I0226 15:58:00.145462 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535358-zt5pt"] Feb 26 15:58:00 crc kubenswrapper[5033]: I0226 15:58:00.147962 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535358-zt5pt" Feb 26 15:58:00 crc kubenswrapper[5033]: I0226 15:58:00.155393 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 15:58:00 crc kubenswrapper[5033]: I0226 15:58:00.155650 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 15:58:00 crc kubenswrapper[5033]: I0226 15:58:00.158843 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535358-zt5pt"] Feb 26 15:58:00 crc kubenswrapper[5033]: I0226 15:58:00.161053 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 15:58:00 crc kubenswrapper[5033]: I0226 15:58:00.270173 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czqj6\" (UniqueName: \"kubernetes.io/projected/d16b9882-8189-44de-9058-d64a70399067-kube-api-access-czqj6\") pod \"auto-csr-approver-29535358-zt5pt\" (UID: \"d16b9882-8189-44de-9058-d64a70399067\") " pod="openshift-infra/auto-csr-approver-29535358-zt5pt" Feb 26 15:58:00 crc kubenswrapper[5033]: I0226 15:58:00.371616 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czqj6\" (UniqueName: \"kubernetes.io/projected/d16b9882-8189-44de-9058-d64a70399067-kube-api-access-czqj6\") pod \"auto-csr-approver-29535358-zt5pt\" (UID: \"d16b9882-8189-44de-9058-d64a70399067\") " pod="openshift-infra/auto-csr-approver-29535358-zt5pt" Feb 26 15:58:00 crc kubenswrapper[5033]: I0226 15:58:00.394377 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czqj6\" (UniqueName: \"kubernetes.io/projected/d16b9882-8189-44de-9058-d64a70399067-kube-api-access-czqj6\") pod \"auto-csr-approver-29535358-zt5pt\" (UID: \"d16b9882-8189-44de-9058-d64a70399067\") " 
pod="openshift-infra/auto-csr-approver-29535358-zt5pt" Feb 26 15:58:00 crc kubenswrapper[5033]: I0226 15:58:00.463983 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535358-zt5pt" Feb 26 15:58:00 crc kubenswrapper[5033]: I0226 15:58:00.696804 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535358-zt5pt"] Feb 26 15:58:01 crc kubenswrapper[5033]: I0226 15:58:01.410378 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535358-zt5pt" event={"ID":"d16b9882-8189-44de-9058-d64a70399067","Type":"ContainerStarted","Data":"709644e0d05a3c04981d1328b647ef2402a283a793eb2bf7895cb72bab24ec30"} Feb 26 15:58:02 crc kubenswrapper[5033]: I0226 15:58:02.419823 5033 generic.go:334] "Generic (PLEG): container finished" podID="d16b9882-8189-44de-9058-d64a70399067" containerID="f9b9bdb8473f14761b65adb072ad4c4b45ed533735f4281f6c4543402f18d5e5" exitCode=0 Feb 26 15:58:02 crc kubenswrapper[5033]: I0226 15:58:02.419900 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535358-zt5pt" event={"ID":"d16b9882-8189-44de-9058-d64a70399067","Type":"ContainerDied","Data":"f9b9bdb8473f14761b65adb072ad4c4b45ed533735f4281f6c4543402f18d5e5"} Feb 26 15:58:03 crc kubenswrapper[5033]: I0226 15:58:03.686539 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535358-zt5pt" Feb 26 15:58:03 crc kubenswrapper[5033]: I0226 15:58:03.817414 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czqj6\" (UniqueName: \"kubernetes.io/projected/d16b9882-8189-44de-9058-d64a70399067-kube-api-access-czqj6\") pod \"d16b9882-8189-44de-9058-d64a70399067\" (UID: \"d16b9882-8189-44de-9058-d64a70399067\") " Feb 26 15:58:03 crc kubenswrapper[5033]: I0226 15:58:03.826651 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16b9882-8189-44de-9058-d64a70399067-kube-api-access-czqj6" (OuterVolumeSpecName: "kube-api-access-czqj6") pod "d16b9882-8189-44de-9058-d64a70399067" (UID: "d16b9882-8189-44de-9058-d64a70399067"). InnerVolumeSpecName "kube-api-access-czqj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:58:03 crc kubenswrapper[5033]: I0226 15:58:03.919130 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czqj6\" (UniqueName: \"kubernetes.io/projected/d16b9882-8189-44de-9058-d64a70399067-kube-api-access-czqj6\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:04 crc kubenswrapper[5033]: I0226 15:58:04.437722 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535358-zt5pt" event={"ID":"d16b9882-8189-44de-9058-d64a70399067","Type":"ContainerDied","Data":"709644e0d05a3c04981d1328b647ef2402a283a793eb2bf7895cb72bab24ec30"} Feb 26 15:58:04 crc kubenswrapper[5033]: I0226 15:58:04.437792 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="709644e0d05a3c04981d1328b647ef2402a283a793eb2bf7895cb72bab24ec30" Feb 26 15:58:04 crc kubenswrapper[5033]: I0226 15:58:04.438389 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535358-zt5pt" Feb 26 15:58:04 crc kubenswrapper[5033]: I0226 15:58:04.737742 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535352-b7nrc"] Feb 26 15:58:04 crc kubenswrapper[5033]: I0226 15:58:04.741653 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535352-b7nrc"] Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.533856 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-znb4b"] Feb 26 15:58:05 crc kubenswrapper[5033]: E0226 15:58:05.534908 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16b9882-8189-44de-9058-d64a70399067" containerName="oc" Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.534963 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16b9882-8189-44de-9058-d64a70399067" containerName="oc" Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.535534 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16b9882-8189-44de-9058-d64a70399067" containerName="oc" Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.536793 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-znb4b" Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.543775 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-czzn5" Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.544103 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.544488 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.565033 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-znb4b"] Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.635397 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bef3ffd-290a-42fb-ae79-e423db030c18" path="/var/lib/kubelet/pods/7bef3ffd-290a-42fb-ae79-e423db030c18/volumes" Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.647647 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnsq6\" (UniqueName: \"kubernetes.io/projected/1d13e70e-9003-4cb5-b424-935127386b56-kube-api-access-lnsq6\") pod \"openstack-operator-index-znb4b\" (UID: \"1d13e70e-9003-4cb5-b424-935127386b56\") " pod="openstack-operators/openstack-operator-index-znb4b" Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.749296 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnsq6\" (UniqueName: \"kubernetes.io/projected/1d13e70e-9003-4cb5-b424-935127386b56-kube-api-access-lnsq6\") pod \"openstack-operator-index-znb4b\" (UID: \"1d13e70e-9003-4cb5-b424-935127386b56\") " pod="openstack-operators/openstack-operator-index-znb4b" Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.774972 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnsq6\" (UniqueName: \"kubernetes.io/projected/1d13e70e-9003-4cb5-b424-935127386b56-kube-api-access-lnsq6\") pod \"openstack-operator-index-znb4b\" (UID: \"1d13e70e-9003-4cb5-b424-935127386b56\") " pod="openstack-operators/openstack-operator-index-znb4b" Feb 26 15:58:05 crc kubenswrapper[5033]: I0226 15:58:05.862110 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-znb4b" Feb 26 15:58:06 crc kubenswrapper[5033]: I0226 15:58:06.290024 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-znb4b"] Feb 26 15:58:06 crc kubenswrapper[5033]: W0226 15:58:06.292637 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d13e70e_9003_4cb5_b424_935127386b56.slice/crio-bf03ae62cc11c05ed434c858629b4b9960057e6f113255aa0d90b343f8489352 WatchSource:0}: Error finding container bf03ae62cc11c05ed434c858629b4b9960057e6f113255aa0d90b343f8489352: Status 404 returned error can't find the container with id bf03ae62cc11c05ed434c858629b4b9960057e6f113255aa0d90b343f8489352 Feb 26 15:58:06 crc kubenswrapper[5033]: I0226 15:58:06.452903 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-znb4b" event={"ID":"1d13e70e-9003-4cb5-b424-935127386b56","Type":"ContainerStarted","Data":"bf03ae62cc11c05ed434c858629b4b9960057e6f113255aa0d90b343f8489352"} Feb 26 15:58:09 crc kubenswrapper[5033]: I0226 15:58:09.479366 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-znb4b" event={"ID":"1d13e70e-9003-4cb5-b424-935127386b56","Type":"ContainerStarted","Data":"a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee"} Feb 26 15:58:09 crc kubenswrapper[5033]: I0226 15:58:09.503445 5033 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-znb4b" podStartSLOduration=2.213294207 podStartE2EDuration="4.503412396s" podCreationTimestamp="2026-02-26 15:58:05 +0000 UTC" firstStartedPulling="2026-02-26 15:58:06.294969555 +0000 UTC m=+1013.507568245" lastFinishedPulling="2026-02-26 15:58:08.585087774 +0000 UTC m=+1015.797686434" observedRunningTime="2026-02-26 15:58:09.499524769 +0000 UTC m=+1016.712123439" watchObservedRunningTime="2026-02-26 15:58:09.503412396 +0000 UTC m=+1016.716011106" Feb 26 15:58:09 crc kubenswrapper[5033]: I0226 15:58:09.700952 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-znb4b"] Feb 26 15:58:10 crc kubenswrapper[5033]: I0226 15:58:10.306688 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xws47"] Feb 26 15:58:10 crc kubenswrapper[5033]: I0226 15:58:10.307712 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xws47" Feb 26 15:58:10 crc kubenswrapper[5033]: I0226 15:58:10.331816 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xws47"] Feb 26 15:58:10 crc kubenswrapper[5033]: I0226 15:58:10.411364 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bwn6\" (UniqueName: \"kubernetes.io/projected/3c0d3fc4-f017-41be-aac9-f570ece2a54a-kube-api-access-5bwn6\") pod \"openstack-operator-index-xws47\" (UID: \"3c0d3fc4-f017-41be-aac9-f570ece2a54a\") " pod="openstack-operators/openstack-operator-index-xws47" Feb 26 15:58:10 crc kubenswrapper[5033]: I0226 15:58:10.512626 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bwn6\" (UniqueName: \"kubernetes.io/projected/3c0d3fc4-f017-41be-aac9-f570ece2a54a-kube-api-access-5bwn6\") pod \"openstack-operator-index-xws47\" (UID: \"3c0d3fc4-f017-41be-aac9-f570ece2a54a\") " pod="openstack-operators/openstack-operator-index-xws47" Feb 26 15:58:10 crc kubenswrapper[5033]: I0226 15:58:10.533831 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bwn6\" (UniqueName: \"kubernetes.io/projected/3c0d3fc4-f017-41be-aac9-f570ece2a54a-kube-api-access-5bwn6\") pod \"openstack-operator-index-xws47\" (UID: \"3c0d3fc4-f017-41be-aac9-f570ece2a54a\") " pod="openstack-operators/openstack-operator-index-xws47" Feb 26 15:58:10 crc kubenswrapper[5033]: I0226 15:58:10.638178 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xws47" Feb 26 15:58:10 crc kubenswrapper[5033]: W0226 15:58:10.854098 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c0d3fc4_f017_41be_aac9_f570ece2a54a.slice/crio-fd838735c80fc48537e8f40afc4e412cba8269c57fc8703a7c1d11fabf309c5f WatchSource:0}: Error finding container fd838735c80fc48537e8f40afc4e412cba8269c57fc8703a7c1d11fabf309c5f: Status 404 returned error can't find the container with id fd838735c80fc48537e8f40afc4e412cba8269c57fc8703a7c1d11fabf309c5f Feb 26 15:58:10 crc kubenswrapper[5033]: I0226 15:58:10.857792 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xws47"] Feb 26 15:58:11 crc kubenswrapper[5033]: I0226 15:58:11.493984 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xws47" event={"ID":"3c0d3fc4-f017-41be-aac9-f570ece2a54a","Type":"ContainerStarted","Data":"bcad1a98ea9847de0b01f943b64fb5576b825de1bfbbfd4d13bad5b2cc82d298"} Feb 26 15:58:11 crc kubenswrapper[5033]: I0226 15:58:11.494049 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xws47" event={"ID":"3c0d3fc4-f017-41be-aac9-f570ece2a54a","Type":"ContainerStarted","Data":"fd838735c80fc48537e8f40afc4e412cba8269c57fc8703a7c1d11fabf309c5f"} Feb 26 15:58:11 crc kubenswrapper[5033]: I0226 15:58:11.494018 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-znb4b" podUID="1d13e70e-9003-4cb5-b424-935127386b56" containerName="registry-server" containerID="cri-o://a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee" gracePeriod=2 Feb 26 15:58:11 crc kubenswrapper[5033]: I0226 15:58:11.511184 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-xws47" podStartSLOduration=1.455786155 podStartE2EDuration="1.51116203s" podCreationTimestamp="2026-02-26 15:58:10 +0000 UTC" firstStartedPulling="2026-02-26 15:58:10.856151989 +0000 UTC m=+1018.068750639" lastFinishedPulling="2026-02-26 15:58:10.911527864 +0000 UTC m=+1018.124126514" observedRunningTime="2026-02-26 15:58:11.508658038 +0000 UTC m=+1018.721256708" watchObservedRunningTime="2026-02-26 15:58:11.51116203 +0000 UTC m=+1018.723760680" Feb 26 15:58:11 crc kubenswrapper[5033]: I0226 15:58:11.860552 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-znb4b" Feb 26 15:58:11 crc kubenswrapper[5033]: I0226 15:58:11.931738 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnsq6\" (UniqueName: \"kubernetes.io/projected/1d13e70e-9003-4cb5-b424-935127386b56-kube-api-access-lnsq6\") pod \"1d13e70e-9003-4cb5-b424-935127386b56\" (UID: \"1d13e70e-9003-4cb5-b424-935127386b56\") " Feb 26 15:58:11 crc kubenswrapper[5033]: I0226 15:58:11.937125 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d13e70e-9003-4cb5-b424-935127386b56-kube-api-access-lnsq6" (OuterVolumeSpecName: "kube-api-access-lnsq6") pod "1d13e70e-9003-4cb5-b424-935127386b56" (UID: "1d13e70e-9003-4cb5-b424-935127386b56"). InnerVolumeSpecName "kube-api-access-lnsq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:58:12 crc kubenswrapper[5033]: I0226 15:58:12.033319 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnsq6\" (UniqueName: \"kubernetes.io/projected/1d13e70e-9003-4cb5-b424-935127386b56-kube-api-access-lnsq6\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:12 crc kubenswrapper[5033]: I0226 15:58:12.500720 5033 generic.go:334] "Generic (PLEG): container finished" podID="1d13e70e-9003-4cb5-b424-935127386b56" containerID="a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee" exitCode=0 Feb 26 15:58:12 crc kubenswrapper[5033]: I0226 15:58:12.500774 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-znb4b" event={"ID":"1d13e70e-9003-4cb5-b424-935127386b56","Type":"ContainerDied","Data":"a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee"} Feb 26 15:58:12 crc kubenswrapper[5033]: I0226 15:58:12.500804 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-znb4b" Feb 26 15:58:12 crc kubenswrapper[5033]: I0226 15:58:12.500817 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-znb4b" event={"ID":"1d13e70e-9003-4cb5-b424-935127386b56","Type":"ContainerDied","Data":"bf03ae62cc11c05ed434c858629b4b9960057e6f113255aa0d90b343f8489352"} Feb 26 15:58:12 crc kubenswrapper[5033]: I0226 15:58:12.500829 5033 scope.go:117] "RemoveContainer" containerID="a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee" Feb 26 15:58:12 crc kubenswrapper[5033]: I0226 15:58:12.518521 5033 scope.go:117] "RemoveContainer" containerID="a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee" Feb 26 15:58:12 crc kubenswrapper[5033]: E0226 15:58:12.519394 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee\": container with ID starting with a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee not found: ID does not exist" containerID="a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee" Feb 26 15:58:12 crc kubenswrapper[5033]: I0226 15:58:12.519504 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee"} err="failed to get container status \"a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee\": rpc error: code = NotFound desc = could not find container \"a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee\": container with ID starting with a1a5e578226c5c5ba6117fcf23ab814f04cadcea2264dd0d2ff1f5c41eb170ee not found: ID does not exist" Feb 26 15:58:12 crc kubenswrapper[5033]: I0226 15:58:12.532778 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-znb4b"] 
Feb 26 15:58:12 crc kubenswrapper[5033]: I0226 15:58:12.537288 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-znb4b"] Feb 26 15:58:13 crc kubenswrapper[5033]: I0226 15:58:13.638827 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d13e70e-9003-4cb5-b424-935127386b56" path="/var/lib/kubelet/pods/1d13e70e-9003-4cb5-b424-935127386b56/volumes" Feb 26 15:58:16 crc kubenswrapper[5033]: I0226 15:58:16.408191 5033 scope.go:117] "RemoveContainer" containerID="0ab25daca996b2874ca4275af71b08437be053512a7d47f352be8c2c16a7979f" Feb 26 15:58:20 crc kubenswrapper[5033]: I0226 15:58:20.638694 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xws47" Feb 26 15:58:20 crc kubenswrapper[5033]: I0226 15:58:20.638749 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xws47" Feb 26 15:58:20 crc kubenswrapper[5033]: I0226 15:58:20.668302 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xws47" Feb 26 15:58:21 crc kubenswrapper[5033]: I0226 15:58:21.593188 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xws47" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.590635 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j"] Feb 26 15:58:27 crc kubenswrapper[5033]: E0226 15:58:27.591932 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d13e70e-9003-4cb5-b424-935127386b56" containerName="registry-server" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.591954 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d13e70e-9003-4cb5-b424-935127386b56" containerName="registry-server" Feb 26 15:58:27 
crc kubenswrapper[5033]: I0226 15:58:27.592137 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d13e70e-9003-4cb5-b424-935127386b56" containerName="registry-server" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.594242 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.596783 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-p6cv6" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.603440 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j"] Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.662492 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-util\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j\" (UID: \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.662571 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv7q9\" (UniqueName: \"kubernetes.io/projected/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-kube-api-access-xv7q9\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j\" (UID: \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.662608 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-bundle\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j\" (UID: \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.764331 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-bundle\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j\" (UID: \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.764730 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-util\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j\" (UID: \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.765009 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-bundle\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j\" (UID: \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.765008 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv7q9\" (UniqueName: \"kubernetes.io/projected/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-kube-api-access-xv7q9\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j\" (UID: 
\"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.765449 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-util\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j\" (UID: \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.793885 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv7q9\" (UniqueName: \"kubernetes.io/projected/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-kube-api-access-xv7q9\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j\" (UID: \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:27 crc kubenswrapper[5033]: I0226 15:58:27.916921 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:28 crc kubenswrapper[5033]: I0226 15:58:28.127898 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j"] Feb 26 15:58:28 crc kubenswrapper[5033]: W0226 15:58:28.142646 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3bab0f1_9a79_4f6c_9f7c_6c1874f591ba.slice/crio-88804dee178ea418db9353ab47edc1c68f83d26a9cf115d30db4102e90b17db7 WatchSource:0}: Error finding container 88804dee178ea418db9353ab47edc1c68f83d26a9cf115d30db4102e90b17db7: Status 404 returned error can't find the container with id 88804dee178ea418db9353ab47edc1c68f83d26a9cf115d30db4102e90b17db7 Feb 26 15:58:28 crc kubenswrapper[5033]: I0226 15:58:28.616480 5033 generic.go:334] "Generic (PLEG): container finished" podID="e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" containerID="f7028ba738ee374bc37b7fbb309fede7e7efe1022b452701e4fecdcca02476da" exitCode=0 Feb 26 15:58:28 crc kubenswrapper[5033]: I0226 15:58:28.616546 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" event={"ID":"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba","Type":"ContainerDied","Data":"f7028ba738ee374bc37b7fbb309fede7e7efe1022b452701e4fecdcca02476da"} Feb 26 15:58:28 crc kubenswrapper[5033]: I0226 15:58:28.616906 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" event={"ID":"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba","Type":"ContainerStarted","Data":"88804dee178ea418db9353ab47edc1c68f83d26a9cf115d30db4102e90b17db7"} Feb 26 15:58:29 crc kubenswrapper[5033]: I0226 15:58:29.626807 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" event={"ID":"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba","Type":"ContainerStarted","Data":"bd2f656f377435c94e02c92c2e19b151e837f933592c6719288f8d937fade3ce"} Feb 26 15:58:30 crc kubenswrapper[5033]: I0226 15:58:30.636726 5033 generic.go:334] "Generic (PLEG): container finished" podID="e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" containerID="bd2f656f377435c94e02c92c2e19b151e837f933592c6719288f8d937fade3ce" exitCode=0 Feb 26 15:58:30 crc kubenswrapper[5033]: I0226 15:58:30.636775 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" event={"ID":"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba","Type":"ContainerDied","Data":"bd2f656f377435c94e02c92c2e19b151e837f933592c6719288f8d937fade3ce"} Feb 26 15:58:31 crc kubenswrapper[5033]: I0226 15:58:31.648989 5033 generic.go:334] "Generic (PLEG): container finished" podID="e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" containerID="0c9b6ab3bda00f181d300b281fe04049a7a4dd70eadc7dd34c0712d587aa6dd0" exitCode=0 Feb 26 15:58:31 crc kubenswrapper[5033]: I0226 15:58:31.649096 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" event={"ID":"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba","Type":"ContainerDied","Data":"0c9b6ab3bda00f181d300b281fe04049a7a4dd70eadc7dd34c0712d587aa6dd0"} Feb 26 15:58:32 crc kubenswrapper[5033]: I0226 15:58:32.894986 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:32 crc kubenswrapper[5033]: I0226 15:58:32.939906 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-util\") pod \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\" (UID: \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " Feb 26 15:58:32 crc kubenswrapper[5033]: I0226 15:58:32.940029 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-bundle\") pod \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\" (UID: \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " Feb 26 15:58:32 crc kubenswrapper[5033]: I0226 15:58:32.940135 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv7q9\" (UniqueName: \"kubernetes.io/projected/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-kube-api-access-xv7q9\") pod \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\" (UID: \"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba\") " Feb 26 15:58:32 crc kubenswrapper[5033]: I0226 15:58:32.941568 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-bundle" (OuterVolumeSpecName: "bundle") pod "e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" (UID: "e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:58:32 crc kubenswrapper[5033]: I0226 15:58:32.954382 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-util" (OuterVolumeSpecName: "util") pod "e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" (UID: "e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:58:32 crc kubenswrapper[5033]: I0226 15:58:32.954417 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-kube-api-access-xv7q9" (OuterVolumeSpecName: "kube-api-access-xv7q9") pod "e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" (UID: "e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba"). InnerVolumeSpecName "kube-api-access-xv7q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:58:33 crc kubenswrapper[5033]: I0226 15:58:33.041520 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv7q9\" (UniqueName: \"kubernetes.io/projected/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-kube-api-access-xv7q9\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:33 crc kubenswrapper[5033]: I0226 15:58:33.041569 5033 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-util\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:33 crc kubenswrapper[5033]: I0226 15:58:33.041591 5033 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:33 crc kubenswrapper[5033]: I0226 15:58:33.666349 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" event={"ID":"e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba","Type":"ContainerDied","Data":"88804dee178ea418db9353ab47edc1c68f83d26a9cf115d30db4102e90b17db7"} Feb 26 15:58:33 crc kubenswrapper[5033]: I0226 15:58:33.666392 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88804dee178ea418db9353ab47edc1c68f83d26a9cf115d30db4102e90b17db7" Feb 26 15:58:33 crc kubenswrapper[5033]: I0226 15:58:33.666449 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j" Feb 26 15:58:39 crc kubenswrapper[5033]: I0226 15:58:39.694646 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns"] Feb 26 15:58:39 crc kubenswrapper[5033]: E0226 15:58:39.695495 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" containerName="util" Feb 26 15:58:39 crc kubenswrapper[5033]: I0226 15:58:39.695515 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" containerName="util" Feb 26 15:58:39 crc kubenswrapper[5033]: E0226 15:58:39.695539 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" containerName="extract" Feb 26 15:58:39 crc kubenswrapper[5033]: I0226 15:58:39.695547 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" containerName="extract" Feb 26 15:58:39 crc kubenswrapper[5033]: E0226 15:58:39.695559 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" containerName="pull" Feb 26 15:58:39 crc kubenswrapper[5033]: I0226 15:58:39.695567 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" containerName="pull" Feb 26 15:58:39 crc kubenswrapper[5033]: I0226 15:58:39.695690 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba" containerName="extract" Feb 26 15:58:39 crc kubenswrapper[5033]: I0226 15:58:39.696148 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns" Feb 26 15:58:39 crc kubenswrapper[5033]: I0226 15:58:39.701203 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-xtbcz" Feb 26 15:58:39 crc kubenswrapper[5033]: I0226 15:58:39.718842 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns"] Feb 26 15:58:39 crc kubenswrapper[5033]: I0226 15:58:39.742272 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkfxf\" (UniqueName: \"kubernetes.io/projected/f15028fb-d8cb-4af3-954a-5b361e84f246-kube-api-access-jkfxf\") pod \"openstack-operator-controller-init-66fc5dfc5b-8k2ns\" (UID: \"f15028fb-d8cb-4af3-954a-5b361e84f246\") " pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns" Feb 26 15:58:39 crc kubenswrapper[5033]: I0226 15:58:39.844616 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkfxf\" (UniqueName: \"kubernetes.io/projected/f15028fb-d8cb-4af3-954a-5b361e84f246-kube-api-access-jkfxf\") pod \"openstack-operator-controller-init-66fc5dfc5b-8k2ns\" (UID: \"f15028fb-d8cb-4af3-954a-5b361e84f246\") " pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns" Feb 26 15:58:40 crc kubenswrapper[5033]: I0226 15:58:40.112223 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkfxf\" (UniqueName: \"kubernetes.io/projected/f15028fb-d8cb-4af3-954a-5b361e84f246-kube-api-access-jkfxf\") pod \"openstack-operator-controller-init-66fc5dfc5b-8k2ns\" (UID: \"f15028fb-d8cb-4af3-954a-5b361e84f246\") " pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns" Feb 26 15:58:40 crc kubenswrapper[5033]: I0226 15:58:40.314316 5033 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns" Feb 26 15:58:40 crc kubenswrapper[5033]: I0226 15:58:40.676655 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns"] Feb 26 15:58:40 crc kubenswrapper[5033]: I0226 15:58:40.715149 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns" event={"ID":"f15028fb-d8cb-4af3-954a-5b361e84f246","Type":"ContainerStarted","Data":"f1e45a557b394631af1dfb0b5d5b5669bc1bd797d6892a97c33078c4e611188f"} Feb 26 15:58:45 crc kubenswrapper[5033]: I0226 15:58:45.746085 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns" event={"ID":"f15028fb-d8cb-4af3-954a-5b361e84f246","Type":"ContainerStarted","Data":"e073ca08e9ebfcc2c6c5db1241b62d86162b51142d76abe19b5401f67612c4d7"} Feb 26 15:58:45 crc kubenswrapper[5033]: I0226 15:58:45.746667 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns" Feb 26 15:58:45 crc kubenswrapper[5033]: I0226 15:58:45.774053 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns" podStartSLOduration=2.252431228 podStartE2EDuration="6.774034497s" podCreationTimestamp="2026-02-26 15:58:39 +0000 UTC" firstStartedPulling="2026-02-26 15:58:40.689021712 +0000 UTC m=+1047.901620362" lastFinishedPulling="2026-02-26 15:58:45.210624981 +0000 UTC m=+1052.423223631" observedRunningTime="2026-02-26 15:58:45.772901159 +0000 UTC m=+1052.985499819" watchObservedRunningTime="2026-02-26 15:58:45.774034497 +0000 UTC m=+1052.986633147" Feb 26 15:58:50 crc kubenswrapper[5033]: I0226 15:58:50.318772 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-8k2ns" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.784588 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.786130 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.788142 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6wnpn" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.790444 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.791165 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.792508 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2sfzv" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.796555 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.797668 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.799571 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-68gbh" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.812207 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.817901 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.821063 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.825665 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.826906 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.828962 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-9mxs6" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.834655 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.841884 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.842807 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.849011 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-cfl92" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.866479 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.877443 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.878167 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.880098 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-8qxhs" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.890984 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65bmr\" (UniqueName: \"kubernetes.io/projected/965ea850-eadc-4a58-ba58-65acacf603e8-kube-api-access-65bmr\") pod \"designate-operator-controller-manager-55cc45767f-2szs7\" (UID: \"965ea850-eadc-4a58-ba58-65acacf603e8\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.891018 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sm6g\" (UniqueName: \"kubernetes.io/projected/968ae160-ba03-4539-849f-6aca2d464181-kube-api-access-7sm6g\") pod \"heat-operator-controller-manager-9595d6797-p7fv4\" (UID: \"968ae160-ba03-4539-849f-6aca2d464181\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.891052 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj28s\" (UniqueName: \"kubernetes.io/projected/61f234b9-a904-466e-b680-f8d6924c8240-kube-api-access-jj28s\") pod \"glance-operator-controller-manager-7f748f8b74-nrh9v\" (UID: \"61f234b9-a904-466e-b680-f8d6924c8240\") " pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.891069 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27lhg\" (UniqueName: 
\"kubernetes.io/projected/8611f841-ae12-4c99-95cf-06d75c839484-kube-api-access-27lhg\") pod \"barbican-operator-controller-manager-c4b7d6946-bnptj\" (UID: \"8611f841-ae12-4c99-95cf-06d75c839484\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.891086 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmqr4\" (UniqueName: \"kubernetes.io/projected/bae783d2-7df3-44de-a5e9-ccee0c3ae193-kube-api-access-mmqr4\") pod \"cinder-operator-controller-manager-768c8b45bb-x2r76\" (UID: \"bae783d2-7df3-44de-a5e9-ccee0c3ae193\") " pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.909784 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.910803 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.916444 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.920655 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-jqppl" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.930278 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.954572 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.955526 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.960931 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-65qlj" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.965398 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8"] Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.966428 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.979068 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-w8llz" Feb 26 15:59:25 crc kubenswrapper[5033]: I0226 15:59:25.986456 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:25.995085 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:25.997021 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7khc\" (UniqueName: \"kubernetes.io/projected/49b7ffc2-b28c-451d-9fff-d7af6078a893-kube-api-access-k7khc\") pod \"ironic-operator-controller-manager-6494cdbf8f-m75x9\" (UID: \"49b7ffc2-b28c-451d-9fff-d7af6078a893\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:25.997125 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45vb5\" (UniqueName: \"kubernetes.io/projected/aeec1478-5706-42da-9d49-ee5bd5b24cbc-kube-api-access-45vb5\") pod \"horizon-operator-controller-manager-54fb488b88-5rdb7\" (UID: \"aeec1478-5706-42da-9d49-ee5bd5b24cbc\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:25.997159 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: 
\"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:25.997205 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65bmr\" (UniqueName: \"kubernetes.io/projected/965ea850-eadc-4a58-ba58-65acacf603e8-kube-api-access-65bmr\") pod \"designate-operator-controller-manager-55cc45767f-2szs7\" (UID: \"965ea850-eadc-4a58-ba58-65acacf603e8\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:25.997228 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sm6g\" (UniqueName: \"kubernetes.io/projected/968ae160-ba03-4539-849f-6aca2d464181-kube-api-access-7sm6g\") pod \"heat-operator-controller-manager-9595d6797-p7fv4\" (UID: \"968ae160-ba03-4539-849f-6aca2d464181\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:25.997251 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk2w8\" (UniqueName: \"kubernetes.io/projected/acfa5be3-b477-460f-8835-7f86492eed44-kube-api-access-lk2w8\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: \"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:25.997324 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj28s\" (UniqueName: \"kubernetes.io/projected/61f234b9-a904-466e-b680-f8d6924c8240-kube-api-access-jj28s\") pod \"glance-operator-controller-manager-7f748f8b74-nrh9v\" (UID: \"61f234b9-a904-466e-b680-f8d6924c8240\") " pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v" Feb 26 15:59:26 crc 
kubenswrapper[5033]: I0226 15:59:25.997349 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27lhg\" (UniqueName: \"kubernetes.io/projected/8611f841-ae12-4c99-95cf-06d75c839484-kube-api-access-27lhg\") pod \"barbican-operator-controller-manager-c4b7d6946-bnptj\" (UID: \"8611f841-ae12-4c99-95cf-06d75c839484\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:25.997369 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmqr4\" (UniqueName: \"kubernetes.io/projected/bae783d2-7df3-44de-a5e9-ccee0c3ae193-kube-api-access-mmqr4\") pod \"cinder-operator-controller-manager-768c8b45bb-x2r76\" (UID: \"bae783d2-7df3-44de-a5e9-ccee0c3ae193\") " pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.006376 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-76fd76856-flt5k"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.007308 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-76fd76856-flt5k" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.009977 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-mmcrt" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.015307 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.036837 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-76fd76856-flt5k"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.062761 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj28s\" (UniqueName: \"kubernetes.io/projected/61f234b9-a904-466e-b680-f8d6924c8240-kube-api-access-jj28s\") pod \"glance-operator-controller-manager-7f748f8b74-nrh9v\" (UID: \"61f234b9-a904-466e-b680-f8d6924c8240\") " pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.062837 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sm6g\" (UniqueName: \"kubernetes.io/projected/968ae160-ba03-4539-849f-6aca2d464181-kube-api-access-7sm6g\") pod \"heat-operator-controller-manager-9595d6797-p7fv4\" (UID: \"968ae160-ba03-4539-849f-6aca2d464181\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.062879 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27lhg\" (UniqueName: \"kubernetes.io/projected/8611f841-ae12-4c99-95cf-06d75c839484-kube-api-access-27lhg\") pod \"barbican-operator-controller-manager-c4b7d6946-bnptj\" (UID: \"8611f841-ae12-4c99-95cf-06d75c839484\") " 
pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.063372 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmqr4\" (UniqueName: \"kubernetes.io/projected/bae783d2-7df3-44de-a5e9-ccee0c3ae193-kube-api-access-mmqr4\") pod \"cinder-operator-controller-manager-768c8b45bb-x2r76\" (UID: \"bae783d2-7df3-44de-a5e9-ccee0c3ae193\") " pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.064629 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.080231 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65bmr\" (UniqueName: \"kubernetes.io/projected/965ea850-eadc-4a58-ba58-65acacf603e8-kube-api-access-65bmr\") pod \"designate-operator-controller-manager-55cc45767f-2szs7\" (UID: \"965ea850-eadc-4a58-ba58-65acacf603e8\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.097657 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.101120 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cst6\" (UniqueName: \"kubernetes.io/projected/264eff6e-56a5-4f67-9b00-17c3cd1ab65d-kube-api-access-2cst6\") pod \"keystone-operator-controller-manager-6c78d668d5-dzbd8\" (UID: \"264eff6e-56a5-4f67-9b00-17c3cd1ab65d\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.101199 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45vb5\" (UniqueName: \"kubernetes.io/projected/aeec1478-5706-42da-9d49-ee5bd5b24cbc-kube-api-access-45vb5\") pod \"horizon-operator-controller-manager-54fb488b88-5rdb7\" (UID: \"aeec1478-5706-42da-9d49-ee5bd5b24cbc\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.101240 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: \"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.101303 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk2w8\" (UniqueName: \"kubernetes.io/projected/acfa5be3-b477-460f-8835-7f86492eed44-kube-api-access-lk2w8\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: \"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.101363 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr6wz\" (UniqueName: \"kubernetes.io/projected/4cf1cb70-e865-474b-a84d-245c92a4ccf3-kube-api-access-fr6wz\") pod \"manila-operator-controller-manager-76fd76856-flt5k\" (UID: \"4cf1cb70-e865-474b-a84d-245c92a4ccf3\") " pod="openstack-operators/manila-operator-controller-manager-76fd76856-flt5k" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.101428 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7khc\" (UniqueName: \"kubernetes.io/projected/49b7ffc2-b28c-451d-9fff-d7af6078a893-kube-api-access-k7khc\") pod \"ironic-operator-controller-manager-6494cdbf8f-m75x9\" (UID: \"49b7ffc2-b28c-451d-9fff-d7af6078a893\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9" Feb 26 15:59:26 crc kubenswrapper[5033]: E0226 15:59:26.101802 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:26 crc kubenswrapper[5033]: E0226 15:59:26.101872 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert podName:acfa5be3-b477-460f-8835-7f86492eed44 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:26.601848484 +0000 UTC m=+1093.814447134 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert") pod "infra-operator-controller-manager-66d6b5f488-rhf9p" (UID: "acfa5be3-b477-460f-8835-7f86492eed44") : secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.108204 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.112619 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ltbfz" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.123308 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.180952 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7khc\" (UniqueName: \"kubernetes.io/projected/49b7ffc2-b28c-451d-9fff-d7af6078a893-kube-api-access-k7khc\") pod \"ironic-operator-controller-manager-6494cdbf8f-m75x9\" (UID: \"49b7ffc2-b28c-451d-9fff-d7af6078a893\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.180985 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk2w8\" (UniqueName: \"kubernetes.io/projected/acfa5be3-b477-460f-8835-7f86492eed44-kube-api-access-lk2w8\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: \"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.181066 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.181194 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.182704 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.186645 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.204668 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45vb5\" (UniqueName: \"kubernetes.io/projected/aeec1478-5706-42da-9d49-ee5bd5b24cbc-kube-api-access-45vb5\") pod \"horizon-operator-controller-manager-54fb488b88-5rdb7\" (UID: \"aeec1478-5706-42da-9d49-ee5bd5b24cbc\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.205727 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cst6\" (UniqueName: \"kubernetes.io/projected/264eff6e-56a5-4f67-9b00-17c3cd1ab65d-kube-api-access-2cst6\") pod \"keystone-operator-controller-manager-6c78d668d5-dzbd8\" (UID: \"264eff6e-56a5-4f67-9b00-17c3cd1ab65d\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.205823 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr6wz\" (UniqueName: \"kubernetes.io/projected/4cf1cb70-e865-474b-a84d-245c92a4ccf3-kube-api-access-fr6wz\") pod \"manila-operator-controller-manager-76fd76856-flt5k\" (UID: \"4cf1cb70-e865-474b-a84d-245c92a4ccf3\") " pod="openstack-operators/manila-operator-controller-manager-76fd76856-flt5k" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.205980 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.212671 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.213606 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.231802 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-f56mj" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.240748 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr6wz\" (UniqueName: \"kubernetes.io/projected/4cf1cb70-e865-474b-a84d-245c92a4ccf3-kube-api-access-fr6wz\") pod \"manila-operator-controller-manager-76fd76856-flt5k\" (UID: \"4cf1cb70-e865-474b-a84d-245c92a4ccf3\") " pod="openstack-operators/manila-operator-controller-manager-76fd76856-flt5k" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.264083 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cst6\" (UniqueName: \"kubernetes.io/projected/264eff6e-56a5-4f67-9b00-17c3cd1ab65d-kube-api-access-2cst6\") pod \"keystone-operator-controller-manager-6c78d668d5-dzbd8\" (UID: \"264eff6e-56a5-4f67-9b00-17c3cd1ab65d\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.279445 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.288757 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.296613 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.300726 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.303930 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-f6wng" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.304606 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.313857 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7svwt\" (UniqueName: \"kubernetes.io/projected/d8530a52-aa3c-4911-8339-af07cfeb2e52-kube-api-access-7svwt\") pod \"mariadb-operator-controller-manager-6dc9b6ff89-z22f2\" (UID: \"d8530a52-aa3c-4911-8339-af07cfeb2e52\") " pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.317606 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.318624 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.322148 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-k92v7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.335462 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.346640 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.354938 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.355728 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.364312 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-76fd76856-flt5k" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.365836 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.366674 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.371784 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.371862 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-fpznv" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.372670 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-c45dw" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.380602 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.393791 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.402817 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.403652 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.406618 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-cdljb" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.406751 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.415072 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrchj\" (UniqueName: \"kubernetes.io/projected/1d9ae0bd-78ce-4d9f-8a00-0da4d84fdb94-kube-api-access-xrchj\") pod \"nova-operator-controller-manager-5d56fd956f-mdclf\" (UID: \"1d9ae0bd-78ce-4d9f-8a00-0da4d84fdb94\") " pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.415121 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n4gb\" (UniqueName: \"kubernetes.io/projected/ad93658a-8b9d-4518-a840-03622dd46767-kube-api-access-4n4gb\") pod \"neutron-operator-controller-manager-54967dbbdf-rlpf8\" (UID: \"ad93658a-8b9d-4518-a840-03622dd46767\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.415161 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7svwt\" (UniqueName: \"kubernetes.io/projected/d8530a52-aa3c-4911-8339-af07cfeb2e52-kube-api-access-7svwt\") pod \"mariadb-operator-controller-manager-6dc9b6ff89-z22f2\" (UID: \"d8530a52-aa3c-4911-8339-af07cfeb2e52\") " pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.434708 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7svwt\" (UniqueName: \"kubernetes.io/projected/d8530a52-aa3c-4911-8339-af07cfeb2e52-kube-api-access-7svwt\") pod \"mariadb-operator-controller-manager-6dc9b6ff89-z22f2\" (UID: \"d8530a52-aa3c-4911-8339-af07cfeb2e52\") " pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.454315 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.455124 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.456019 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.457794 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8wdtt" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.476943 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.480990 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.485384 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.487743 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xj2wh" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.510339 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.523142 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.523191 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcb5c\" (UniqueName: \"kubernetes.io/projected/c61b96c6-4a34-407e-9c0b-a76783ffb3ed-kube-api-access-fcb5c\") pod \"placement-operator-controller-manager-57bd55f9b7-24q9k\" (UID: \"c61b96c6-4a34-407e-9c0b-a76783ffb3ed\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.523226 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrchj\" (UniqueName: \"kubernetes.io/projected/1d9ae0bd-78ce-4d9f-8a00-0da4d84fdb94-kube-api-access-xrchj\") pod \"nova-operator-controller-manager-5d56fd956f-mdclf\" (UID: 
\"1d9ae0bd-78ce-4d9f-8a00-0da4d84fdb94\") " pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.523275 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt47w\" (UniqueName: \"kubernetes.io/projected/66a23e95-112a-43d6-8e64-d4073f1b6489-kube-api-access-lt47w\") pod \"octavia-operator-controller-manager-77b8b67585-q6sd9\" (UID: \"66a23e95-112a-43d6-8e64-d4073f1b6489\") " pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.523297 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n4gb\" (UniqueName: \"kubernetes.io/projected/ad93658a-8b9d-4518-a840-03622dd46767-kube-api-access-4n4gb\") pod \"neutron-operator-controller-manager-54967dbbdf-rlpf8\" (UID: \"ad93658a-8b9d-4518-a840-03622dd46767\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.523364 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s4pc\" (UniqueName: \"kubernetes.io/projected/f0fe9b18-a0d3-47ec-ab43-73412831eac1-kube-api-access-2s4pc\") pod \"ovn-operator-controller-manager-85c99d655-chlrt\" (UID: \"f0fe9b18-a0d3-47ec-ab43-73412831eac1\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.523382 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs6jq\" (UniqueName: \"kubernetes.io/projected/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-kube-api-access-zs6jq\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.538536 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.539373 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.545171 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nqbxs" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.550360 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrchj\" (UniqueName: \"kubernetes.io/projected/1d9ae0bd-78ce-4d9f-8a00-0da4d84fdb94-kube-api-access-xrchj\") pod \"nova-operator-controller-manager-5d56fd956f-mdclf\" (UID: \"1d9ae0bd-78ce-4d9f-8a00-0da4d84fdb94\") " pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.567274 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.577799 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n4gb\" (UniqueName: \"kubernetes.io/projected/ad93658a-8b9d-4518-a840-03622dd46767-kube-api-access-4n4gb\") pod \"neutron-operator-controller-manager-54967dbbdf-rlpf8\" (UID: \"ad93658a-8b9d-4518-a840-03622dd46767\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.584705 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.605324 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.606206 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.611069 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-p9t66" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.619698 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.625283 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.625320 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcb5c\" (UniqueName: \"kubernetes.io/projected/c61b96c6-4a34-407e-9c0b-a76783ffb3ed-kube-api-access-fcb5c\") pod \"placement-operator-controller-manager-57bd55f9b7-24q9k\" (UID: \"c61b96c6-4a34-407e-9c0b-a76783ffb3ed\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.625359 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q4f76\" (UniqueName: \"kubernetes.io/projected/157b2409-f9a3-4e35-b087-4b54c945bb14-kube-api-access-q4f76\") pod \"swift-operator-controller-manager-79558bbfbf-28bzq\" (UID: \"157b2409-f9a3-4e35-b087-4b54c945bb14\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.625378 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt47w\" (UniqueName: \"kubernetes.io/projected/66a23e95-112a-43d6-8e64-d4073f1b6489-kube-api-access-lt47w\") pod \"octavia-operator-controller-manager-77b8b67585-q6sd9\" (UID: \"66a23e95-112a-43d6-8e64-d4073f1b6489\") " pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.625398 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbxsq\" (UniqueName: \"kubernetes.io/projected/9c031846-df0c-47c8-9fad-a566b6db896d-kube-api-access-fbxsq\") pod \"telemetry-operator-controller-manager-56dc67d744-mwxv6\" (UID: \"9c031846-df0c-47c8-9fad-a566b6db896d\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.625474 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: \"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.625511 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s4pc\" (UniqueName: \"kubernetes.io/projected/f0fe9b18-a0d3-47ec-ab43-73412831eac1-kube-api-access-2s4pc\") pod 
\"ovn-operator-controller-manager-85c99d655-chlrt\" (UID: \"f0fe9b18-a0d3-47ec-ab43-73412831eac1\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.625542 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs6jq\" (UniqueName: \"kubernetes.io/projected/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-kube-api-access-zs6jq\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:26 crc kubenswrapper[5033]: E0226 15:59:26.625860 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 15:59:26 crc kubenswrapper[5033]: E0226 15:59:26.625896 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert podName:9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:27.125884872 +0000 UTC m=+1094.338483522 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" (UID: "9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 15:59:26 crc kubenswrapper[5033]: E0226 15:59:26.626775 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:26 crc kubenswrapper[5033]: E0226 15:59:26.626804 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert podName:acfa5be3-b477-460f-8835-7f86492eed44 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:27.626795344 +0000 UTC m=+1094.839393994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert") pod "infra-operator-controller-manager-66d6b5f488-rhf9p" (UID: "acfa5be3-b477-460f-8835-7f86492eed44") : secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.640144 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.660403 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s4pc\" (UniqueName: \"kubernetes.io/projected/f0fe9b18-a0d3-47ec-ab43-73412831eac1-kube-api-access-2s4pc\") pod \"ovn-operator-controller-manager-85c99d655-chlrt\" (UID: \"f0fe9b18-a0d3-47ec-ab43-73412831eac1\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.662012 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcb5c\" (UniqueName: \"kubernetes.io/projected/c61b96c6-4a34-407e-9c0b-a76783ffb3ed-kube-api-access-fcb5c\") pod \"placement-operator-controller-manager-57bd55f9b7-24q9k\" (UID: \"c61b96c6-4a34-407e-9c0b-a76783ffb3ed\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.663113 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs6jq\" (UniqueName: \"kubernetes.io/projected/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-kube-api-access-zs6jq\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.668511 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt47w\" (UniqueName: \"kubernetes.io/projected/66a23e95-112a-43d6-8e64-d4073f1b6489-kube-api-access-lt47w\") pod \"octavia-operator-controller-manager-77b8b67585-q6sd9\" (UID: \"66a23e95-112a-43d6-8e64-d4073f1b6489\") " pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.705141 5033 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.713588 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.714415 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.716470 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.716594 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.718030 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.718220 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qnpsk" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.720972 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.730792 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clw5d\" (UniqueName: \"kubernetes.io/projected/67342993-d4b9-4e5a-9777-8dc7a6ee4c0d-kube-api-access-clw5d\") pod \"watcher-operator-controller-manager-76bcb69745-2xc62\" (UID: \"67342993-d4b9-4e5a-9777-8dc7a6ee4c0d\") " pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.730944 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4f76\" (UniqueName: \"kubernetes.io/projected/157b2409-f9a3-4e35-b087-4b54c945bb14-kube-api-access-q4f76\") pod \"swift-operator-controller-manager-79558bbfbf-28bzq\" (UID: \"157b2409-f9a3-4e35-b087-4b54c945bb14\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.730970 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7n9r\" (UniqueName: \"kubernetes.io/projected/63abbf2b-ad9d-458c-9a6e-b31988780065-kube-api-access-g7n9r\") pod \"test-operator-controller-manager-8467ccb4c8-s7gzx\" (UID: \"63abbf2b-ad9d-458c-9a6e-b31988780065\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.730993 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbxsq\" (UniqueName: \"kubernetes.io/projected/9c031846-df0c-47c8-9fad-a566b6db896d-kube-api-access-fbxsq\") pod \"telemetry-operator-controller-manager-56dc67d744-mwxv6\" (UID: \"9c031846-df0c-47c8-9fad-a566b6db896d\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.747567 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n7drp"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.748894 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n7drp" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.754754 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n7drp"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.764415 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v"] Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.766396 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-5wgjp" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.767314 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4f76\" (UniqueName: \"kubernetes.io/projected/157b2409-f9a3-4e35-b087-4b54c945bb14-kube-api-access-q4f76\") pod \"swift-operator-controller-manager-79558bbfbf-28bzq\" (UID: \"157b2409-f9a3-4e35-b087-4b54c945bb14\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.772882 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fbxsq\" (UniqueName: \"kubernetes.io/projected/9c031846-df0c-47c8-9fad-a566b6db896d-kube-api-access-fbxsq\") pod \"telemetry-operator-controller-manager-56dc67d744-mwxv6\" (UID: \"9c031846-df0c-47c8-9fad-a566b6db896d\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.790939 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.831622 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7n9r\" (UniqueName: \"kubernetes.io/projected/63abbf2b-ad9d-458c-9a6e-b31988780065-kube-api-access-g7n9r\") pod \"test-operator-controller-manager-8467ccb4c8-s7gzx\" (UID: \"63abbf2b-ad9d-458c-9a6e-b31988780065\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.831790 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg9ql\" (UniqueName: \"kubernetes.io/projected/68273151-991f-4e1c-bf48-93b1d8b3973f-kube-api-access-cg9ql\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.831878 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clw5d\" (UniqueName: \"kubernetes.io/projected/67342993-d4b9-4e5a-9777-8dc7a6ee4c0d-kube-api-access-clw5d\") pod \"watcher-operator-controller-manager-76bcb69745-2xc62\" (UID: \"67342993-d4b9-4e5a-9777-8dc7a6ee4c0d\") " pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.831955 
5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.832018 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.832517 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.852952 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clw5d\" (UniqueName: \"kubernetes.io/projected/67342993-d4b9-4e5a-9777-8dc7a6ee4c0d-kube-api-access-clw5d\") pod \"watcher-operator-controller-manager-76bcb69745-2xc62\" (UID: \"67342993-d4b9-4e5a-9777-8dc7a6ee4c0d\") " pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.854356 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7n9r\" (UniqueName: \"kubernetes.io/projected/63abbf2b-ad9d-458c-9a6e-b31988780065-kube-api-access-g7n9r\") pod \"test-operator-controller-manager-8467ccb4c8-s7gzx\" (UID: \"63abbf2b-ad9d-458c-9a6e-b31988780065\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" Feb 26 
15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.874940 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.934245 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg9ql\" (UniqueName: \"kubernetes.io/projected/68273151-991f-4e1c-bf48-93b1d8b3973f-kube-api-access-cg9ql\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.934375 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.934393 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.934447 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bfgx\" (UniqueName: \"kubernetes.io/projected/67c3e0a3-0b3c-446e-bd46-07b501b58fa8-kube-api-access-9bfgx\") pod \"rabbitmq-cluster-operator-manager-668c99d594-n7drp\" (UID: \"67c3e0a3-0b3c-446e-bd46-07b501b58fa8\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n7drp" Feb 26 15:59:26 crc kubenswrapper[5033]: E0226 15:59:26.934881 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 15:59:26 crc kubenswrapper[5033]: E0226 15:59:26.934934 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f nodeName:}" failed. No retries permitted until 2026-02-26 15:59:27.434921358 +0000 UTC m=+1094.647520008 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "metrics-server-cert" not found Feb 26 15:59:26 crc kubenswrapper[5033]: E0226 15:59:26.935118 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 15:59:26 crc kubenswrapper[5033]: E0226 15:59:26.935162 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f nodeName:}" failed. No retries permitted until 2026-02-26 15:59:27.435137833 +0000 UTC m=+1094.647736483 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "webhook-server-cert" not found Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.948732 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.953847 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg9ql\" (UniqueName: \"kubernetes.io/projected/68273151-991f-4e1c-bf48-93b1d8b3973f-kube-api-access-cg9ql\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:26 crc kubenswrapper[5033]: I0226 15:59:26.956394 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.035983 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bfgx\" (UniqueName: \"kubernetes.io/projected/67c3e0a3-0b3c-446e-bd46-07b501b58fa8-kube-api-access-9bfgx\") pod \"rabbitmq-cluster-operator-manager-668c99d594-n7drp\" (UID: \"67c3e0a3-0b3c-446e-bd46-07b501b58fa8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n7drp" Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.057222 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v" event={"ID":"61f234b9-a904-466e-b680-f8d6924c8240","Type":"ContainerStarted","Data":"4a646ef660e4b293824f1fce4881b87ce859caf7bcfe621c2e7b0f842c738762"} Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.060450 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bfgx\" (UniqueName: \"kubernetes.io/projected/67c3e0a3-0b3c-446e-bd46-07b501b58fa8-kube-api-access-9bfgx\") pod \"rabbitmq-cluster-operator-manager-668c99d594-n7drp\" (UID: \"67c3e0a3-0b3c-446e-bd46-07b501b58fa8\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n7drp" Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.137227 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.137391 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.137450 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert podName:9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:28.137434349 +0000 UTC m=+1095.350032999 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" (UID: "9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.194995 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n7drp" Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.397053 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj"] Feb 26 15:59:27 crc kubenswrapper[5033]: W0226 15:59:27.403023 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8611f841_ae12_4c99_95cf_06d75c839484.slice/crio-1937e3ecd0bfe687e21f02945e5e0f6715084787d5671384ff80e61db3e93e58 WatchSource:0}: Error finding container 1937e3ecd0bfe687e21f02945e5e0f6715084787d5671384ff80e61db3e93e58: Status 404 returned error can't find the container with id 1937e3ecd0bfe687e21f02945e5e0f6715084787d5671384ff80e61db3e93e58 Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.404005 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7"] Feb 26 15:59:27 crc kubenswrapper[5033]: W0226 15:59:27.404709 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeec1478_5706_42da_9d49_ee5bd5b24cbc.slice/crio-05ff68941a3cd9209381486bcc84acddb0c21eab81a943eda05f91185bb2ae80 WatchSource:0}: Error finding container 05ff68941a3cd9209381486bcc84acddb0c21eab81a943eda05f91185bb2ae80: Status 404 returned error can't find the container with id 05ff68941a3cd9209381486bcc84acddb0c21eab81a943eda05f91185bb2ae80 Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.411618 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4"] Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.416792 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7"] Feb 26 15:59:27 crc 
kubenswrapper[5033]: I0226 15:59:27.432805 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-76fd76856-flt5k"] Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.442668 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.442694 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.443290 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.443338 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f nodeName:}" failed. No retries permitted until 2026-02-26 15:59:28.443323187 +0000 UTC m=+1095.655921827 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "webhook-server-cert" not found Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.443617 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.443641 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f nodeName:}" failed. No retries permitted until 2026-02-26 15:59:28.443634256 +0000 UTC m=+1095.656232906 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "metrics-server-cert" not found Feb 26 15:59:27 crc kubenswrapper[5033]: W0226 15:59:27.445457 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf1cb70_e865_474b_a84d_245c92a4ccf3.slice/crio-1aa7c2282dbb36fc802afb2535cc869472303e2f636b274dbc107e9ea57e4e67 WatchSource:0}: Error finding container 1aa7c2282dbb36fc802afb2535cc869472303e2f636b274dbc107e9ea57e4e67: Status 404 returned error can't find the container with id 1aa7c2282dbb36fc802afb2535cc869472303e2f636b274dbc107e9ea57e4e67 Feb 26 15:59:27 crc kubenswrapper[5033]: W0226 15:59:27.454811 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbae783d2_7df3_44de_a5e9_ccee0c3ae193.slice/crio-15914c5a36bd9738f5dc96955fec49bc4f4c5f1e4d9705bdb2f85c526786a2f0 
WatchSource:0}: Error finding container 15914c5a36bd9738f5dc96955fec49bc4f4c5f1e4d9705bdb2f85c526786a2f0: Status 404 returned error can't find the container with id 15914c5a36bd9738f5dc96955fec49bc4f4c5f1e4d9705bdb2f85c526786a2f0 Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.465729 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76"] Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.479690 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9"] Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.607086 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf"] Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.614377 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8"] Feb 26 15:59:27 crc kubenswrapper[5033]: W0226 15:59:27.614474 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d9ae0bd_78ce_4d9f_8a00_0da4d84fdb94.slice/crio-2f84064346e8af3b6902f28ddd988d4488fdac0e8c08983ec4afda7e20a741bf WatchSource:0}: Error finding container 2f84064346e8af3b6902f28ddd988d4488fdac0e8c08983ec4afda7e20a741bf: Status 404 returned error can't find the container with id 2f84064346e8af3b6902f28ddd988d4488fdac0e8c08983ec4afda7e20a741bf Feb 26 15:59:27 crc kubenswrapper[5033]: W0226 15:59:27.616190 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8530a52_aa3c_4911_8339_af07cfeb2e52.slice/crio-4de5f1a259d78e6234c6594c9efc85dadc0af0905dac78f5743737cb3152bbf6 WatchSource:0}: Error finding container 4de5f1a259d78e6234c6594c9efc85dadc0af0905dac78f5743737cb3152bbf6: Status 404 returned error 
can't find the container with id 4de5f1a259d78e6234c6594c9efc85dadc0af0905dac78f5743737cb3152bbf6 Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.618385 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2"] Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.644827 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: \"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.644941 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.644991 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert podName:acfa5be3-b477-460f-8835-7f86492eed44 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:29.644974946 +0000 UTC m=+1096.857573596 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert") pod "infra-operator-controller-manager-66d6b5f488-rhf9p" (UID: "acfa5be3-b477-460f-8835-7f86492eed44") : secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.761165 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq"] Feb 26 15:59:27 crc kubenswrapper[5033]: W0226 15:59:27.792486 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c031846_df0c_47c8_9fad_a566b6db896d.slice/crio-75946a64addcfad9ff6c225480480d2c9dca2f73362d27287eabc5090481ebc3 WatchSource:0}: Error finding container 75946a64addcfad9ff6c225480480d2c9dca2f73362d27287eabc5090481ebc3: Status 404 returned error can't find the container with id 75946a64addcfad9ff6c225480480d2c9dca2f73362d27287eabc5090481ebc3 Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.795417 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fbxsq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-56dc67d744-mwxv6_openstack-operators(9c031846-df0c-47c8-9fad-a566b6db896d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.799377 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62"] Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.799590 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" podUID="9c031846-df0c-47c8-9fad-a566b6db896d" Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.807344 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9"] Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.813225 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:4d3b6d259005ea30eee9c134d5fdf3d67eaacad8568ed105a34674e510086816,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2s4pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-85c99d655-chlrt_openstack-operators(f0fe9b18-a0d3-47ec-ab43-73412831eac1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.813243 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fcb5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57bd55f9b7-24q9k_openstack-operators(c61b96c6-4a34-407e-9c0b-a76783ffb3ed): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.814337 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n7drp"] Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.814354 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" podUID="f0fe9b18-a0d3-47ec-ab43-73412831eac1" Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.814430 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" podUID="c61b96c6-4a34-407e-9c0b-a76783ffb3ed" Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.816172 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:f33fc1f2e53ff4baa4e16b41f37aaa7273dcea0ef4b5a3949411e2f105c73e5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-clw5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-76bcb69745-2xc62_openstack-operators(67342993-d4b9-4e5a-9777-8dc7a6ee4c0d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.817588 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" podUID="67342993-d4b9-4e5a-9777-8dc7a6ee4c0d" Feb 26 15:59:27 crc kubenswrapper[5033]: W0226 15:59:27.819807 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66a23e95_112a_43d6_8e64_d4073f1b6489.slice/crio-8aefa7878bd0ed01ca88085a58fc1f0b98d5b92c599499db26dd25e187264bf7 WatchSource:0}: Error finding container 
8aefa7878bd0ed01ca88085a58fc1f0b98d5b92c599499db26dd25e187264bf7: Status 404 returned error can't find the container with id 8aefa7878bd0ed01ca88085a58fc1f0b98d5b92c599499db26dd25e187264bf7 Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.821403 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6"] Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.823620 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4n4gb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-54967dbbdf-rlpf8_openstack-operators(ad93658a-8b9d-4518-a840-03622dd46767): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.824722 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" podUID="ad93658a-8b9d-4518-a840-03622dd46767" Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.825196 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ba1d61d3e7f9410541fa1d04110d3859b22fe7de9f2a570e932be8d0e312d5fe,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lt47w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-77b8b67585-q6sd9_openstack-operators(66a23e95-112a-43d6-8e64-d4073f1b6489): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.825443 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g7n9r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-s7gzx_openstack-operators(63abbf2b-ad9d-458c-9a6e-b31988780065): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.826667 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k"] Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.826746 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" podUID="63abbf2b-ad9d-458c-9a6e-b31988780065" Feb 26 15:59:27 crc kubenswrapper[5033]: E0226 15:59:27.826801 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" podUID="66a23e95-112a-43d6-8e64-d4073f1b6489" Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.831226 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt"] Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.836533 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8"] Feb 26 15:59:27 crc kubenswrapper[5033]: I0226 15:59:27.843832 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx"] Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.067464 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf" event={"ID":"1d9ae0bd-78ce-4d9f-8a00-0da4d84fdb94","Type":"ContainerStarted","Data":"2f84064346e8af3b6902f28ddd988d4488fdac0e8c08983ec4afda7e20a741bf"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.069924 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" event={"ID":"66a23e95-112a-43d6-8e64-d4073f1b6489","Type":"ContainerStarted","Data":"8aefa7878bd0ed01ca88085a58fc1f0b98d5b92c599499db26dd25e187264bf7"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.079371 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7" 
event={"ID":"965ea850-eadc-4a58-ba58-65acacf603e8","Type":"ContainerStarted","Data":"b989c857d88b6daf4e26e3f3a9b98735cb2b7cffc42ebf857547a38fc8a69098"} Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.079377 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ba1d61d3e7f9410541fa1d04110d3859b22fe7de9f2a570e932be8d0e312d5fe\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" podUID="66a23e95-112a-43d6-8e64-d4073f1b6489" Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.091498 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2" event={"ID":"d8530a52-aa3c-4911-8339-af07cfeb2e52","Type":"ContainerStarted","Data":"4de5f1a259d78e6234c6594c9efc85dadc0af0905dac78f5743737cb3152bbf6"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.095283 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" event={"ID":"63abbf2b-ad9d-458c-9a6e-b31988780065","Type":"ContainerStarted","Data":"ae1a9df380ca3e30996f77b1c80aa7b731d971dfd1e166b20e49c3c3674b54f6"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.098237 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8" event={"ID":"264eff6e-56a5-4f67-9b00-17c3cd1ab65d","Type":"ContainerStarted","Data":"3265e1062f148113c29547cb64350062273618a1ee5b2d567b625242d02f4360"} Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.098220 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" podUID="63abbf2b-ad9d-458c-9a6e-b31988780065" Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.100761 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n7drp" event={"ID":"67c3e0a3-0b3c-446e-bd46-07b501b58fa8","Type":"ContainerStarted","Data":"2523d02543545a41d5285c7d41e155155f071144692bc7063fea4e3d38a703dd"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.102751 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" event={"ID":"c61b96c6-4a34-407e-9c0b-a76783ffb3ed","Type":"ContainerStarted","Data":"bebaecb3dbc1776f58443856db3f87d2a5492e8e7c311789e5181dde447475b8"} Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.106059 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" podUID="c61b96c6-4a34-407e-9c0b-a76783ffb3ed" Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.107992 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq" event={"ID":"157b2409-f9a3-4e35-b087-4b54c945bb14","Type":"ContainerStarted","Data":"355c420c972ced1b3c6c46c47222d17397c089b76280bcbd6b3467cefd97f7fc"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.113376 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" 
event={"ID":"9c031846-df0c-47c8-9fad-a566b6db896d","Type":"ContainerStarted","Data":"75946a64addcfad9ff6c225480480d2c9dca2f73362d27287eabc5090481ebc3"} Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.114974 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" podUID="9c031846-df0c-47c8-9fad-a566b6db896d" Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.115799 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" event={"ID":"ad93658a-8b9d-4518-a840-03622dd46767","Type":"ContainerStarted","Data":"bc3572c1c24fc9cce53a8ba4078248a1ee66dcb04755402d19314278de5eb878"} Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.116905 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" podUID="ad93658a-8b9d-4518-a840-03622dd46767" Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.117468 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4" event={"ID":"968ae160-ba03-4539-849f-6aca2d464181","Type":"ContainerStarted","Data":"3c4d77cc6c5b365337dd882126d97442b1d5825bae6899e73273413a2af3f167"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.118836 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" 
event={"ID":"f0fe9b18-a0d3-47ec-ab43-73412831eac1","Type":"ContainerStarted","Data":"6b3ff9d2cca8447645e42b61dde9159d3d908566bfcd7a8f87bc92adec27c9de"} Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.121331 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:4d3b6d259005ea30eee9c134d5fdf3d67eaacad8568ed105a34674e510086816\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" podUID="f0fe9b18-a0d3-47ec-ab43-73412831eac1" Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.128155 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-76fd76856-flt5k" event={"ID":"4cf1cb70-e865-474b-a84d-245c92a4ccf3","Type":"ContainerStarted","Data":"1aa7c2282dbb36fc802afb2535cc869472303e2f636b274dbc107e9ea57e4e67"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.131179 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" event={"ID":"67342993-d4b9-4e5a-9777-8dc7a6ee4c0d","Type":"ContainerStarted","Data":"72e7441ed56a9102b93946901e3673b1005e3a86583b009d83fe820008dd0539"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.132122 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76" event={"ID":"bae783d2-7df3-44de-a5e9-ccee0c3ae193","Type":"ContainerStarted","Data":"15914c5a36bd9738f5dc96955fec49bc4f4c5f1e4d9705bdb2f85c526786a2f0"} Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.132574 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:f33fc1f2e53ff4baa4e16b41f37aaa7273dcea0ef4b5a3949411e2f105c73e5e\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" podUID="67342993-d4b9-4e5a-9777-8dc7a6ee4c0d" Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.133345 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj" event={"ID":"8611f841-ae12-4c99-95cf-06d75c839484","Type":"ContainerStarted","Data":"1937e3ecd0bfe687e21f02945e5e0f6715084787d5671384ff80e61db3e93e58"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.140810 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7" event={"ID":"aeec1478-5706-42da-9d49-ee5bd5b24cbc","Type":"ContainerStarted","Data":"05ff68941a3cd9209381486bcc84acddb0c21eab81a943eda05f91185bb2ae80"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.143176 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9" event={"ID":"49b7ffc2-b28c-451d-9fff-d7af6078a893","Type":"ContainerStarted","Data":"59ee999c9c2b013b4ab9aa549a49524521d6bd9e5738ed0dcfce7ef800012a5c"} Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.155791 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.155960 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 
15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.156004 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert podName:9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:30.15599093 +0000 UTC m=+1097.368589580 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" (UID: "9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.460774 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:28 crc kubenswrapper[5033]: I0226 15:59:28.460862 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.460941 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.461004 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f 
nodeName:}" failed. No retries permitted until 2026-02-26 15:59:30.460988537 +0000 UTC m=+1097.673587187 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "metrics-server-cert" not found Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.461097 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 15:59:28 crc kubenswrapper[5033]: E0226 15:59:28.461158 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f nodeName:}" failed. No retries permitted until 2026-02-26 15:59:30.46112414 +0000 UTC m=+1097.673722790 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "webhook-server-cert" not found Feb 26 15:59:29 crc kubenswrapper[5033]: E0226 15:59:29.154362 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ba1d61d3e7f9410541fa1d04110d3859b22fe7de9f2a570e932be8d0e312d5fe\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" podUID="66a23e95-112a-43d6-8e64-d4073f1b6489" Feb 26 15:59:29 crc kubenswrapper[5033]: E0226 15:59:29.154684 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" podUID="c61b96c6-4a34-407e-9c0b-a76783ffb3ed" Feb 26 15:59:29 crc kubenswrapper[5033]: E0226 15:59:29.154705 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" podUID="ad93658a-8b9d-4518-a840-03622dd46767" Feb 26 15:59:29 crc kubenswrapper[5033]: E0226 15:59:29.154718 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:f33fc1f2e53ff4baa4e16b41f37aaa7273dcea0ef4b5a3949411e2f105c73e5e\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" podUID="67342993-d4b9-4e5a-9777-8dc7a6ee4c0d" Feb 26 15:59:29 crc kubenswrapper[5033]: E0226 15:59:29.154787 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" podUID="9c031846-df0c-47c8-9fad-a566b6db896d" Feb 26 15:59:29 crc kubenswrapper[5033]: E0226 15:59:29.157785 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:4d3b6d259005ea30eee9c134d5fdf3d67eaacad8568ed105a34674e510086816\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" podUID="f0fe9b18-a0d3-47ec-ab43-73412831eac1" Feb 26 15:59:29 crc kubenswrapper[5033]: E0226 15:59:29.157852 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" podUID="63abbf2b-ad9d-458c-9a6e-b31988780065" Feb 26 15:59:29 crc kubenswrapper[5033]: I0226 15:59:29.678866 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: \"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:29 crc kubenswrapper[5033]: E0226 15:59:29.679004 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:29 crc kubenswrapper[5033]: E0226 15:59:29.679084 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert podName:acfa5be3-b477-460f-8835-7f86492eed44 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:33.679067335 +0000 UTC m=+1100.891665985 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert") pod "infra-operator-controller-manager-66d6b5f488-rhf9p" (UID: "acfa5be3-b477-460f-8835-7f86492eed44") : secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:30 crc kubenswrapper[5033]: I0226 15:59:30.189989 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:30 crc kubenswrapper[5033]: E0226 15:59:30.190200 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 15:59:30 crc kubenswrapper[5033]: E0226 15:59:30.190290 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert podName:9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:34.190272793 +0000 UTC m=+1101.402871443 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" (UID: "9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 15:59:30 crc kubenswrapper[5033]: I0226 15:59:30.493919 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:30 crc kubenswrapper[5033]: I0226 15:59:30.493974 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:30 crc kubenswrapper[5033]: E0226 15:59:30.494133 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 15:59:30 crc kubenswrapper[5033]: E0226 15:59:30.494187 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f nodeName:}" failed. No retries permitted until 2026-02-26 15:59:34.494169482 +0000 UTC m=+1101.706768132 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "webhook-server-cert" not found Feb 26 15:59:30 crc kubenswrapper[5033]: E0226 15:59:30.494676 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 15:59:30 crc kubenswrapper[5033]: E0226 15:59:30.494706 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f nodeName:}" failed. No retries permitted until 2026-02-26 15:59:34.494698766 +0000 UTC m=+1101.707297416 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "metrics-server-cert" not found Feb 26 15:59:33 crc kubenswrapper[5033]: I0226 15:59:33.740970 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: \"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:33 crc kubenswrapper[5033]: E0226 15:59:33.741828 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:33 crc kubenswrapper[5033]: E0226 15:59:33.741899 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert 
podName:acfa5be3-b477-460f-8835-7f86492eed44 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:41.741878678 +0000 UTC m=+1108.954477328 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert") pod "infra-operator-controller-manager-66d6b5f488-rhf9p" (UID: "acfa5be3-b477-460f-8835-7f86492eed44") : secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:34 crc kubenswrapper[5033]: I0226 15:59:34.249512 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:34 crc kubenswrapper[5033]: E0226 15:59:34.249720 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 15:59:34 crc kubenswrapper[5033]: E0226 15:59:34.249813 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert podName:9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:42.249778221 +0000 UTC m=+1109.462376861 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" (UID: "9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 15:59:34 crc kubenswrapper[5033]: I0226 15:59:34.552643 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:34 crc kubenswrapper[5033]: I0226 15:59:34.552682 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:34 crc kubenswrapper[5033]: E0226 15:59:34.552803 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 15:59:34 crc kubenswrapper[5033]: E0226 15:59:34.552822 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 15:59:34 crc kubenswrapper[5033]: E0226 15:59:34.552861 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f nodeName:}" failed. No retries permitted until 2026-02-26 15:59:42.552847098 +0000 UTC m=+1109.765445748 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "metrics-server-cert" not found Feb 26 15:59:34 crc kubenswrapper[5033]: E0226 15:59:34.552875 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f nodeName:}" failed. No retries permitted until 2026-02-26 15:59:42.552869619 +0000 UTC m=+1109.765468269 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "webhook-server-cert" not found Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.230445 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76" event={"ID":"bae783d2-7df3-44de-a5e9-ccee0c3ae193","Type":"ContainerStarted","Data":"ddb151e04f9bc295236371ee119d69fbb2c0f3dc09e35e49fbbbbc2ab7a2bed7"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.231949 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.232652 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n7drp" event={"ID":"67c3e0a3-0b3c-446e-bd46-07b501b58fa8","Type":"ContainerStarted","Data":"f3632a4caf981a8cb9e5976c6ff0dd23d0d5cbbe62b67002f483da4a00af932f"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.234335 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7" event={"ID":"965ea850-eadc-4a58-ba58-65acacf603e8","Type":"ContainerStarted","Data":"5ae21f78e6bd9fe0901c6ec35f097f00f77f729d02e0c53cbdec2262af5caf79"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.235289 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.237440 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj" event={"ID":"8611f841-ae12-4c99-95cf-06d75c839484","Type":"ContainerStarted","Data":"c20eaa34cb8cc63b05a643d1f5a2e9e50435b6130f154e3daaf071542ab1d5ba"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.237892 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.238971 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v" event={"ID":"61f234b9-a904-466e-b680-f8d6924c8240","Type":"ContainerStarted","Data":"2103fc87b823d18cb8c01d694323c97e4db64d5d8810b9735b26ceff87ed1390"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.239277 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.240362 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8" event={"ID":"264eff6e-56a5-4f67-9b00-17c3cd1ab65d","Type":"ContainerStarted","Data":"20dbb3a71274670ecb23187870fe9e211dd3ee545b0d0e554da453a632941217"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.240473 5033 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.241308 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9" event={"ID":"49b7ffc2-b28c-451d-9fff-d7af6078a893","Type":"ContainerStarted","Data":"5d731daad334feeb2ec534b9a6d6e955275fe77831735a30654a46b020271b09"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.242071 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-76fd76856-flt5k" event={"ID":"4cf1cb70-e865-474b-a84d-245c92a4ccf3","Type":"ContainerStarted","Data":"3c0b5cae516024b4dfae4b4c5cac952dfcf07bd5646036d8303c32c3cf5d7d79"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.242293 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-76fd76856-flt5k" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.243288 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7" event={"ID":"aeec1478-5706-42da-9d49-ee5bd5b24cbc","Type":"ContainerStarted","Data":"6a488a73fa377d216250b0c1f4677b6f25c187f7853a30c09f6e2ca37a9dc386"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.243544 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.244419 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4" event={"ID":"968ae160-ba03-4539-849f-6aca2d464181","Type":"ContainerStarted","Data":"e2ea80d8e9561865563ece45dd2878dc2ee12d865d88bcdae9a8b922d932c749"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.244472 
5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.247285 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq" event={"ID":"157b2409-f9a3-4e35-b087-4b54c945bb14","Type":"ContainerStarted","Data":"ae87807b99f73761286081dfb9641d2da4d9868c4bb097852fedf42122e846d7"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.247554 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.248460 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf" event={"ID":"1d9ae0bd-78ce-4d9f-8a00-0da4d84fdb94","Type":"ContainerStarted","Data":"5677577828b24f3bbd46de38b3c53b3f53c6702560cfada10cdc1de8bc74efbd"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.248518 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.249403 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2" event={"ID":"d8530a52-aa3c-4911-8339-af07cfeb2e52","Type":"ContainerStarted","Data":"c4c3069d7c236433b007b1da26c24470b7981fde78886832739a32d2c8f6955a"} Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.249587 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.281560 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76" podStartSLOduration=3.347190337 podStartE2EDuration="15.281543275s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.4744065 +0000 UTC m=+1094.687005150" lastFinishedPulling="2026-02-26 15:59:39.408759438 +0000 UTC m=+1106.621358088" observedRunningTime="2026-02-26 15:59:40.270400117 +0000 UTC m=+1107.482998787" watchObservedRunningTime="2026-02-26 15:59:40.281543275 +0000 UTC m=+1107.494141925" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.339777 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7" podStartSLOduration=3.345292941 podStartE2EDuration="15.339760854s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.414711127 +0000 UTC m=+1094.627309777" lastFinishedPulling="2026-02-26 15:59:39.40917904 +0000 UTC m=+1106.621777690" observedRunningTime="2026-02-26 15:59:40.33157815 +0000 UTC m=+1107.544176800" watchObservedRunningTime="2026-02-26 15:59:40.339760854 +0000 UTC m=+1107.552359504" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.501896 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8" podStartSLOduration=3.664262403 podStartE2EDuration="15.501869001s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.614436398 +0000 UTC m=+1094.827035048" lastFinishedPulling="2026-02-26 15:59:39.452042986 +0000 UTC m=+1106.664641646" observedRunningTime="2026-02-26 15:59:40.501636985 +0000 UTC m=+1107.714235635" watchObservedRunningTime="2026-02-26 15:59:40.501869001 +0000 UTC m=+1107.714467651" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.503958 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9" podStartSLOduration=4.068920987 podStartE2EDuration="15.503946252s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.490833797 +0000 UTC m=+1094.703432457" lastFinishedPulling="2026-02-26 15:59:38.925859072 +0000 UTC m=+1106.138457722" observedRunningTime="2026-02-26 15:59:40.396367274 +0000 UTC m=+1107.608965924" watchObservedRunningTime="2026-02-26 15:59:40.503946252 +0000 UTC m=+1107.716544902" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.561890 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4" podStartSLOduration=3.567485987 podStartE2EDuration="15.561874876s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.414371939 +0000 UTC m=+1094.626970589" lastFinishedPulling="2026-02-26 15:59:39.408760828 +0000 UTC m=+1106.621359478" observedRunningTime="2026-02-26 15:59:40.559908206 +0000 UTC m=+1107.772506856" watchObservedRunningTime="2026-02-26 15:59:40.561874876 +0000 UTC m=+1107.774473526" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.605296 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj" podStartSLOduration=3.5908352470000002 podStartE2EDuration="15.605277206s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.405092308 +0000 UTC m=+1094.617690958" lastFinishedPulling="2026-02-26 15:59:39.419534267 +0000 UTC m=+1106.632132917" observedRunningTime="2026-02-26 15:59:40.599639406 +0000 UTC m=+1107.812238056" watchObservedRunningTime="2026-02-26 15:59:40.605277206 +0000 UTC m=+1107.817875856" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.633088 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-76fd76856-flt5k" podStartSLOduration=4.159433915 podStartE2EDuration="15.633069619s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.452232189 +0000 UTC m=+1094.664830839" lastFinishedPulling="2026-02-26 15:59:38.925867893 +0000 UTC m=+1106.138466543" observedRunningTime="2026-02-26 15:59:40.63151576 +0000 UTC m=+1107.844114410" watchObservedRunningTime="2026-02-26 15:59:40.633069619 +0000 UTC m=+1107.845668269" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.665309 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq" podStartSLOduration=3.006548085 podStartE2EDuration="14.665291641s" podCreationTimestamp="2026-02-26 15:59:26 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.794507471 +0000 UTC m=+1095.007106121" lastFinishedPulling="2026-02-26 15:59:39.453251027 +0000 UTC m=+1106.665849677" observedRunningTime="2026-02-26 15:59:40.662197183 +0000 UTC m=+1107.874795833" watchObservedRunningTime="2026-02-26 15:59:40.665291641 +0000 UTC m=+1107.877890291" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.692574 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7" podStartSLOduration=3.6886132910000002 podStartE2EDuration="15.69255176s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.416837069 +0000 UTC m=+1094.629435719" lastFinishedPulling="2026-02-26 15:59:39.420775538 +0000 UTC m=+1106.633374188" observedRunningTime="2026-02-26 15:59:40.686658323 +0000 UTC m=+1107.899256973" watchObservedRunningTime="2026-02-26 15:59:40.69255176 +0000 UTC m=+1107.905150410" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.789323 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-n7drp" podStartSLOduration=3.11918079 podStartE2EDuration="14.789300649s" podCreationTimestamp="2026-02-26 15:59:26 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.808014387 +0000 UTC m=+1095.020613037" lastFinishedPulling="2026-02-26 15:59:39.478134246 +0000 UTC m=+1106.690732896" observedRunningTime="2026-02-26 15:59:40.747783796 +0000 UTC m=+1107.960382476" watchObservedRunningTime="2026-02-26 15:59:40.789300649 +0000 UTC m=+1108.001899309" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.793604 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2" podStartSLOduration=3.991717897 podStartE2EDuration="15.793590816s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.618264813 +0000 UTC m=+1094.830863453" lastFinishedPulling="2026-02-26 15:59:39.420137722 +0000 UTC m=+1106.632736372" observedRunningTime="2026-02-26 15:59:40.785802392 +0000 UTC m=+1107.998401052" watchObservedRunningTime="2026-02-26 15:59:40.793590816 +0000 UTC m=+1108.006189466" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.840831 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf" podStartSLOduration=4.038245015 podStartE2EDuration="15.840810402s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.618287133 +0000 UTC m=+1094.830885783" lastFinishedPulling="2026-02-26 15:59:39.42085251 +0000 UTC m=+1106.633451170" observedRunningTime="2026-02-26 15:59:40.838979747 +0000 UTC m=+1108.051578417" watchObservedRunningTime="2026-02-26 15:59:40.840810402 +0000 UTC m=+1108.053409062" Feb 26 15:59:40 crc kubenswrapper[5033]: I0226 15:59:40.846613 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v" podStartSLOduration=4.305573777 podStartE2EDuration="15.846591196s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:26.918933761 +0000 UTC m=+1094.131532411" lastFinishedPulling="2026-02-26 15:59:38.45995114 +0000 UTC m=+1105.672549830" observedRunningTime="2026-02-26 15:59:40.819638525 +0000 UTC m=+1108.032237175" watchObservedRunningTime="2026-02-26 15:59:40.846591196 +0000 UTC m=+1108.059189846" Feb 26 15:59:41 crc kubenswrapper[5033]: I0226 15:59:41.256645 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9" Feb 26 15:59:41 crc kubenswrapper[5033]: I0226 15:59:41.765127 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: \"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:41 crc kubenswrapper[5033]: E0226 15:59:41.765283 5033 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:41 crc kubenswrapper[5033]: E0226 15:59:41.765603 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert podName:acfa5be3-b477-460f-8835-7f86492eed44 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:57.765576922 +0000 UTC m=+1124.978175572 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert") pod "infra-operator-controller-manager-66d6b5f488-rhf9p" (UID: "acfa5be3-b477-460f-8835-7f86492eed44") : secret "infra-operator-webhook-server-cert" not found Feb 26 15:59:42 crc kubenswrapper[5033]: I0226 15:59:42.273030 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:42 crc kubenswrapper[5033]: E0226 15:59:42.273327 5033 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 15:59:42 crc kubenswrapper[5033]: E0226 15:59:42.273414 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert podName:9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:58.273398449 +0000 UTC m=+1125.485997099 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" (UID: "9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 15:59:42 crc kubenswrapper[5033]: I0226 15:59:42.577531 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:42 crc kubenswrapper[5033]: I0226 15:59:42.577586 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:42 crc kubenswrapper[5033]: E0226 15:59:42.577746 5033 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 15:59:42 crc kubenswrapper[5033]: E0226 15:59:42.577821 5033 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 15:59:42 crc kubenswrapper[5033]: E0226 15:59:42.577830 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f nodeName:}" failed. No retries permitted until 2026-02-26 15:59:58.57781024 +0000 UTC m=+1125.790408890 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "metrics-server-cert" not found Feb 26 15:59:42 crc kubenswrapper[5033]: E0226 15:59:42.577934 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs podName:68273151-991f-4e1c-bf48-93b1d8b3973f nodeName:}" failed. No retries permitted until 2026-02-26 15:59:58.577915223 +0000 UTC m=+1125.790513893 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-hhkt9" (UID: "68273151-991f-4e1c-bf48-93b1d8b3973f") : secret "webhook-server-cert" not found Feb 26 15:59:43 crc kubenswrapper[5033]: I0226 15:59:43.268624 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" event={"ID":"66a23e95-112a-43d6-8e64-d4073f1b6489","Type":"ContainerStarted","Data":"120e5525a1bede49bfe84e3d0b620233de0afe83c598d693e612b3d92955bf05"} Feb 26 15:59:43 crc kubenswrapper[5033]: I0226 15:59:43.269363 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" Feb 26 15:59:43 crc kubenswrapper[5033]: I0226 15:59:43.297333 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" podStartSLOduration=2.696897171 podStartE2EDuration="17.297314367s" podCreationTimestamp="2026-02-26 15:59:26 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.825094021 +0000 UTC m=+1095.037692671" lastFinishedPulling="2026-02-26 15:59:42.425511207 +0000 
UTC m=+1109.638109867" observedRunningTime="2026-02-26 15:59:43.289747889 +0000 UTC m=+1110.502346539" watchObservedRunningTime="2026-02-26 15:59:43.297314367 +0000 UTC m=+1110.509913027" Feb 26 15:59:45 crc kubenswrapper[5033]: I0226 15:59:45.284997 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" event={"ID":"f0fe9b18-a0d3-47ec-ab43-73412831eac1","Type":"ContainerStarted","Data":"7d27dbb6de14cc456b78d2d19d2028a9874f2a82044cff486da67840b5d70d68"} Feb 26 15:59:45 crc kubenswrapper[5033]: I0226 15:59:45.285244 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" Feb 26 15:59:45 crc kubenswrapper[5033]: I0226 15:59:45.291157 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" event={"ID":"9c031846-df0c-47c8-9fad-a566b6db896d","Type":"ContainerStarted","Data":"e01ef062a06a9ed87c0c417af4d5ec8e3d5eba3631777378e42213243b702ecf"} Feb 26 15:59:45 crc kubenswrapper[5033]: I0226 15:59:45.291409 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" Feb 26 15:59:45 crc kubenswrapper[5033]: I0226 15:59:45.293877 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" event={"ID":"67342993-d4b9-4e5a-9777-8dc7a6ee4c0d","Type":"ContainerStarted","Data":"1d333a74751ad05da6958ddafc5c5f5a1d029e669447ceb94f34f401006fa0ef"} Feb 26 15:59:45 crc kubenswrapper[5033]: I0226 15:59:45.294228 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" Feb 26 15:59:45 crc kubenswrapper[5033]: I0226 15:59:45.306362 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" podStartSLOduration=2.354547825 podStartE2EDuration="19.306344909s" podCreationTimestamp="2026-02-26 15:59:26 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.813096462 +0000 UTC m=+1095.025695112" lastFinishedPulling="2026-02-26 15:59:44.764893546 +0000 UTC m=+1111.977492196" observedRunningTime="2026-02-26 15:59:45.304205837 +0000 UTC m=+1112.516804507" watchObservedRunningTime="2026-02-26 15:59:45.306344909 +0000 UTC m=+1112.518943549" Feb 26 15:59:45 crc kubenswrapper[5033]: I0226 15:59:45.324510 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" podStartSLOduration=2.375774185 podStartE2EDuration="19.324491591s" podCreationTimestamp="2026-02-26 15:59:26 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.816054757 +0000 UTC m=+1095.028653407" lastFinishedPulling="2026-02-26 15:59:44.764772153 +0000 UTC m=+1111.977370813" observedRunningTime="2026-02-26 15:59:45.320671107 +0000 UTC m=+1112.533269757" watchObservedRunningTime="2026-02-26 15:59:45.324491591 +0000 UTC m=+1112.537090241" Feb 26 15:59:45 crc kubenswrapper[5033]: I0226 15:59:45.337815 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" podStartSLOduration=2.334035587 podStartE2EDuration="19.337800343s" podCreationTimestamp="2026-02-26 15:59:26 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.795305351 +0000 UTC m=+1095.007904001" lastFinishedPulling="2026-02-26 15:59:44.799070107 +0000 UTC m=+1112.011668757" observedRunningTime="2026-02-26 15:59:45.332709386 +0000 UTC m=+1112.545308036" watchObservedRunningTime="2026-02-26 15:59:45.337800343 +0000 UTC m=+1112.550398993" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.113006 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-bnptj" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.132179 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-x2r76" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.185070 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-9595d6797-p7fv4" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.189078 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-2szs7" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.192150 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-nrh9v" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.214231 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5rdb7" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.285730 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-m75x9" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.304401 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-dzbd8" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.368514 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-76fd76856-flt5k" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.462759 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-z22f2" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.643125 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-mdclf" Feb 26 15:59:46 crc kubenswrapper[5033]: I0226 15:59:46.794217 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-28bzq" Feb 26 15:59:56 crc kubenswrapper[5033]: I0226 15:59:56.707454 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-chlrt" Feb 26 15:59:56 crc kubenswrapper[5033]: I0226 15:59:56.836159 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-mwxv6" Feb 26 15:59:56 crc kubenswrapper[5033]: I0226 15:59:56.952601 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-q6sd9" Feb 26 15:59:56 crc kubenswrapper[5033]: I0226 15:59:56.958709 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-2xc62" Feb 26 15:59:57 crc kubenswrapper[5033]: E0226 15:59:57.155726 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89" Feb 26 15:59:57 crc kubenswrapper[5033]: E0226 15:59:57.155900 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fcb5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57bd55f9b7-24q9k_openstack-operators(c61b96c6-4a34-407e-9c0b-a76783ffb3ed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 15:59:57 crc kubenswrapper[5033]: E0226 15:59:57.157320 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" podUID="c61b96c6-4a34-407e-9c0b-a76783ffb3ed" Feb 26 15:59:57 crc kubenswrapper[5033]: E0226 15:59:57.476018 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89" Feb 26 15:59:57 crc kubenswrapper[5033]: E0226 15:59:57.476220 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4n4gb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-54967dbbdf-rlpf8_openstack-operators(ad93658a-8b9d-4518-a840-03622dd46767): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 15:59:57 crc kubenswrapper[5033]: E0226 15:59:57.477401 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" podUID="ad93658a-8b9d-4518-a840-03622dd46767" Feb 26 15:59:57 crc kubenswrapper[5033]: I0226 15:59:57.817636 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: \"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:57 crc kubenswrapper[5033]: I0226 15:59:57.823696 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/acfa5be3-b477-460f-8835-7f86492eed44-cert\") pod \"infra-operator-controller-manager-66d6b5f488-rhf9p\" (UID: \"acfa5be3-b477-460f-8835-7f86492eed44\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:57 crc kubenswrapper[5033]: E0226 15:59:57.944635 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27" Feb 26 15:59:57 crc kubenswrapper[5033]: E0226 15:59:57.945146 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g7n9r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-s7gzx_openstack-operators(63abbf2b-ad9d-458c-9a6e-b31988780065): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 15:59:57 crc kubenswrapper[5033]: E0226 15:59:57.946323 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" podUID="63abbf2b-ad9d-458c-9a6e-b31988780065" Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.030251 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.324345 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.329203 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7\" (UID: \"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.461703 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p"] Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.492067 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.628869 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.628946 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.634061 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.634208 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68273151-991f-4e1c-bf48-93b1d8b3973f-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-hhkt9\" (UID: \"68273151-991f-4e1c-bf48-93b1d8b3973f\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.662775 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:58 crc kubenswrapper[5033]: W0226 15:59:58.767764 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea581b5_9d90_4f86_8d9d_3c7ac0a1ae59.slice/crio-040e4bf8bf68f830d0a7a5e65db0f20c23536329d2a6bc35e5f55a592a2b3bdc WatchSource:0}: Error finding container 040e4bf8bf68f830d0a7a5e65db0f20c23536329d2a6bc35e5f55a592a2b3bdc: Status 404 returned error can't find the container with id 040e4bf8bf68f830d0a7a5e65db0f20c23536329d2a6bc35e5f55a592a2b3bdc Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.769632 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7"] Feb 26 15:59:58 crc kubenswrapper[5033]: I0226 15:59:58.895225 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9"] Feb 26 15:59:58 crc kubenswrapper[5033]: W0226 15:59:58.901562 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68273151_991f_4e1c_bf48_93b1d8b3973f.slice/crio-534f6b4fd841c6a4f4d57ed01ec95ec0bef9d7601bbdbad319dc86867e0e32f1 WatchSource:0}: Error finding container 534f6b4fd841c6a4f4d57ed01ec95ec0bef9d7601bbdbad319dc86867e0e32f1: Status 404 returned error can't find the container with id 534f6b4fd841c6a4f4d57ed01ec95ec0bef9d7601bbdbad319dc86867e0e32f1 Feb 26 15:59:59 crc kubenswrapper[5033]: I0226 15:59:59.392245 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" event={"ID":"acfa5be3-b477-460f-8835-7f86492eed44","Type":"ContainerStarted","Data":"c9b876425b8a6738651b1673393033f19c2f5fa9f227d6d3635e8dd5c34841c9"} Feb 26 15:59:59 crc kubenswrapper[5033]: I0226 
15:59:59.405007 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" event={"ID":"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59","Type":"ContainerStarted","Data":"040e4bf8bf68f830d0a7a5e65db0f20c23536329d2a6bc35e5f55a592a2b3bdc"} Feb 26 15:59:59 crc kubenswrapper[5033]: I0226 15:59:59.406963 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" event={"ID":"68273151-991f-4e1c-bf48-93b1d8b3973f","Type":"ContainerStarted","Data":"59cc1656a9bf4a73acc3237637060b3f1aa660698a2306c090f204e6394a0dbc"} Feb 26 15:59:59 crc kubenswrapper[5033]: I0226 15:59:59.407006 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" event={"ID":"68273151-991f-4e1c-bf48-93b1d8b3973f","Type":"ContainerStarted","Data":"534f6b4fd841c6a4f4d57ed01ec95ec0bef9d7601bbdbad319dc86867e0e32f1"} Feb 26 15:59:59 crc kubenswrapper[5033]: I0226 15:59:59.407936 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" Feb 26 15:59:59 crc kubenswrapper[5033]: I0226 15:59:59.430818 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9" podStartSLOduration=33.430670245 podStartE2EDuration="33.430670245s" podCreationTimestamp="2026-02-26 15:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:59:59.429618349 +0000 UTC m=+1126.642217019" watchObservedRunningTime="2026-02-26 15:59:59.430670245 +0000 UTC m=+1126.643268895" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.130980 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29535360-m5lw2"] Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.131975 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535360-m5lw2" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.134201 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.134643 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.134878 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.140737 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535360-m5lw2"] Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.231380 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p"] Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.232631 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.234635 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.234635 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.236548 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p"] Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.255554 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjxkd\" (UniqueName: \"kubernetes.io/projected/3350667e-a84c-4d48-8839-aecaa3b75a2a-kube-api-access-cjxkd\") pod \"auto-csr-approver-29535360-m5lw2\" (UID: \"3350667e-a84c-4d48-8839-aecaa3b75a2a\") " pod="openshift-infra/auto-csr-approver-29535360-m5lw2" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.357355 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxkd\" (UniqueName: \"kubernetes.io/projected/3350667e-a84c-4d48-8839-aecaa3b75a2a-kube-api-access-cjxkd\") pod \"auto-csr-approver-29535360-m5lw2\" (UID: \"3350667e-a84c-4d48-8839-aecaa3b75a2a\") " pod="openshift-infra/auto-csr-approver-29535360-m5lw2" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.357474 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4cc53151-c374-44bc-8857-c72bd11200ee-config-volume\") pod \"collect-profiles-29535360-zj98p\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 
16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.357519 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4cc53151-c374-44bc-8857-c72bd11200ee-secret-volume\") pod \"collect-profiles-29535360-zj98p\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.357545 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97l66\" (UniqueName: \"kubernetes.io/projected/4cc53151-c374-44bc-8857-c72bd11200ee-kube-api-access-97l66\") pod \"collect-profiles-29535360-zj98p\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.380636 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjxkd\" (UniqueName: \"kubernetes.io/projected/3350667e-a84c-4d48-8839-aecaa3b75a2a-kube-api-access-cjxkd\") pod \"auto-csr-approver-29535360-m5lw2\" (UID: \"3350667e-a84c-4d48-8839-aecaa3b75a2a\") " pod="openshift-infra/auto-csr-approver-29535360-m5lw2" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.457299 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535360-m5lw2" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.458227 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4cc53151-c374-44bc-8857-c72bd11200ee-secret-volume\") pod \"collect-profiles-29535360-zj98p\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.458280 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97l66\" (UniqueName: \"kubernetes.io/projected/4cc53151-c374-44bc-8857-c72bd11200ee-kube-api-access-97l66\") pod \"collect-profiles-29535360-zj98p\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.458349 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4cc53151-c374-44bc-8857-c72bd11200ee-config-volume\") pod \"collect-profiles-29535360-zj98p\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.459325 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4cc53151-c374-44bc-8857-c72bd11200ee-config-volume\") pod \"collect-profiles-29535360-zj98p\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.463815 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4cc53151-c374-44bc-8857-c72bd11200ee-secret-volume\") pod \"collect-profiles-29535360-zj98p\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.474466 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97l66\" (UniqueName: \"kubernetes.io/projected/4cc53151-c374-44bc-8857-c72bd11200ee-kube-api-access-97l66\") pod \"collect-profiles-29535360-zj98p\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 16:00:00 crc kubenswrapper[5033]: I0226 16:00:00.547787 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 16:00:02 crc kubenswrapper[5033]: I0226 16:00:02.386737 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535360-m5lw2"] Feb 26 16:00:02 crc kubenswrapper[5033]: W0226 16:00:02.389512 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3350667e_a84c_4d48_8839_aecaa3b75a2a.slice/crio-eb044e3e8fb2320a7ae56abdacb41e637e3f7d9110081ff09fac0d7a87589533 WatchSource:0}: Error finding container eb044e3e8fb2320a7ae56abdacb41e637e3f7d9110081ff09fac0d7a87589533: Status 404 returned error can't find the container with id eb044e3e8fb2320a7ae56abdacb41e637e3f7d9110081ff09fac0d7a87589533 Feb 26 16:00:02 crc kubenswrapper[5033]: I0226 16:00:02.428492 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" event={"ID":"acfa5be3-b477-460f-8835-7f86492eed44","Type":"ContainerStarted","Data":"ae86faa75dd47c4c2b27fa173d81e5723773568c1319019dfb32590897135b86"} Feb 26 16:00:02 crc kubenswrapper[5033]: I0226 
16:00:02.429514 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" Feb 26 16:00:02 crc kubenswrapper[5033]: I0226 16:00:02.431872 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535360-m5lw2" event={"ID":"3350667e-a84c-4d48-8839-aecaa3b75a2a","Type":"ContainerStarted","Data":"eb044e3e8fb2320a7ae56abdacb41e637e3f7d9110081ff09fac0d7a87589533"} Feb 26 16:00:02 crc kubenswrapper[5033]: I0226 16:00:02.433643 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" event={"ID":"9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59","Type":"ContainerStarted","Data":"7eb0df3cf15800c915cf7b6b3529a3bbb12f96b4eddb44e7a8b4f5c379bdc2b3"} Feb 26 16:00:02 crc kubenswrapper[5033]: I0226 16:00:02.434187 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" Feb 26 16:00:02 crc kubenswrapper[5033]: I0226 16:00:02.483353 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p" podStartSLOduration=33.984653938 podStartE2EDuration="37.483328987s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:58.463796747 +0000 UTC m=+1125.676395407" lastFinishedPulling="2026-02-26 16:00:01.962471806 +0000 UTC m=+1129.175070456" observedRunningTime="2026-02-26 16:00:02.448809888 +0000 UTC m=+1129.661408538" watchObservedRunningTime="2026-02-26 16:00:02.483328987 +0000 UTC m=+1129.695927637" Feb 26 16:00:02 crc kubenswrapper[5033]: I0226 16:00:02.487032 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7" podStartSLOduration=33.275993854 
podStartE2EDuration="36.48701363s" podCreationTimestamp="2026-02-26 15:59:26 +0000 UTC" firstStartedPulling="2026-02-26 15:59:58.769613553 +0000 UTC m=+1125.982212213" lastFinishedPulling="2026-02-26 16:00:01.980633329 +0000 UTC m=+1129.193231989" observedRunningTime="2026-02-26 16:00:02.471823241 +0000 UTC m=+1129.684421891" watchObservedRunningTime="2026-02-26 16:00:02.48701363 +0000 UTC m=+1129.699612280" Feb 26 16:00:02 crc kubenswrapper[5033]: I0226 16:00:02.501680 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p"] Feb 26 16:00:03 crc kubenswrapper[5033]: I0226 16:00:03.443831 5033 generic.go:334] "Generic (PLEG): container finished" podID="4cc53151-c374-44bc-8857-c72bd11200ee" containerID="484e05841cbcd7671734fcc38f847bb472a86ed47f838f6feddd98f39ca3d553" exitCode=0 Feb 26 16:00:03 crc kubenswrapper[5033]: I0226 16:00:03.444011 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" event={"ID":"4cc53151-c374-44bc-8857-c72bd11200ee","Type":"ContainerDied","Data":"484e05841cbcd7671734fcc38f847bb472a86ed47f838f6feddd98f39ca3d553"} Feb 26 16:00:03 crc kubenswrapper[5033]: I0226 16:00:03.444404 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" event={"ID":"4cc53151-c374-44bc-8857-c72bd11200ee","Type":"ContainerStarted","Data":"6c22ab8034dcac9d4f57813ac1f4976a2fd46097f4342441fd9bb7dde71bf48b"} Feb 26 16:00:04 crc kubenswrapper[5033]: I0226 16:00:04.782190 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" Feb 26 16:00:04 crc kubenswrapper[5033]: I0226 16:00:04.922984 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4cc53151-c374-44bc-8857-c72bd11200ee-config-volume\") pod \"4cc53151-c374-44bc-8857-c72bd11200ee\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " Feb 26 16:00:04 crc kubenswrapper[5033]: I0226 16:00:04.923090 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97l66\" (UniqueName: \"kubernetes.io/projected/4cc53151-c374-44bc-8857-c72bd11200ee-kube-api-access-97l66\") pod \"4cc53151-c374-44bc-8857-c72bd11200ee\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " Feb 26 16:00:04 crc kubenswrapper[5033]: I0226 16:00:04.923231 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4cc53151-c374-44bc-8857-c72bd11200ee-secret-volume\") pod \"4cc53151-c374-44bc-8857-c72bd11200ee\" (UID: \"4cc53151-c374-44bc-8857-c72bd11200ee\") " Feb 26 16:00:04 crc kubenswrapper[5033]: I0226 16:00:04.923713 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc53151-c374-44bc-8857-c72bd11200ee-config-volume" (OuterVolumeSpecName: "config-volume") pod "4cc53151-c374-44bc-8857-c72bd11200ee" (UID: "4cc53151-c374-44bc-8857-c72bd11200ee"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:00:04 crc kubenswrapper[5033]: I0226 16:00:04.928079 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc53151-c374-44bc-8857-c72bd11200ee-kube-api-access-97l66" (OuterVolumeSpecName: "kube-api-access-97l66") pod "4cc53151-c374-44bc-8857-c72bd11200ee" (UID: "4cc53151-c374-44bc-8857-c72bd11200ee"). 
InnerVolumeSpecName "kube-api-access-97l66". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:00:04 crc kubenswrapper[5033]: I0226 16:00:04.928184 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cc53151-c374-44bc-8857-c72bd11200ee-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4cc53151-c374-44bc-8857-c72bd11200ee" (UID: "4cc53151-c374-44bc-8857-c72bd11200ee"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:00:05 crc kubenswrapper[5033]: I0226 16:00:05.024606 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4cc53151-c374-44bc-8857-c72bd11200ee-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 16:00:05 crc kubenswrapper[5033]: I0226 16:00:05.024680 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4cc53151-c374-44bc-8857-c72bd11200ee-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 16:00:05 crc kubenswrapper[5033]: I0226 16:00:05.024695 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97l66\" (UniqueName: \"kubernetes.io/projected/4cc53151-c374-44bc-8857-c72bd11200ee-kube-api-access-97l66\") on node \"crc\" DevicePath \"\"" Feb 26 16:00:05 crc kubenswrapper[5033]: I0226 16:00:05.462814 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p" event={"ID":"4cc53151-c374-44bc-8857-c72bd11200ee","Type":"ContainerDied","Data":"6c22ab8034dcac9d4f57813ac1f4976a2fd46097f4342441fd9bb7dde71bf48b"} Feb 26 16:00:05 crc kubenswrapper[5033]: I0226 16:00:05.462863 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c22ab8034dcac9d4f57813ac1f4976a2fd46097f4342441fd9bb7dde71bf48b" Feb 26 16:00:05 crc kubenswrapper[5033]: I0226 16:00:05.462882 5033 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p"
Feb 26 16:00:08 crc kubenswrapper[5033]: I0226 16:00:08.048950 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-rhf9p"
Feb 26 16:00:08 crc kubenswrapper[5033]: I0226 16:00:08.502084 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7"
Feb 26 16:00:08 crc kubenswrapper[5033]: I0226 16:00:08.669931 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-hhkt9"
Feb 26 16:00:09 crc kubenswrapper[5033]: E0226 16:00:09.630775 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" podUID="ad93658a-8b9d-4518-a840-03622dd46767"
Feb 26 16:00:10 crc kubenswrapper[5033]: E0226 16:00:10.630042 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" podUID="c61b96c6-4a34-407e-9c0b-a76783ffb3ed"
Feb 26 16:00:11 crc kubenswrapper[5033]: E0226 16:00:11.628934 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" podUID="63abbf2b-ad9d-458c-9a6e-b31988780065"
Feb 26 16:00:20 crc kubenswrapper[5033]: I0226 16:00:20.189669 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:00:20 crc kubenswrapper[5033]: I0226 16:00:20.190180 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:00:22 crc kubenswrapper[5033]: I0226 16:00:22.630940 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 16:00:23 crc kubenswrapper[5033]: I0226 16:00:23.613134 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" event={"ID":"63abbf2b-ad9d-458c-9a6e-b31988780065","Type":"ContainerStarted","Data":"c1b4850086878899ff44ba0f1e6b3a29c02b9e9f477755a30da717e13d231cfd"}
Feb 26 16:00:23 crc kubenswrapper[5033]: I0226 16:00:23.614193 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx"
Feb 26 16:00:23 crc kubenswrapper[5033]: I0226 16:00:23.614609 5033 generic.go:334] "Generic (PLEG): container finished" podID="3350667e-a84c-4d48-8839-aecaa3b75a2a" containerID="45897f3d33a2eee6754b7aa0b054d288b06ca6f2b6ac01bafc31387f6118566d" exitCode=0
Feb 26 16:00:23 crc kubenswrapper[5033]: I0226 16:00:23.614665 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535360-m5lw2" event={"ID":"3350667e-a84c-4d48-8839-aecaa3b75a2a","Type":"ContainerDied","Data":"45897f3d33a2eee6754b7aa0b054d288b06ca6f2b6ac01bafc31387f6118566d"}
Feb 26 16:00:23 crc kubenswrapper[5033]: I0226 16:00:23.642957 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx" podStartSLOduration=2.365072439 podStartE2EDuration="57.64293269s" podCreationTimestamp="2026-02-26 15:59:26 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.825242275 +0000 UTC m=+1095.037840915" lastFinishedPulling="2026-02-26 16:00:23.103102516 +0000 UTC m=+1150.315701166" observedRunningTime="2026-02-26 16:00:23.637910325 +0000 UTC m=+1150.850508995" watchObservedRunningTime="2026-02-26 16:00:23.64293269 +0000 UTC m=+1150.855531360"
Feb 26 16:00:24 crc kubenswrapper[5033]: I0226 16:00:24.635376 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" event={"ID":"ad93658a-8b9d-4518-a840-03622dd46767","Type":"ContainerStarted","Data":"2832a3cb898d3e5776bc459603507184de33282e9a51f5b4ed6e86aa4d8de397"}
Feb 26 16:00:24 crc kubenswrapper[5033]: I0226 16:00:24.636312 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8"
Feb 26 16:00:24 crc kubenswrapper[5033]: I0226 16:00:24.681213 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8" podStartSLOduration=3.249241643 podStartE2EDuration="59.681188546s" podCreationTimestamp="2026-02-26 15:59:25 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.823496091 +0000 UTC m=+1095.036094751" lastFinishedPulling="2026-02-26 16:00:24.255443004 +0000 UTC m=+1151.468041654" observedRunningTime="2026-02-26 16:00:24.680158921 +0000 UTC m=+1151.892757611" watchObservedRunningTime="2026-02-26 16:00:24.681188546 +0000 UTC m=+1151.893787206"
Feb 26 16:00:24 crc kubenswrapper[5033]: I0226 16:00:24.906926 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535360-m5lw2"
Feb 26 16:00:25 crc kubenswrapper[5033]: I0226 16:00:25.016562 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjxkd\" (UniqueName: \"kubernetes.io/projected/3350667e-a84c-4d48-8839-aecaa3b75a2a-kube-api-access-cjxkd\") pod \"3350667e-a84c-4d48-8839-aecaa3b75a2a\" (UID: \"3350667e-a84c-4d48-8839-aecaa3b75a2a\") "
Feb 26 16:00:25 crc kubenswrapper[5033]: I0226 16:00:25.025614 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3350667e-a84c-4d48-8839-aecaa3b75a2a-kube-api-access-cjxkd" (OuterVolumeSpecName: "kube-api-access-cjxkd") pod "3350667e-a84c-4d48-8839-aecaa3b75a2a" (UID: "3350667e-a84c-4d48-8839-aecaa3b75a2a"). InnerVolumeSpecName "kube-api-access-cjxkd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:00:25 crc kubenswrapper[5033]: I0226 16:00:25.118423 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjxkd\" (UniqueName: \"kubernetes.io/projected/3350667e-a84c-4d48-8839-aecaa3b75a2a-kube-api-access-cjxkd\") on node \"crc\" DevicePath \"\""
Feb 26 16:00:25 crc kubenswrapper[5033]: I0226 16:00:25.642228 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535360-m5lw2" event={"ID":"3350667e-a84c-4d48-8839-aecaa3b75a2a","Type":"ContainerDied","Data":"eb044e3e8fb2320a7ae56abdacb41e637e3f7d9110081ff09fac0d7a87589533"}
Feb 26 16:00:25 crc kubenswrapper[5033]: I0226 16:00:25.642295 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb044e3e8fb2320a7ae56abdacb41e637e3f7d9110081ff09fac0d7a87589533"
Feb 26 16:00:25 crc kubenswrapper[5033]: I0226 16:00:25.642249 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535360-m5lw2"
Feb 26 16:00:25 crc kubenswrapper[5033]: I0226 16:00:25.978366 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535354-4kqtg"]
Feb 26 16:00:25 crc kubenswrapper[5033]: I0226 16:00:25.984505 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535354-4kqtg"]
Feb 26 16:00:26 crc kubenswrapper[5033]: I0226 16:00:26.651658 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" event={"ID":"c61b96c6-4a34-407e-9c0b-a76783ffb3ed","Type":"ContainerStarted","Data":"283e7814ad2667a2998b7ff0f5df87d61c31221cc0176c28c70315d3d919b9b3"}
Feb 26 16:00:26 crc kubenswrapper[5033]: I0226 16:00:26.652288 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k"
Feb 26 16:00:26 crc kubenswrapper[5033]: I0226 16:00:26.720130 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k" podStartSLOduration=2.2642653790000002 podStartE2EDuration="1m0.720102824s" podCreationTimestamp="2026-02-26 15:59:26 +0000 UTC" firstStartedPulling="2026-02-26 15:59:27.813110913 +0000 UTC m=+1095.025709563" lastFinishedPulling="2026-02-26 16:00:26.268948368 +0000 UTC m=+1153.481547008" observedRunningTime="2026-02-26 16:00:26.713757466 +0000 UTC m=+1153.926356136" watchObservedRunningTime="2026-02-26 16:00:26.720102824 +0000 UTC m=+1153.932701494"
Feb 26 16:00:27 crc kubenswrapper[5033]: I0226 16:00:27.638803 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b93d288-0120-4f53-a82e-10c9e6e1f08b" path="/var/lib/kubelet/pods/9b93d288-0120-4f53-a82e-10c9e6e1f08b/volumes"
Feb 26 16:00:36 crc kubenswrapper[5033]: I0226 16:00:36.590033 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-rlpf8"
Feb 26 16:00:36 crc kubenswrapper[5033]: I0226 16:00:36.722045 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-24q9k"
Feb 26 16:00:36 crc kubenswrapper[5033]: I0226 16:00:36.880350 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s7gzx"
Feb 26 16:00:50 crc kubenswrapper[5033]: I0226 16:00:50.189184 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:00:50 crc kubenswrapper[5033]: I0226 16:00:50.189956 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:00:51 crc kubenswrapper[5033]: I0226 16:00:51.985342 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5fwv"]
Feb 26 16:00:51 crc kubenswrapper[5033]: E0226 16:00:51.985982 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3350667e-a84c-4d48-8839-aecaa3b75a2a" containerName="oc"
Feb 26 16:00:51 crc kubenswrapper[5033]: I0226 16:00:51.985996 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3350667e-a84c-4d48-8839-aecaa3b75a2a" containerName="oc"
Feb 26 16:00:51 crc kubenswrapper[5033]: E0226 16:00:51.986013 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cc53151-c374-44bc-8857-c72bd11200ee" containerName="collect-profiles"
Feb 26 16:00:51 crc kubenswrapper[5033]: I0226 16:00:51.986021 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cc53151-c374-44bc-8857-c72bd11200ee" containerName="collect-profiles"
Feb 26 16:00:51 crc kubenswrapper[5033]: I0226 16:00:51.986191 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3350667e-a84c-4d48-8839-aecaa3b75a2a" containerName="oc"
Feb 26 16:00:51 crc kubenswrapper[5033]: I0226 16:00:51.986214 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cc53151-c374-44bc-8857-c72bd11200ee" containerName="collect-profiles"
Feb 26 16:00:51 crc kubenswrapper[5033]: I0226 16:00:51.987058 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv"
Feb 26 16:00:51 crc kubenswrapper[5033]: I0226 16:00:51.988987 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 26 16:00:51 crc kubenswrapper[5033]: I0226 16:00:51.989095 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 26 16:00:51 crc kubenswrapper[5033]: I0226 16:00:51.989152 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 26 16:00:51 crc kubenswrapper[5033]: I0226 16:00:51.989530 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-6gbkl"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.011056 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5fwv"]
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.050461 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sm5sw"]
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.051856 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.054334 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.059449 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sm5sw"]
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.136950 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/860a1b87-8820-43cb-8972-028fbd029e15-config\") pod \"dnsmasq-dns-675f4bcbfc-v5fwv\" (UID: \"860a1b87-8820-43cb-8972-028fbd029e15\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.137299 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ktfs\" (UniqueName: \"kubernetes.io/projected/860a1b87-8820-43cb-8972-028fbd029e15-kube-api-access-5ktfs\") pod \"dnsmasq-dns-675f4bcbfc-v5fwv\" (UID: \"860a1b87-8820-43cb-8972-028fbd029e15\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.238514 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xcr5\" (UniqueName: \"kubernetes.io/projected/a882a181-d43e-414f-9dc8-b0229af9b0ff-kube-api-access-7xcr5\") pod \"dnsmasq-dns-78dd6ddcc-sm5sw\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.238572 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ktfs\" (UniqueName: \"kubernetes.io/projected/860a1b87-8820-43cb-8972-028fbd029e15-kube-api-access-5ktfs\") pod \"dnsmasq-dns-675f4bcbfc-v5fwv\" (UID: \"860a1b87-8820-43cb-8972-028fbd029e15\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.238596 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sm5sw\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.238672 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/860a1b87-8820-43cb-8972-028fbd029e15-config\") pod \"dnsmasq-dns-675f4bcbfc-v5fwv\" (UID: \"860a1b87-8820-43cb-8972-028fbd029e15\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.238707 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-config\") pod \"dnsmasq-dns-78dd6ddcc-sm5sw\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.239648 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/860a1b87-8820-43cb-8972-028fbd029e15-config\") pod \"dnsmasq-dns-675f4bcbfc-v5fwv\" (UID: \"860a1b87-8820-43cb-8972-028fbd029e15\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.260924 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ktfs\" (UniqueName: \"kubernetes.io/projected/860a1b87-8820-43cb-8972-028fbd029e15-kube-api-access-5ktfs\") pod \"dnsmasq-dns-675f4bcbfc-v5fwv\" (UID: \"860a1b87-8820-43cb-8972-028fbd029e15\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.305525 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.339536 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xcr5\" (UniqueName: \"kubernetes.io/projected/a882a181-d43e-414f-9dc8-b0229af9b0ff-kube-api-access-7xcr5\") pod \"dnsmasq-dns-78dd6ddcc-sm5sw\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.340044 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sm5sw\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.340125 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-config\") pod \"dnsmasq-dns-78dd6ddcc-sm5sw\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.340979 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-config\") pod \"dnsmasq-dns-78dd6ddcc-sm5sw\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.341034 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sm5sw\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.358769 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xcr5\" (UniqueName: \"kubernetes.io/projected/a882a181-d43e-414f-9dc8-b0229af9b0ff-kube-api-access-7xcr5\") pod \"dnsmasq-dns-78dd6ddcc-sm5sw\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.422574 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw"
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.760771 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5fwv"]
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.854643 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv" event={"ID":"860a1b87-8820-43cb-8972-028fbd029e15","Type":"ContainerStarted","Data":"c476281579f3a35f558d8f9db0178b4b32cd7dc0729800a3753ff6cde55b1fc9"}
Feb 26 16:00:52 crc kubenswrapper[5033]: I0226 16:00:52.912065 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sm5sw"]
Feb 26 16:00:52 crc kubenswrapper[5033]: W0226 16:00:52.917085 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda882a181_d43e_414f_9dc8_b0229af9b0ff.slice/crio-7128d4d75ffa57800d260261547d44775da908ec12fe46a4bbf3210f156a2c70 WatchSource:0}: Error finding container 7128d4d75ffa57800d260261547d44775da908ec12fe46a4bbf3210f156a2c70: Status 404 returned error can't find the container with id 7128d4d75ffa57800d260261547d44775da908ec12fe46a4bbf3210f156a2c70
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.691304 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5fwv"]
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.738018 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-zt7cr"]
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.739377 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr"
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.750966 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-zt7cr"]
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.773206 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-config\") pod \"dnsmasq-dns-5ccc8479f9-zt7cr\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr"
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.773318 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kkkh\" (UniqueName: \"kubernetes.io/projected/184ba7b9-46c1-44fc-8eef-adaa1990e143-kube-api-access-8kkkh\") pod \"dnsmasq-dns-5ccc8479f9-zt7cr\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr"
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.773358 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-zt7cr\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr"
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.866203 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw" event={"ID":"a882a181-d43e-414f-9dc8-b0229af9b0ff","Type":"ContainerStarted","Data":"7128d4d75ffa57800d260261547d44775da908ec12fe46a4bbf3210f156a2c70"}
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.874034 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kkkh\" (UniqueName: \"kubernetes.io/projected/184ba7b9-46c1-44fc-8eef-adaa1990e143-kube-api-access-8kkkh\") pod \"dnsmasq-dns-5ccc8479f9-zt7cr\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr"
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.874101 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-zt7cr\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr"
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.874139 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-config\") pod \"dnsmasq-dns-5ccc8479f9-zt7cr\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr"
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.875121 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-zt7cr\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr"
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.875202 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-config\") pod \"dnsmasq-dns-5ccc8479f9-zt7cr\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr"
Feb 26 16:00:53 crc kubenswrapper[5033]: I0226 16:00:53.897905 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kkkh\" (UniqueName: \"kubernetes.io/projected/184ba7b9-46c1-44fc-8eef-adaa1990e143-kube-api-access-8kkkh\") pod \"dnsmasq-dns-5ccc8479f9-zt7cr\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.034082 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sm5sw"]
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.056863 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.058333 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4mlnn"]
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.061698 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.084021 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4mlnn"]
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.185191 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-config\") pod \"dnsmasq-dns-57d769cc4f-4mlnn\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.185277 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrrl2\" (UniqueName: \"kubernetes.io/projected/f46ea79d-19f0-4713-98ab-0e4376a9406d-kube-api-access-lrrl2\") pod \"dnsmasq-dns-57d769cc4f-4mlnn\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.185430 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4mlnn\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.301402 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-config\") pod \"dnsmasq-dns-57d769cc4f-4mlnn\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.301451 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrrl2\" (UniqueName: \"kubernetes.io/projected/f46ea79d-19f0-4713-98ab-0e4376a9406d-kube-api-access-lrrl2\") pod \"dnsmasq-dns-57d769cc4f-4mlnn\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.301474 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4mlnn\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.302219 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4mlnn\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.302741 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-config\") pod \"dnsmasq-dns-57d769cc4f-4mlnn\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.321330 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrrl2\" (UniqueName: \"kubernetes.io/projected/f46ea79d-19f0-4713-98ab-0e4376a9406d-kube-api-access-lrrl2\") pod \"dnsmasq-dns-57d769cc4f-4mlnn\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.384076 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.662757 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-zt7cr"]
Feb 26 16:00:54 crc kubenswrapper[5033]: W0226 16:00:54.679608 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod184ba7b9_46c1_44fc_8eef_adaa1990e143.slice/crio-63a6b1a878bb68414f8ce1c8b18a278a73f21b59a3e1198545a31740bdee113c WatchSource:0}: Error finding container 63a6b1a878bb68414f8ce1c8b18a278a73f21b59a3e1198545a31740bdee113c: Status 404 returned error can't find the container with id 63a6b1a878bb68414f8ce1c8b18a278a73f21b59a3e1198545a31740bdee113c
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.692961 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4mlnn"]
Feb 26 16:00:54 crc kubenswrapper[5033]: W0226 16:00:54.709482 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf46ea79d_19f0_4713_98ab_0e4376a9406d.slice/crio-7e9d2fcd5f72c583ff6d8c6ef374479ced7956a7cb89490c3ad442cdc95bba3f WatchSource:0}: Error finding container 7e9d2fcd5f72c583ff6d8c6ef374479ced7956a7cb89490c3ad442cdc95bba3f: Status 404 returned error can't find the container with id 7e9d2fcd5f72c583ff6d8c6ef374479ced7956a7cb89490c3ad442cdc95bba3f
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.873945 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn" event={"ID":"f46ea79d-19f0-4713-98ab-0e4376a9406d","Type":"ContainerStarted","Data":"7e9d2fcd5f72c583ff6d8c6ef374479ced7956a7cb89490c3ad442cdc95bba3f"}
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.875702 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr" event={"ID":"184ba7b9-46c1-44fc-8eef-adaa1990e143","Type":"ContainerStarted","Data":"63a6b1a878bb68414f8ce1c8b18a278a73f21b59a3e1198545a31740bdee113c"}
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.890065 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.893293 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.899778 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.899834 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.899913 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.900001 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.900100 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qnzrx"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.900202 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.900271 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 26 16:00:54 crc kubenswrapper[5033]: I0226 16:00:54.900310 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.010398 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.010467 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94562cbd-1b98-4b30-9a97-f26fb103bb72-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.010519 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.010599 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.010632 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.010727 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p92fh\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-kube-api-access-p92fh\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.011211 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.011308 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94562cbd-1b98-4b30-9a97-f26fb103bb72-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.011388 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.011418 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.011470 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.115250 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.115320 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94562cbd-1b98-4b30-9a97-f26fb103bb72-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.115501 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.115532 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.115609 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\"
(UniqueName: \"kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.115691 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.115740 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94562cbd-1b98-4b30-9a97-f26fb103bb72-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.115773 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.115835 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.115864 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.115906 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p92fh\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-kube-api-access-p92fh\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.116448 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.116923 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.116967 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.117101 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.117955 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.121191 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.127276 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.128800 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94562cbd-1b98-4b30-9a97-f26fb103bb72-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.129070 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94562cbd-1b98-4b30-9a97-f26fb103bb72-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.129130 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.132886 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p92fh\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-kube-api-access-p92fh\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.145934 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.209444 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.214858 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.219006 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.219006 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-52vlc" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.219761 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.220083 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.220118 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.220559 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.220726 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.221507 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.222813 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.321080 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.321167 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.321198 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.321282 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/443e562a-aefe-46f3-a86f-59205f5849c7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.321386 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.321522 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd6sl\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-kube-api-access-nd6sl\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.321545 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/443e562a-aefe-46f3-a86f-59205f5849c7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.321561 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.321593 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.321624 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.321699 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.423719 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.424086 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.424116 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.424155 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.424187 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.424211 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.424241 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/443e562a-aefe-46f3-a86f-59205f5849c7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.424322 
5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.424376 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd6sl\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-kube-api-access-nd6sl\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.424405 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/443e562a-aefe-46f3-a86f-59205f5849c7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.424430 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.425292 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.425493 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.425762 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.425771 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.425776 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.426943 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.436800 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 
16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.436800 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/443e562a-aefe-46f3-a86f-59205f5849c7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.450120 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd6sl\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-kube-api-access-nd6sl\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.450510 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/443e562a-aefe-46f3-a86f-59205f5849c7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.454295 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.474649 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.554343 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 16:00:55 crc kubenswrapper[5033]: I0226 16:00:55.840895 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:00:55 crc kubenswrapper[5033]: W0226 16:00:55.877976 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94562cbd_1b98_4b30_9a97_f26fb103bb72.slice/crio-41047549b664909658ddcc8d2cf54718889d14921b52c80c34e2493e2f012d6e WatchSource:0}: Error finding container 41047549b664909658ddcc8d2cf54718889d14921b52c80c34e2493e2f012d6e: Status 404 returned error can't find the container with id 41047549b664909658ddcc8d2cf54718889d14921b52c80c34e2493e2f012d6e Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.300277 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:00:56 crc kubenswrapper[5033]: W0226 16:00:56.314368 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443e562a_aefe_46f3_a86f_59205f5849c7.slice/crio-f33c995e86b21de7c50df0b29830b963605c0332ac69d65d4e22240e45227b12 WatchSource:0}: Error finding container f33c995e86b21de7c50df0b29830b963605c0332ac69d65d4e22240e45227b12: Status 404 returned error can't find the container with id f33c995e86b21de7c50df0b29830b963605c0332ac69d65d4e22240e45227b12 Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.666897 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.668712 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.678665 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-q7jlj" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.688349 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.696401 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.710694 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.732863 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.744298 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.756181 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.756276 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.756338 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb89j\" 
(UniqueName: \"kubernetes.io/projected/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kube-api-access-vb89j\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.756408 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.758055 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.758080 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.758108 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kolla-config\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.758137 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.859420 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb89j\" (UniqueName: \"kubernetes.io/projected/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kube-api-access-vb89j\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.859825 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.859854 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.859877 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.859912 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.859941 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.859984 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.860039 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.860878 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.860976 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.861228 
5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.861937 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.877381 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kolla-config\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.880370 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb89j\" (UniqueName: \"kubernetes.io/projected/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kube-api-access-vb89j\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.880787 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.882781 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.886623 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " pod="openstack/openstack-galera-0" Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.910249 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"94562cbd-1b98-4b30-9a97-f26fb103bb72","Type":"ContainerStarted","Data":"41047549b664909658ddcc8d2cf54718889d14921b52c80c34e2493e2f012d6e"} Feb 26 16:00:56 crc kubenswrapper[5033]: I0226 16:00:56.922300 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"443e562a-aefe-46f3-a86f-59205f5849c7","Type":"ContainerStarted","Data":"f33c995e86b21de7c50df0b29830b963605c0332ac69d65d4e22240e45227b12"} Feb 26 16:00:57 crc kubenswrapper[5033]: I0226 16:00:57.018009 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 16:00:57 crc kubenswrapper[5033]: I0226 16:00:57.609286 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 16:00:57 crc kubenswrapper[5033]: W0226 16:00:57.731127 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aa45e26_ccbe_4827_8ddd_ebb5308b98ec.slice/crio-0410a91a0562d41a2afd473190889986bea4bcd86f7f1ad44ab5222a3df28c5a WatchSource:0}: Error finding container 0410a91a0562d41a2afd473190889986bea4bcd86f7f1ad44ab5222a3df28c5a: Status 404 returned error can't find the container with id 0410a91a0562d41a2afd473190889986bea4bcd86f7f1ad44ab5222a3df28c5a Feb 26 16:00:57 crc kubenswrapper[5033]: I0226 16:00:57.946711 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec","Type":"ContainerStarted","Data":"0410a91a0562d41a2afd473190889986bea4bcd86f7f1ad44ab5222a3df28c5a"} Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.054346 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.059670 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.068636 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.068835 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.069579 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.070403 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-r8cjj" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.076185 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.197844 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.197932 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29chb\" (UniqueName: \"kubernetes.io/projected/a1a35ad9-a937-4d01-961b-a3488303610e-kube-api-access-29chb\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.197977 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.198016 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.198038 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.198111 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.198151 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.198170 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.299734 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.299804 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29chb\" (UniqueName: \"kubernetes.io/projected/a1a35ad9-a937-4d01-961b-a3488303610e-kube-api-access-29chb\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.299826 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.299860 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.299877 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.299915 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.299940 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.299954 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.301707 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.302451 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"a1a35ad9-a937-4d01-961b-a3488303610e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.302540 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.303909 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.305477 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.306477 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.306881 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " 
pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.311217 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.312168 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.314248 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-lpgzm" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.314951 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.315101 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.341348 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.341703 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29chb\" (UniqueName: \"kubernetes.io/projected/a1a35ad9-a937-4d01-961b-a3488303610e-kube-api-access-29chb\") pod \"openstack-cell1-galera-0\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.367935 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.403502 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.403855 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-combined-ca-bundle\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.403904 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9cz8\" (UniqueName: \"kubernetes.io/projected/be41eafa-8738-4d4a-9f9e-592c105bc766-kube-api-access-k9cz8\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.403958 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-memcached-tls-certs\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.403989 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-kolla-config\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.404005 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-config-data\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 
16:00:58.505911 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9cz8\" (UniqueName: \"kubernetes.io/projected/be41eafa-8738-4d4a-9f9e-592c105bc766-kube-api-access-k9cz8\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.506949 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-memcached-tls-certs\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.507130 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-kolla-config\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.507242 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-config-data\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.507481 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-combined-ca-bundle\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.509059 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-kolla-config\") pod 
\"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.510324 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-config-data\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.512499 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-memcached-tls-certs\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.514365 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-combined-ca-bundle\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.539817 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9cz8\" (UniqueName: \"kubernetes.io/projected/be41eafa-8738-4d4a-9f9e-592c105bc766-kube-api-access-k9cz8\") pod \"memcached-0\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " pod="openstack/memcached-0" Feb 26 16:00:58 crc kubenswrapper[5033]: I0226 16:00:58.724581 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 26 16:00:59 crc kubenswrapper[5033]: I0226 16:00:59.060503 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 16:00:59 crc kubenswrapper[5033]: I0226 16:00:59.493999 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 26 16:00:59 crc kubenswrapper[5033]: I0226 16:00:59.976338 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a1a35ad9-a937-4d01-961b-a3488303610e","Type":"ContainerStarted","Data":"dcc888e2710bee66d73e2bd3cf8c8e91464ec13f8c0de19b6adfd024271004aa"} Feb 26 16:00:59 crc kubenswrapper[5033]: I0226 16:00:59.986440 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"be41eafa-8738-4d4a-9f9e-592c105bc766","Type":"ContainerStarted","Data":"1a1e6ee458bfccb1e1b96789762f6d308e1b639646ef6f499e8d7bf9205eaf36"} Feb 26 16:01:00 crc kubenswrapper[5033]: I0226 16:01:00.703172 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:01:00 crc kubenswrapper[5033]: I0226 16:01:00.704384 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 16:01:00 crc kubenswrapper[5033]: I0226 16:01:00.707303 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-j46n4" Feb 26 16:01:00 crc kubenswrapper[5033]: I0226 16:01:00.711059 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:01:00 crc kubenswrapper[5033]: I0226 16:01:00.758344 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcht2\" (UniqueName: \"kubernetes.io/projected/56172023-8cfb-4589-809f-d83c900f90b0-kube-api-access-hcht2\") pod \"kube-state-metrics-0\" (UID: \"56172023-8cfb-4589-809f-d83c900f90b0\") " pod="openstack/kube-state-metrics-0" Feb 26 16:01:00 crc kubenswrapper[5033]: I0226 16:01:00.860134 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcht2\" (UniqueName: \"kubernetes.io/projected/56172023-8cfb-4589-809f-d83c900f90b0-kube-api-access-hcht2\") pod \"kube-state-metrics-0\" (UID: \"56172023-8cfb-4589-809f-d83c900f90b0\") " pod="openstack/kube-state-metrics-0" Feb 26 16:01:00 crc kubenswrapper[5033]: I0226 16:01:00.884233 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcht2\" (UniqueName: \"kubernetes.io/projected/56172023-8cfb-4589-809f-d83c900f90b0-kube-api-access-hcht2\") pod \"kube-state-metrics-0\" (UID: \"56172023-8cfb-4589-809f-d83c900f90b0\") " pod="openstack/kube-state-metrics-0" Feb 26 16:01:01 crc kubenswrapper[5033]: I0226 16:01:01.025723 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 16:01:03 crc kubenswrapper[5033]: I0226 16:01:03.484763 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.844293 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fxrwp"] Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.845549 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.858028 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-xhmv6" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.858455 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.858639 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.863853 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-84k4b"] Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.865245 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.892116 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fxrwp"] Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.911985 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-84k4b"] Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.940881 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run-ovn\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.940922 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-log\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.940945 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-etc-ovs\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.941001 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-lib\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.941023 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-combined-ca-bundle\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.941039 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn9n9\" (UniqueName: \"kubernetes.io/projected/44171111-32bf-415d-97c9-f966209cd0a0-kube-api-access-kn9n9\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.941075 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44171111-32bf-415d-97c9-f966209cd0a0-scripts\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.941095 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxflh\" (UniqueName: \"kubernetes.io/projected/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-kube-api-access-cxflh\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.941117 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-log-ovn\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.941138 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-run\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.941159 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-ovn-controller-tls-certs\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.941188 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-scripts\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:04 crc kubenswrapper[5033]: I0226 16:01:04.941206 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042450 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44171111-32bf-415d-97c9-f966209cd0a0-scripts\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042492 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cxflh\" (UniqueName: \"kubernetes.io/projected/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-kube-api-access-cxflh\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042525 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-log-ovn\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042541 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-run\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042560 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-ovn-controller-tls-certs\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042595 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-scripts\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042612 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run\") pod 
\"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042629 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run-ovn\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042646 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-log\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042666 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-etc-ovs\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042719 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-lib\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.042740 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-combined-ca-bundle\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: 
I0226 16:01:05.042756 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn9n9\" (UniqueName: \"kubernetes.io/projected/44171111-32bf-415d-97c9-f966209cd0a0-kube-api-access-kn9n9\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.043133 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-log-ovn\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.043208 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-run\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.043471 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-log\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.043483 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.043590 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run-ovn\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.043628 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-lib\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.043794 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-etc-ovs\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.045764 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-scripts\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.046379 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44171111-32bf-415d-97c9-f966209cd0a0-scripts\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.062222 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-ovn-controller-tls-certs\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " 
pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.062362 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-combined-ca-bundle\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.064514 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxflh\" (UniqueName: \"kubernetes.io/projected/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-kube-api-access-cxflh\") pod \"ovn-controller-ovs-84k4b\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.065537 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn9n9\" (UniqueName: \"kubernetes.io/projected/44171111-32bf-415d-97c9-f966209cd0a0-kube-api-access-kn9n9\") pod \"ovn-controller-fxrwp\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") " pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.189859 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:05 crc kubenswrapper[5033]: I0226 16:01:05.205050 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.594245 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.595645 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.597919 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.598398 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.598552 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-95jrl" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.598812 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.598892 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.611509 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.670150 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw46j\" (UniqueName: \"kubernetes.io/projected/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-kube-api-access-mw46j\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.671078 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.671346 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.671519 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.671735 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.671933 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.672100 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.672266 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.773425 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.773807 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.773950 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw46j\" (UniqueName: \"kubernetes.io/projected/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-kube-api-access-mw46j\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.774054 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.774191 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc 
kubenswrapper[5033]: I0226 16:01:06.774278 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.774426 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.774539 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.774620 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.774673 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.775135 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.775795 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.778778 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.778821 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.780535 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.796741 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw46j\" (UniqueName: \"kubernetes.io/projected/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-kube-api-access-mw46j\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " 
pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.802489 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:06 crc kubenswrapper[5033]: I0226 16:01:06.917160 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.737652 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.739389 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.741916 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-stxff" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.742117 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.749147 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.749420 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.758467 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.890497 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-combined-ca-bundle\") pod 
\"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.890540 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.890564 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.890634 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.890695 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.890724 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " 
pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.890766 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zczdr\" (UniqueName: \"kubernetes.io/projected/28081635-21dd-4dbf-ae6a-354bbad0dbdd-kube-api-access-zczdr\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.890796 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.992149 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.992533 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.992625 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.992714 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.992850 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.992977 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.993071 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.993174 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zczdr\" (UniqueName: \"kubernetes.io/projected/28081635-21dd-4dbf-ae6a-354bbad0dbdd-kube-api-access-zczdr\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.992983 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.993217 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.993016 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.993991 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.996420 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:07 crc kubenswrapper[5033]: I0226 16:01:07.996971 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:08 crc kubenswrapper[5033]: I0226 16:01:08.005011 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:08 crc kubenswrapper[5033]: I0226 16:01:08.010607 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zczdr\" (UniqueName: \"kubernetes.io/projected/28081635-21dd-4dbf-ae6a-354bbad0dbdd-kube-api-access-zczdr\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:08 crc kubenswrapper[5033]: I0226 16:01:08.016030 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:08 crc kubenswrapper[5033]: I0226 16:01:08.058587 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:10 crc kubenswrapper[5033]: I0226 16:01:10.086214 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56172023-8cfb-4589-809f-d83c900f90b0","Type":"ContainerStarted","Data":"92bdb30a43677f35f5aa40f7babb0dbbd052ce12c4a91c2bea8dd211c2ac8697"} Feb 26 16:01:13 crc kubenswrapper[5033]: E0226 16:01:13.440496 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 26 16:01:13 crc kubenswrapper[5033]: E0226 16:01:13.441394 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nd6sl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(443e562a-aefe-46f3-a86f-59205f5849c7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:01:13 crc 
kubenswrapper[5033]: E0226 16:01:13.442595 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="443e562a-aefe-46f3-a86f-59205f5849c7" Feb 26 16:01:14 crc kubenswrapper[5033]: E0226 16:01:14.133341 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="443e562a-aefe-46f3-a86f-59205f5849c7" Feb 26 16:01:16 crc kubenswrapper[5033]: I0226 16:01:16.568651 5033 scope.go:117] "RemoveContainer" containerID="5dd4f0e58aa6ca572dc3458182f11b5edb14cace47435266a20ac3c7fb777499" Feb 26 16:01:16 crc kubenswrapper[5033]: E0226 16:01:16.630305 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Feb 26 16:01:16 crc kubenswrapper[5033]: E0226 16:01:16.630490 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n5cchc5hfch55bhfdhc8h74h648h66fh599h5f8h85h5bdh58fh59h5b9h578h658h57fh59h679h68fh79h68dh57h5bfh5ffh7fh6bh588h595h7q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9cz8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(be41eafa-8738-4d4a-9f9e-592c105bc766): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:01:16 crc kubenswrapper[5033]: E0226 16:01:16.632236 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="be41eafa-8738-4d4a-9f9e-592c105bc766" Feb 26 16:01:17 crc kubenswrapper[5033]: E0226 16:01:17.173908 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="be41eafa-8738-4d4a-9f9e-592c105bc766" Feb 26 16:01:20 crc kubenswrapper[5033]: I0226 16:01:20.189108 5033 patch_prober.go:28] interesting 
pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:01:20 crc kubenswrapper[5033]: I0226 16:01:20.189387 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:01:20 crc kubenswrapper[5033]: I0226 16:01:20.189422 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 16:01:20 crc kubenswrapper[5033]: I0226 16:01:20.190137 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1db84da97ee76a847298329e56a91cd397d799819a3948a986ed6808b2508c55"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:01:20 crc kubenswrapper[5033]: I0226 16:01:20.190208 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://1db84da97ee76a847298329e56a91cd397d799819a3948a986ed6808b2508c55" gracePeriod=600 Feb 26 16:01:21 crc kubenswrapper[5033]: I0226 16:01:21.070795 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 16:01:21 crc kubenswrapper[5033]: I0226 16:01:21.200113 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" 
containerID="1db84da97ee76a847298329e56a91cd397d799819a3948a986ed6808b2508c55" exitCode=0 Feb 26 16:01:21 crc kubenswrapper[5033]: I0226 16:01:21.200153 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"1db84da97ee76a847298329e56a91cd397d799819a3948a986ed6808b2508c55"} Feb 26 16:01:21 crc kubenswrapper[5033]: I0226 16:01:21.200183 5033 scope.go:117] "RemoveContainer" containerID="0364f13db4a9c5dfb04881c280d818ef2c52b6cfbe5f9db1cd51cf7111f504ed" Feb 26 16:01:21 crc kubenswrapper[5033]: E0226 16:01:21.619856 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 16:01:21 crc kubenswrapper[5033]: E0226 16:01:21.620400 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kkkh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-zt7cr_openstack(184ba7b9-46c1-44fc-8eef-adaa1990e143): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:01:21 crc kubenswrapper[5033]: E0226 16:01:21.621645 5033 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr" podUID="184ba7b9-46c1-44fc-8eef-adaa1990e143" Feb 26 16:01:21 crc kubenswrapper[5033]: W0226 16:01:21.637213 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde38b113_f8c9_4c5d_913e_f0efc7a7b5ef.slice/crio-9613ae0291561f00f942b9c24946e7666ec483d52649f7ff145858771804f4bd WatchSource:0}: Error finding container 9613ae0291561f00f942b9c24946e7666ec483d52649f7ff145858771804f4bd: Status 404 returned error can't find the container with id 9613ae0291561f00f942b9c24946e7666ec483d52649f7ff145858771804f4bd Feb 26 16:01:21 crc kubenswrapper[5033]: E0226 16:01:21.673053 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 16:01:21 crc kubenswrapper[5033]: E0226 16:01:21.673404 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xcr5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-sm5sw_openstack(a882a181-d43e-414f-9dc8-b0229af9b0ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:01:21 crc kubenswrapper[5033]: E0226 16:01:21.674836 5033 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw" podUID="a882a181-d43e-414f-9dc8-b0229af9b0ff" Feb 26 16:01:21 crc kubenswrapper[5033]: E0226 16:01:21.688313 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 16:01:21 crc kubenswrapper[5033]: E0226 16:01:21.688531 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lrrl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-4mlnn_openstack(f46ea79d-19f0-4713-98ab-0e4376a9406d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:01:21 crc kubenswrapper[5033]: E0226 16:01:21.690437 5033 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn" podUID="f46ea79d-19f0-4713-98ab-0e4376a9406d" Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.121137 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.131841 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fxrwp"] Feb 26 16:01:22 crc kubenswrapper[5033]: E0226 16:01:22.143491 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 16:01:22 crc kubenswrapper[5033]: E0226 16:01:22.143664 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5ktfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-v5fwv_openstack(860a1b87-8820-43cb-8972-028fbd029e15): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:01:22 crc kubenswrapper[5033]: E0226 16:01:22.145053 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv" podUID="860a1b87-8820-43cb-8972-028fbd029e15" Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.209491 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"7ce282c1f9d55fd8756822a733df8c8b2d5c1a2cdad655e8d68dfed7a9540071"} Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.218128 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec","Type":"ContainerStarted","Data":"1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e"} Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.219596 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef","Type":"ContainerStarted","Data":"9613ae0291561f00f942b9c24946e7666ec483d52649f7ff145858771804f4bd"} Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.219629 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-84k4b"] Feb 26 16:01:22 crc kubenswrapper[5033]: E0226 16:01:22.221003 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr" podUID="184ba7b9-46c1-44fc-8eef-adaa1990e143" Feb 26 16:01:22 crc kubenswrapper[5033]: E0226 16:01:22.221065 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn" podUID="f46ea79d-19f0-4713-98ab-0e4376a9406d" Feb 26 16:01:22 
crc kubenswrapper[5033]: W0226 16:01:22.574160 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44171111_32bf_415d_97c9_f966209cd0a0.slice/crio-325feb8e037e2632b533481e82a50ef6be3a245a2ff6fe4331faac72b59aded5 WatchSource:0}: Error finding container 325feb8e037e2632b533481e82a50ef6be3a245a2ff6fe4331faac72b59aded5: Status 404 returned error can't find the container with id 325feb8e037e2632b533481e82a50ef6be3a245a2ff6fe4331faac72b59aded5 Feb 26 16:01:22 crc kubenswrapper[5033]: W0226 16:01:22.774383 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d8f31df_b55c_44c6_88e4_3b0d0fb6c1e6.slice/crio-53a98cf35457b1fd63fada97e8eb965cfa36a0560679a213466b605b6ade4f74 WatchSource:0}: Error finding container 53a98cf35457b1fd63fada97e8eb965cfa36a0560679a213466b605b6ade4f74: Status 404 returned error can't find the container with id 53a98cf35457b1fd63fada97e8eb965cfa36a0560679a213466b605b6ade4f74 Feb 26 16:01:22 crc kubenswrapper[5033]: W0226 16:01:22.775380 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28081635_21dd_4dbf_ae6a_354bbad0dbdd.slice/crio-1f2c0cb3d9d45570f2440a5830cd21d3c1256ee1dd97d33d2ffba258534c32c5 WatchSource:0}: Error finding container 1f2c0cb3d9d45570f2440a5830cd21d3c1256ee1dd97d33d2ffba258534c32c5: Status 404 returned error can't find the container with id 1f2c0cb3d9d45570f2440a5830cd21d3c1256ee1dd97d33d2ffba258534c32c5 Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.920516 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw" Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.932517 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv" Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.968544 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-dns-svc\") pod \"a882a181-d43e-414f-9dc8-b0229af9b0ff\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.968600 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ktfs\" (UniqueName: \"kubernetes.io/projected/860a1b87-8820-43cb-8972-028fbd029e15-kube-api-access-5ktfs\") pod \"860a1b87-8820-43cb-8972-028fbd029e15\" (UID: \"860a1b87-8820-43cb-8972-028fbd029e15\") " Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.968618 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/860a1b87-8820-43cb-8972-028fbd029e15-config\") pod \"860a1b87-8820-43cb-8972-028fbd029e15\" (UID: \"860a1b87-8820-43cb-8972-028fbd029e15\") " Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.968708 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-config\") pod \"a882a181-d43e-414f-9dc8-b0229af9b0ff\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.968732 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xcr5\" (UniqueName: \"kubernetes.io/projected/a882a181-d43e-414f-9dc8-b0229af9b0ff-kube-api-access-7xcr5\") pod \"a882a181-d43e-414f-9dc8-b0229af9b0ff\" (UID: \"a882a181-d43e-414f-9dc8-b0229af9b0ff\") " Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.969035 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a882a181-d43e-414f-9dc8-b0229af9b0ff" (UID: "a882a181-d43e-414f-9dc8-b0229af9b0ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.969352 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860a1b87-8820-43cb-8972-028fbd029e15-config" (OuterVolumeSpecName: "config") pod "860a1b87-8820-43cb-8972-028fbd029e15" (UID: "860a1b87-8820-43cb-8972-028fbd029e15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.969729 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-config" (OuterVolumeSpecName: "config") pod "a882a181-d43e-414f-9dc8-b0229af9b0ff" (UID: "a882a181-d43e-414f-9dc8-b0229af9b0ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.973636 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860a1b87-8820-43cb-8972-028fbd029e15-kube-api-access-5ktfs" (OuterVolumeSpecName: "kube-api-access-5ktfs") pod "860a1b87-8820-43cb-8972-028fbd029e15" (UID: "860a1b87-8820-43cb-8972-028fbd029e15"). InnerVolumeSpecName "kube-api-access-5ktfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:01:22 crc kubenswrapper[5033]: I0226 16:01:22.973765 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a882a181-d43e-414f-9dc8-b0229af9b0ff-kube-api-access-7xcr5" (OuterVolumeSpecName: "kube-api-access-7xcr5") pod "a882a181-d43e-414f-9dc8-b0229af9b0ff" (UID: "a882a181-d43e-414f-9dc8-b0229af9b0ff"). InnerVolumeSpecName "kube-api-access-7xcr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.070332 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.070683 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ktfs\" (UniqueName: \"kubernetes.io/projected/860a1b87-8820-43cb-8972-028fbd029e15-kube-api-access-5ktfs\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.070696 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/860a1b87-8820-43cb-8972-028fbd029e15-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.070710 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a882a181-d43e-414f-9dc8-b0229af9b0ff-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.070722 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xcr5\" (UniqueName: \"kubernetes.io/projected/a882a181-d43e-414f-9dc8-b0229af9b0ff-kube-api-access-7xcr5\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.232312 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a1a35ad9-a937-4d01-961b-a3488303610e","Type":"ContainerStarted","Data":"a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb"} Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.236454 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"94562cbd-1b98-4b30-9a97-f26fb103bb72","Type":"ContainerStarted","Data":"24ed5542d86de71c32fadfea0c9e8d2f9fc6c503f22d6a3bd14b9feec584230c"} 
Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.237966 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fxrwp" event={"ID":"44171111-32bf-415d-97c9-f966209cd0a0","Type":"ContainerStarted","Data":"325feb8e037e2632b533481e82a50ef6be3a245a2ff6fe4331faac72b59aded5"} Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.239104 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw" event={"ID":"a882a181-d43e-414f-9dc8-b0229af9b0ff","Type":"ContainerDied","Data":"7128d4d75ffa57800d260261547d44775da908ec12fe46a4bbf3210f156a2c70"} Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.239198 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sm5sw" Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.248313 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"28081635-21dd-4dbf-ae6a-354bbad0dbdd","Type":"ContainerStarted","Data":"1f2c0cb3d9d45570f2440a5830cd21d3c1256ee1dd97d33d2ffba258534c32c5"} Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.259049 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv" event={"ID":"860a1b87-8820-43cb-8972-028fbd029e15","Type":"ContainerDied","Data":"c476281579f3a35f558d8f9db0178b4b32cd7dc0729800a3753ff6cde55b1fc9"} Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.259137 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v5fwv" Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.262844 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-84k4b" event={"ID":"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6","Type":"ContainerStarted","Data":"53a98cf35457b1fd63fada97e8eb965cfa36a0560679a213466b605b6ade4f74"} Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.330594 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sm5sw"] Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.333452 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sm5sw"] Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.370048 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5fwv"] Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.378604 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5fwv"] Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.639567 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860a1b87-8820-43cb-8972-028fbd029e15" path="/var/lib/kubelet/pods/860a1b87-8820-43cb-8972-028fbd029e15/volumes" Feb 26 16:01:23 crc kubenswrapper[5033]: I0226 16:01:23.639957 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a882a181-d43e-414f-9dc8-b0229af9b0ff" path="/var/lib/kubelet/pods/a882a181-d43e-414f-9dc8-b0229af9b0ff/volumes" Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.304521 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"443e562a-aefe-46f3-a86f-59205f5849c7","Type":"ContainerStarted","Data":"247b57a885cc380f0649077b96f76c4e9d410ef7835d763d6902fafb762cc064"} Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.308117 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerID="f55ea55b5a30bde5eb0724a1694acfd92897c14f2d93ceaa5d07c8635c8cc6ea" exitCode=0 Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.308173 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-84k4b" event={"ID":"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6","Type":"ContainerDied","Data":"f55ea55b5a30bde5eb0724a1694acfd92897c14f2d93ceaa5d07c8635c8cc6ea"} Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.309695 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef","Type":"ContainerStarted","Data":"6a0a05a28e34cceca86396fc370a3c3c2d613869b745809e63d4581ad8187860"} Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.311567 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56172023-8cfb-4589-809f-d83c900f90b0","Type":"ContainerStarted","Data":"46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4"} Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.311724 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.313589 5033 generic.go:334] "Generic (PLEG): container finished" podID="a1a35ad9-a937-4d01-961b-a3488303610e" containerID="a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb" exitCode=0 Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.313615 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a1a35ad9-a937-4d01-961b-a3488303610e","Type":"ContainerDied","Data":"a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb"} Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.315249 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fxrwp" 
event={"ID":"44171111-32bf-415d-97c9-f966209cd0a0","Type":"ContainerStarted","Data":"89deba044f841c417bb427e36c48c0c1ad7827ea068666457c484ac571cf5b5c"} Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.315859 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-fxrwp" Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.317438 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"28081635-21dd-4dbf-ae6a-354bbad0dbdd","Type":"ContainerStarted","Data":"014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323"} Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.324985 5033 generic.go:334] "Generic (PLEG): container finished" podID="6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" containerID="1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e" exitCode=0 Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.325104 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec","Type":"ContainerDied","Data":"1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e"} Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.363315 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.750945933 podStartE2EDuration="28.363298136s" podCreationTimestamp="2026-02-26 16:01:00 +0000 UTC" firstStartedPulling="2026-02-26 16:01:09.63415151 +0000 UTC m=+1196.846750180" lastFinishedPulling="2026-02-26 16:01:27.246503733 +0000 UTC m=+1214.459102383" observedRunningTime="2026-02-26 16:01:28.362873815 +0000 UTC m=+1215.575472485" watchObservedRunningTime="2026-02-26 16:01:28.363298136 +0000 UTC m=+1215.575896786" Feb 26 16:01:28 crc kubenswrapper[5033]: I0226 16:01:28.427501 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fxrwp" 
podStartSLOduration=19.728610095 podStartE2EDuration="24.427485924s" podCreationTimestamp="2026-02-26 16:01:04 +0000 UTC" firstStartedPulling="2026-02-26 16:01:22.578108013 +0000 UTC m=+1209.790706663" lastFinishedPulling="2026-02-26 16:01:27.276983842 +0000 UTC m=+1214.489582492" observedRunningTime="2026-02-26 16:01:28.418942832 +0000 UTC m=+1215.631541482" watchObservedRunningTime="2026-02-26 16:01:28.427485924 +0000 UTC m=+1215.640084574" Feb 26 16:01:29 crc kubenswrapper[5033]: I0226 16:01:29.337148 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec","Type":"ContainerStarted","Data":"5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396"} Feb 26 16:01:29 crc kubenswrapper[5033]: I0226 16:01:29.340418 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-84k4b" event={"ID":"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6","Type":"ContainerStarted","Data":"c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572"} Feb 26 16:01:29 crc kubenswrapper[5033]: I0226 16:01:29.343569 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a1a35ad9-a937-4d01-961b-a3488303610e","Type":"ContainerStarted","Data":"364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9"} Feb 26 16:01:29 crc kubenswrapper[5033]: I0226 16:01:29.363478 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=11.454152487 podStartE2EDuration="34.363460663s" podCreationTimestamp="2026-02-26 16:00:55 +0000 UTC" firstStartedPulling="2026-02-26 16:00:57.741230543 +0000 UTC m=+1184.953829203" lastFinishedPulling="2026-02-26 16:01:20.650538719 +0000 UTC m=+1207.863137379" observedRunningTime="2026-02-26 16:01:29.361015613 +0000 UTC m=+1216.573614263" watchObservedRunningTime="2026-02-26 16:01:29.363460663 +0000 UTC m=+1216.576059313" Feb 26 
16:01:29 crc kubenswrapper[5033]: I0226 16:01:29.389767 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.847626017 podStartE2EDuration="32.389748248s" podCreationTimestamp="2026-02-26 16:00:57 +0000 UTC" firstStartedPulling="2026-02-26 16:00:59.114204296 +0000 UTC m=+1186.326802946" lastFinishedPulling="2026-02-26 16:01:21.656326527 +0000 UTC m=+1208.868925177" observedRunningTime="2026-02-26 16:01:29.385460721 +0000 UTC m=+1216.598059381" watchObservedRunningTime="2026-02-26 16:01:29.389748248 +0000 UTC m=+1216.602346898" Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.354037 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"28081635-21dd-4dbf-ae6a-354bbad0dbdd","Type":"ContainerStarted","Data":"806513972f7ac7f35e48dccd32d3d8681e9469c01f754dd9f30afc6dcbc44290"} Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.357044 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-84k4b" event={"ID":"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6","Type":"ContainerStarted","Data":"10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92"} Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.357511 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.357563 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.360872 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef","Type":"ContainerStarted","Data":"7d15896c86ef74e743d6d11de6e7eb42b79dd557583bbade3356774639583f07"} Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.362993 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/memcached-0" event={"ID":"be41eafa-8738-4d4a-9f9e-592c105bc766","Type":"ContainerStarted","Data":"067a1b260955a76eecc32e5016cd673aa068e9dd6e33251557d30945591fa778"} Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.363495 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.387962 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.559091022 podStartE2EDuration="24.387943507s" podCreationTimestamp="2026-02-26 16:01:06 +0000 UTC" firstStartedPulling="2026-02-26 16:01:22.782879222 +0000 UTC m=+1209.995477872" lastFinishedPulling="2026-02-26 16:01:29.611731707 +0000 UTC m=+1216.824330357" observedRunningTime="2026-02-26 16:01:30.379884466 +0000 UTC m=+1217.592483126" watchObservedRunningTime="2026-02-26 16:01:30.387943507 +0000 UTC m=+1217.600542157" Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.406178 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.371977359 podStartE2EDuration="32.406163341s" podCreationTimestamp="2026-02-26 16:00:58 +0000 UTC" firstStartedPulling="2026-02-26 16:00:59.577571255 +0000 UTC m=+1186.790169905" lastFinishedPulling="2026-02-26 16:01:29.611757217 +0000 UTC m=+1216.824355887" observedRunningTime="2026-02-26 16:01:30.402330755 +0000 UTC m=+1217.614929405" watchObservedRunningTime="2026-02-26 16:01:30.406163341 +0000 UTC m=+1217.618761991" Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.439172 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=17.5129326 podStartE2EDuration="25.439146142s" podCreationTimestamp="2026-02-26 16:01:05 +0000 UTC" firstStartedPulling="2026-02-26 16:01:21.656058011 +0000 UTC m=+1208.868656661" lastFinishedPulling="2026-02-26 16:01:29.582271553 +0000 UTC 
m=+1216.794870203" observedRunningTime="2026-02-26 16:01:30.431730877 +0000 UTC m=+1217.644329527" watchObservedRunningTime="2026-02-26 16:01:30.439146142 +0000 UTC m=+1217.651744802" Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.454815 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-84k4b" podStartSLOduration=21.98593384 podStartE2EDuration="26.454798572s" podCreationTimestamp="2026-02-26 16:01:04 +0000 UTC" firstStartedPulling="2026-02-26 16:01:22.776894973 +0000 UTC m=+1209.989493623" lastFinishedPulling="2026-02-26 16:01:27.245759715 +0000 UTC m=+1214.458358355" observedRunningTime="2026-02-26 16:01:30.452547606 +0000 UTC m=+1217.665146276" watchObservedRunningTime="2026-02-26 16:01:30.454798572 +0000 UTC m=+1217.667397222" Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.918063 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:30 crc kubenswrapper[5033]: I0226 16:01:30.962126 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:31 crc kubenswrapper[5033]: I0226 16:01:31.374982 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.059350 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.117992 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.383464 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.426629 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 26 
16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.446059 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.613622 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4mlnn"] Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.666445 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-slznn"] Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.670142 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.677558 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.682877 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-slznn"] Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.704342 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-9ts2z"] Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.708863 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.714794 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.735927 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9ts2z"] Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.763461 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssv4n\" (UniqueName: \"kubernetes.io/projected/45a330af-c614-4321-8534-060b7e5fb93f-kube-api-access-ssv4n\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.763499 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a330af-c614-4321-8534-060b7e5fb93f-config\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.763533 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovs-rundir\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.763617 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovn-rundir\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " 
pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.763646 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-config\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.780386 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-combined-ca-bundle\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.780487 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.780523 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.780548 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: 
\"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.780619 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g692s\" (UniqueName: \"kubernetes.io/projected/c7337880-9125-485d-9767-86a69b5f1391-kube-api-access-g692s\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.882457 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovn-rundir\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.882506 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-config\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.882534 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-combined-ca-bundle\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.882558 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: 
\"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.882581 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.882599 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.882620 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g692s\" (UniqueName: \"kubernetes.io/projected/c7337880-9125-485d-9767-86a69b5f1391-kube-api-access-g692s\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.882638 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssv4n\" (UniqueName: \"kubernetes.io/projected/45a330af-c614-4321-8534-060b7e5fb93f-kube-api-access-ssv4n\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.882656 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a330af-c614-4321-8534-060b7e5fb93f-config\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") 
" pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.882680 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovs-rundir\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.883199 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovs-rundir\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.883274 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovn-rundir\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.884169 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-config\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.885160 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.885271 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.889920 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a330af-c614-4321-8534-060b7e5fb93f-config\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.943512 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.957112 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-combined-ca-bundle\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.959053 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssv4n\" (UniqueName: \"kubernetes.io/projected/45a330af-c614-4321-8534-060b7e5fb93f-kube-api-access-ssv4n\") pod \"ovn-controller-metrics-9ts2z\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.968213 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5ccc8479f9-zt7cr"] Feb 26 16:01:32 crc kubenswrapper[5033]: I0226 16:01:32.974047 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g692s\" (UniqueName: \"kubernetes.io/projected/c7337880-9125-485d-9767-86a69b5f1391-kube-api-access-g692s\") pod \"dnsmasq-dns-5bf47b49b7-slznn\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.001671 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.035151 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.050393 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.051786 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.075049 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.075111 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.075438 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-2jmxd" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.075563 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.200789 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.201338 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-scripts\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.201375 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.201461 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " 
pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.201582 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.201612 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.203308 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqdsw\" (UniqueName: \"kubernetes.io/projected/00a5a420-8682-4780-8349-3da44c8d7f1a-kube-api-access-pqdsw\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.203348 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-config\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.228765 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-wvhbl"] Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.233189 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.237508 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.259501 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-wvhbl"] Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.306779 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-config\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.310441 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqdsw\" (UniqueName: \"kubernetes.io/projected/00a5a420-8682-4780-8349-3da44c8d7f1a-kube-api-access-pqdsw\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.310496 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-config\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.310546 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.310597 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-scripts\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.310614 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.310714 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.310773 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.310874 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.310891 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-dns-svc\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.310924 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b75cs\" (UniqueName: \"kubernetes.io/projected/281e04fa-65a9-4798-a371-3a644e34ec2d-kube-api-access-b75cs\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.310944 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.311521 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.324793 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-config\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.329844 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " 
pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.330066 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.332164 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqdsw\" (UniqueName: \"kubernetes.io/projected/00a5a420-8682-4780-8349-3da44c8d7f1a-kube-api-access-pqdsw\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.332316 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-scripts\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.336289 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.339612 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.390251 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn" event={"ID":"f46ea79d-19f0-4713-98ab-0e4376a9406d","Type":"ContainerDied","Data":"7e9d2fcd5f72c583ff6d8c6ef374479ced7956a7cb89490c3ad442cdc95bba3f"} Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.390283 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4mlnn" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.414028 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrrl2\" (UniqueName: \"kubernetes.io/projected/f46ea79d-19f0-4713-98ab-0e4376a9406d-kube-api-access-lrrl2\") pod \"f46ea79d-19f0-4713-98ab-0e4376a9406d\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.414409 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-dns-svc\") pod \"f46ea79d-19f0-4713-98ab-0e4376a9406d\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.414534 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-config\") pod \"f46ea79d-19f0-4713-98ab-0e4376a9406d\" (UID: \"f46ea79d-19f0-4713-98ab-0e4376a9406d\") " Feb 26 16:01:33 crc 
kubenswrapper[5033]: I0226 16:01:33.414839 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-config\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.414927 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.415036 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.415136 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-dns-svc\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.415211 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b75cs\" (UniqueName: \"kubernetes.io/projected/281e04fa-65a9-4798-a371-3a644e34ec2d-kube-api-access-b75cs\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.416281 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f46ea79d-19f0-4713-98ab-0e4376a9406d" (UID: "f46ea79d-19f0-4713-98ab-0e4376a9406d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.416736 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-config\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.416754 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.417132 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-config" (OuterVolumeSpecName: "config") pod "f46ea79d-19f0-4713-98ab-0e4376a9406d" (UID: "f46ea79d-19f0-4713-98ab-0e4376a9406d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.417455 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-dns-svc\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.419107 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46ea79d-19f0-4713-98ab-0e4376a9406d-kube-api-access-lrrl2" (OuterVolumeSpecName: "kube-api-access-lrrl2") pod "f46ea79d-19f0-4713-98ab-0e4376a9406d" (UID: "f46ea79d-19f0-4713-98ab-0e4376a9406d"). InnerVolumeSpecName "kube-api-access-lrrl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.420217 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.434923 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b75cs\" (UniqueName: \"kubernetes.io/projected/281e04fa-65a9-4798-a371-3a644e34ec2d-kube-api-access-b75cs\") pod \"dnsmasq-dns-8554648995-wvhbl\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.501830 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.520816 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrrl2\" (UniqueName: \"kubernetes.io/projected/f46ea79d-19f0-4713-98ab-0e4376a9406d-kube-api-access-lrrl2\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.520845 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.520865 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f46ea79d-19f0-4713-98ab-0e4376a9406d-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.576196 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.589005 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.621447 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-dns-svc\") pod \"184ba7b9-46c1-44fc-8eef-adaa1990e143\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.621549 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-config\") pod \"184ba7b9-46c1-44fc-8eef-adaa1990e143\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.621591 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kkkh\" (UniqueName: \"kubernetes.io/projected/184ba7b9-46c1-44fc-8eef-adaa1990e143-kube-api-access-8kkkh\") pod \"184ba7b9-46c1-44fc-8eef-adaa1990e143\" (UID: \"184ba7b9-46c1-44fc-8eef-adaa1990e143\") " Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.622063 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-config" (OuterVolumeSpecName: "config") pod "184ba7b9-46c1-44fc-8eef-adaa1990e143" (UID: "184ba7b9-46c1-44fc-8eef-adaa1990e143"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.622558 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "184ba7b9-46c1-44fc-8eef-adaa1990e143" (UID: "184ba7b9-46c1-44fc-8eef-adaa1990e143"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.625727 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/184ba7b9-46c1-44fc-8eef-adaa1990e143-kube-api-access-8kkkh" (OuterVolumeSpecName: "kube-api-access-8kkkh") pod "184ba7b9-46c1-44fc-8eef-adaa1990e143" (UID: "184ba7b9-46c1-44fc-8eef-adaa1990e143"). InnerVolumeSpecName "kube-api-access-8kkkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.723999 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.724032 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184ba7b9-46c1-44fc-8eef-adaa1990e143-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.724052 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kkkh\" (UniqueName: \"kubernetes.io/projected/184ba7b9-46c1-44fc-8eef-adaa1990e143-kube-api-access-8kkkh\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.755835 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4mlnn"] Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.762014 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9ts2z"] Feb 26 16:01:33 crc kubenswrapper[5033]: W0226 16:01:33.764829 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7337880_9125_485d_9767_86a69b5f1391.slice/crio-3eda50e1f509310bd8867831d62da9937092a6d286964f13380df5bdbae81051 WatchSource:0}: Error finding container 
3eda50e1f509310bd8867831d62da9937092a6d286964f13380df5bdbae81051: Status 404 returned error can't find the container with id 3eda50e1f509310bd8867831d62da9937092a6d286964f13380df5bdbae81051 Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.770150 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4mlnn"] Feb 26 16:01:33 crc kubenswrapper[5033]: I0226 16:01:33.779976 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-slznn"] Feb 26 16:01:34 crc kubenswrapper[5033]: I0226 16:01:34.002912 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 16:01:34 crc kubenswrapper[5033]: I0226 16:01:34.107513 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-wvhbl"] Feb 26 16:01:34 crc kubenswrapper[5033]: W0226 16:01:34.108225 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod281e04fa_65a9_4798_a371_3a644e34ec2d.slice/crio-a64eaca8fd089337ae86d5bc42d512343d53f540b76e55b955dffd2e994f0736 WatchSource:0}: Error finding container a64eaca8fd089337ae86d5bc42d512343d53f540b76e55b955dffd2e994f0736: Status 404 returned error can't find the container with id a64eaca8fd089337ae86d5bc42d512343d53f540b76e55b955dffd2e994f0736 Feb 26 16:01:34 crc kubenswrapper[5033]: I0226 16:01:34.405107 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00a5a420-8682-4780-8349-3da44c8d7f1a","Type":"ContainerStarted","Data":"c376bea5f8d4b23397e8136ee9deab9dfa20345d0b61b68727a170705832a4f9"} Feb 26 16:01:34 crc kubenswrapper[5033]: I0226 16:01:34.406931 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" event={"ID":"c7337880-9125-485d-9767-86a69b5f1391","Type":"ContainerStarted","Data":"3eda50e1f509310bd8867831d62da9937092a6d286964f13380df5bdbae81051"} Feb 26 
16:01:34 crc kubenswrapper[5033]: I0226 16:01:34.409211 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr" Feb 26 16:01:34 crc kubenswrapper[5033]: I0226 16:01:34.409207 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-zt7cr" event={"ID":"184ba7b9-46c1-44fc-8eef-adaa1990e143","Type":"ContainerDied","Data":"63a6b1a878bb68414f8ce1c8b18a278a73f21b59a3e1198545a31740bdee113c"} Feb 26 16:01:34 crc kubenswrapper[5033]: I0226 16:01:34.411091 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9ts2z" event={"ID":"45a330af-c614-4321-8534-060b7e5fb93f","Type":"ContainerStarted","Data":"573e660c9a39f1b32a6850206b5db2c131d9a2979b4c49f52b53d3cab6204546"} Feb 26 16:01:34 crc kubenswrapper[5033]: I0226 16:01:34.412135 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-wvhbl" event={"ID":"281e04fa-65a9-4798-a371-3a644e34ec2d","Type":"ContainerStarted","Data":"a64eaca8fd089337ae86d5bc42d512343d53f540b76e55b955dffd2e994f0736"} Feb 26 16:01:34 crc kubenswrapper[5033]: I0226 16:01:34.454547 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-zt7cr"] Feb 26 16:01:34 crc kubenswrapper[5033]: I0226 16:01:34.463146 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-zt7cr"] Feb 26 16:01:35 crc kubenswrapper[5033]: I0226 16:01:35.640035 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="184ba7b9-46c1-44fc-8eef-adaa1990e143" path="/var/lib/kubelet/pods/184ba7b9-46c1-44fc-8eef-adaa1990e143/volumes" Feb 26 16:01:35 crc kubenswrapper[5033]: I0226 16:01:35.641539 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46ea79d-19f0-4713-98ab-0e4376a9406d" path="/var/lib/kubelet/pods/f46ea79d-19f0-4713-98ab-0e4376a9406d/volumes" Feb 26 16:01:37 crc kubenswrapper[5033]: I0226 
16:01:37.018560 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 26 16:01:37 crc kubenswrapper[5033]: I0226 16:01:37.019565 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 26 16:01:38 crc kubenswrapper[5033]: I0226 16:01:38.404479 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 26 16:01:38 crc kubenswrapper[5033]: I0226 16:01:38.404792 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 26 16:01:38 crc kubenswrapper[5033]: I0226 16:01:38.726500 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 26 16:01:40 crc kubenswrapper[5033]: I0226 16:01:40.495392 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 26 16:01:40 crc kubenswrapper[5033]: I0226 16:01:40.602847 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 26 16:01:40 crc kubenswrapper[5033]: I0226 16:01:40.775411 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9ts2z" event={"ID":"45a330af-c614-4321-8534-060b7e5fb93f","Type":"ContainerStarted","Data":"f0481a06ae138ce8c43ec40c4c5ac9624c819724a7da33627fecd75b0cbe6925"} Feb 26 16:01:40 crc kubenswrapper[5033]: I0226 16:01:40.793524 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-9ts2z" podStartSLOduration=8.793473473 podStartE2EDuration="8.793473473s" podCreationTimestamp="2026-02-26 16:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:01:40.78892001 +0000 UTC m=+1228.001518670" 
watchObservedRunningTime="2026-02-26 16:01:40.793473473 +0000 UTC m=+1228.006072123" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.035133 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.111969 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-slznn"] Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.132466 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9hpw2"] Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.133755 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.173328 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9hpw2"] Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.236115 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.236464 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzdx7\" (UniqueName: \"kubernetes.io/projected/04084062-c67c-4433-bc9c-76203d0a326e-kube-api-access-lzdx7\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.236986 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.237038 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.237071 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-config\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.338067 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.338138 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzdx7\" (UniqueName: \"kubernetes.io/projected/04084062-c67c-4433-bc9c-76203d0a326e-kube-api-access-lzdx7\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.338207 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.338245 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.338292 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-config\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.339136 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-config\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.339136 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.339369 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: 
\"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.357765 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.362826 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzdx7\" (UniqueName: \"kubernetes.io/projected/04084062-c67c-4433-bc9c-76203d0a326e-kube-api-access-lzdx7\") pod \"dnsmasq-dns-b8fbc5445-9hpw2\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.459050 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.656760 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.761539 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.786490 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00a5a420-8682-4780-8349-3da44c8d7f1a","Type":"ContainerStarted","Data":"0781d5f97d7bdfedb1b4ca988b8219b988b2353f19930fed09bd026d7a0f175c"} Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.786548 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00a5a420-8682-4780-8349-3da44c8d7f1a","Type":"ContainerStarted","Data":"65abe18a8fd80fc9112bf0561b66f917755c231568bc655d18e082d960532c5a"} Feb 26 
16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.786599 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.788680 5033 generic.go:334] "Generic (PLEG): container finished" podID="c7337880-9125-485d-9767-86a69b5f1391" containerID="e7989b4051c29bf2783d59cb4c2fc4258842cb8b924a5db61ef768771152a15d" exitCode=0 Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.788737 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" event={"ID":"c7337880-9125-485d-9767-86a69b5f1391","Type":"ContainerDied","Data":"e7989b4051c29bf2783d59cb4c2fc4258842cb8b924a5db61ef768771152a15d"} Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.794003 5033 generic.go:334] "Generic (PLEG): container finished" podID="281e04fa-65a9-4798-a371-3a644e34ec2d" containerID="4215c2df6c92481f13cb6b9eccd314e68bbcf30cd08a0a26f66574508f7185fc" exitCode=0 Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.795300 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-wvhbl" event={"ID":"281e04fa-65a9-4798-a371-3a644e34ec2d","Type":"ContainerDied","Data":"4215c2df6c92481f13cb6b9eccd314e68bbcf30cd08a0a26f66574508f7185fc"} Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.819577 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.297871793 podStartE2EDuration="9.819558537s" podCreationTimestamp="2026-02-26 16:01:32 +0000 UTC" firstStartedPulling="2026-02-26 16:01:34.005887537 +0000 UTC m=+1221.218486187" lastFinishedPulling="2026-02-26 16:01:40.527574281 +0000 UTC m=+1227.740172931" observedRunningTime="2026-02-26 16:01:41.817171397 +0000 UTC m=+1229.029770047" watchObservedRunningTime="2026-02-26 16:01:41.819558537 +0000 UTC m=+1229.032157197" Feb 26 16:01:41 crc kubenswrapper[5033]: I0226 16:01:41.930842 5033 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9hpw2"] Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.039992 5033 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 26 16:01:42 crc kubenswrapper[5033]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/c7337880-9125-485d-9767-86a69b5f1391/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 26 16:01:42 crc kubenswrapper[5033]: > podSandboxID="3eda50e1f509310bd8867831d62da9937092a6d286964f13380df5bdbae81051" Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.040413 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 16:01:42 crc kubenswrapper[5033]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58bh65dh95hf6h595hf6hf5h59dh6h57dh558h55ch5dbh5f5h565h5f7h9fh76h58ch54dh84h59bh7fh6bh5b9h59h67fh566h56h5f4h554h58fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g692s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5bf47b49b7-slznn_openstack(c7337880-9125-485d-9767-86a69b5f1391): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/c7337880-9125-485d-9767-86a69b5f1391/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 26 16:01:42 crc kubenswrapper[5033]: > logger="UnhandledError" Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.041722 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/c7337880-9125-485d-9767-86a69b5f1391/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" podUID="c7337880-9125-485d-9767-86a69b5f1391" Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.089455 5033 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 26 16:01:42 crc kubenswrapper[5033]: rpc error: code = Unknown desc = container create failed: mount 
`/var/lib/kubelet/pods/281e04fa-65a9-4798-a371-3a644e34ec2d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 26 16:01:42 crc kubenswrapper[5033]: > podSandboxID="a64eaca8fd089337ae86d5bc42d512343d53f540b76e55b955dffd2e994f0736" Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.089620 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 16:01:42 crc kubenswrapper[5033]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,Rec
ursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b75cs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8554648995-wvhbl_openstack(281e04fa-65a9-4798-a371-3a644e34ec2d): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/281e04fa-65a9-4798-a371-3a644e34ec2d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 26 16:01:42 crc kubenswrapper[5033]: > logger="UnhandledError" Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.090929 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount 
`/var/lib/kubelet/pods/281e04fa-65a9-4798-a371-3a644e34ec2d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8554648995-wvhbl" podUID="281e04fa-65a9-4798-a371-3a644e34ec2d" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.273569 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.279720 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.282356 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.282488 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.282508 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.283420 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-h546v" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.297999 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.367627 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.367685 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qh4j\" (UniqueName: 
\"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-kube-api-access-2qh4j\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.367713 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-cache\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.367766 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-lock\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.367806 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.367828 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.469240 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qh4j\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-kube-api-access-2qh4j\") pod \"swift-storage-0\" (UID: 
\"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.469651 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-cache\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.469794 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-lock\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.469934 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.470044 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.470205 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.470293 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-lock\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.470460 5033 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.470486 5033 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.470513 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-cache\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.470561 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift podName:8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9 nodeName:}" failed. No retries permitted until 2026-02-26 16:01:42.970524509 +0000 UTC m=+1230.183123159 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift") pod "swift-storage-0" (UID: "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9") : configmap "swift-ring-files" not found Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.470695 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.477753 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.486425 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qh4j\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-kube-api-access-2qh4j\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.498004 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.775624 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-84cln"] Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.776819 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.780929 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.780940 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.784556 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.803818 5033 generic.go:334] "Generic (PLEG): container finished" podID="04084062-c67c-4433-bc9c-76203d0a326e" containerID="b984b0734b8df81526f145dcf503a31e10508a24ee58e3c5357cfa3ee5c427e7" exitCode=0 Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.804146 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" event={"ID":"04084062-c67c-4433-bc9c-76203d0a326e","Type":"ContainerDied","Data":"b984b0734b8df81526f145dcf503a31e10508a24ee58e3c5357cfa3ee5c427e7"} Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.804198 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" event={"ID":"04084062-c67c-4433-bc9c-76203d0a326e","Type":"ContainerStarted","Data":"4428b605e4cfe2ed9a3d36e1d8ca1c01f3852b34aa96816796aeafe8965bf22d"} Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.812362 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-84cln"] Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.878087 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-scripts\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " 
pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.878159 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-dispersionconf\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.878365 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-combined-ca-bundle\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.878543 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-ring-data-devices\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.878687 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-swiftconf\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.878780 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/005e3882-a184-4d2d-9c00-a3de51d47b6e-etc-swift\") pod \"swift-ring-rebalance-84cln\" (UID: 
\"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.878828 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5k5l\" (UniqueName: \"kubernetes.io/projected/005e3882-a184-4d2d-9c00-a3de51d47b6e-kube-api-access-w5k5l\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.981597 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-ring-data-devices\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.981707 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-swiftconf\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.981772 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/005e3882-a184-4d2d-9c00-a3de51d47b6e-etc-swift\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.981802 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5k5l\" (UniqueName: \"kubernetes.io/projected/005e3882-a184-4d2d-9c00-a3de51d47b6e-kube-api-access-w5k5l\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " 
pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.981914 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-scripts\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.981963 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-dispersionconf\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.982029 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.982077 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-combined-ca-bundle\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.982768 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-scripts\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.983017 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/005e3882-a184-4d2d-9c00-a3de51d47b6e-etc-swift\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.983548 5033 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.983568 5033 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 16:01:42 crc kubenswrapper[5033]: E0226 16:01:42.983639 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift podName:8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9 nodeName:}" failed. No retries permitted until 2026-02-26 16:01:43.983626365 +0000 UTC m=+1231.196225015 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift") pod "swift-storage-0" (UID: "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9") : configmap "swift-ring-files" not found Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.984703 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-ring-data-devices\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.988440 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-swiftconf\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.988802 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-combined-ca-bundle\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:42 crc kubenswrapper[5033]: I0226 16:01:42.999550 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-dispersionconf\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.007930 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5k5l\" (UniqueName: 
\"kubernetes.io/projected/005e3882-a184-4d2d-9c00-a3de51d47b6e-kube-api-access-w5k5l\") pod \"swift-ring-rebalance-84cln\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") " pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.094872 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.128717 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.188024 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g692s\" (UniqueName: \"kubernetes.io/projected/c7337880-9125-485d-9767-86a69b5f1391-kube-api-access-g692s\") pod \"c7337880-9125-485d-9767-86a69b5f1391\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.188212 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-config\") pod \"c7337880-9125-485d-9767-86a69b5f1391\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.188239 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-ovsdbserver-nb\") pod \"c7337880-9125-485d-9767-86a69b5f1391\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.188274 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-dns-svc\") pod \"c7337880-9125-485d-9767-86a69b5f1391\" (UID: \"c7337880-9125-485d-9767-86a69b5f1391\") " Feb 26 
16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.196941 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7337880-9125-485d-9767-86a69b5f1391-kube-api-access-g692s" (OuterVolumeSpecName: "kube-api-access-g692s") pod "c7337880-9125-485d-9767-86a69b5f1391" (UID: "c7337880-9125-485d-9767-86a69b5f1391"). InnerVolumeSpecName "kube-api-access-g692s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.227613 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7337880-9125-485d-9767-86a69b5f1391" (UID: "c7337880-9125-485d-9767-86a69b5f1391"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.240578 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-config" (OuterVolumeSpecName: "config") pod "c7337880-9125-485d-9767-86a69b5f1391" (UID: "c7337880-9125-485d-9767-86a69b5f1391"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.241889 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c7337880-9125-485d-9767-86a69b5f1391" (UID: "c7337880-9125-485d-9767-86a69b5f1391"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.289917 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.290240 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.290266 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7337880-9125-485d-9767-86a69b5f1391-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.290278 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g692s\" (UniqueName: \"kubernetes.io/projected/c7337880-9125-485d-9767-86a69b5f1391-kube-api-access-g692s\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.586083 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-84cln"] Feb 26 16:01:43 crc kubenswrapper[5033]: W0226 16:01:43.592087 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod005e3882_a184_4d2d_9c00_a3de51d47b6e.slice/crio-e450f69761230896b9a6364295244ea578c7378fcb48f9969ee85627ca9d0f6d WatchSource:0}: Error finding container e450f69761230896b9a6364295244ea578c7378fcb48f9969ee85627ca9d0f6d: Status 404 returned error can't find the container with id e450f69761230896b9a6364295244ea578c7378fcb48f9969ee85627ca9d0f6d Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.815021 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" 
event={"ID":"04084062-c67c-4433-bc9c-76203d0a326e","Type":"ContainerStarted","Data":"508a044c3f1d584f9c821bc289398bcdb7799b7fe7c2e60db6edf5ef87d59c85"} Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.815980 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.819298 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-wvhbl" event={"ID":"281e04fa-65a9-4798-a371-3a644e34ec2d","Type":"ContainerStarted","Data":"df18a639ac4d63e834cd2e263dc8edde8a49b3fe283c2f89ae1cbdec2f3a0e7d"} Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.819558 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.820764 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" event={"ID":"c7337880-9125-485d-9767-86a69b5f1391","Type":"ContainerDied","Data":"3eda50e1f509310bd8867831d62da9937092a6d286964f13380df5bdbae81051"} Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.820797 5033 scope.go:117] "RemoveContainer" containerID="e7989b4051c29bf2783d59cb4c2fc4258842cb8b924a5db61ef768771152a15d" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.820903 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-slznn" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.822964 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-84cln" event={"ID":"005e3882-a184-4d2d-9c00-a3de51d47b6e","Type":"ContainerStarted","Data":"e450f69761230896b9a6364295244ea578c7378fcb48f9969ee85627ca9d0f6d"} Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.839964 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" podStartSLOduration=2.8399446729999998 podStartE2EDuration="2.839944673s" podCreationTimestamp="2026-02-26 16:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:01:43.831565243 +0000 UTC m=+1231.044163903" watchObservedRunningTime="2026-02-26 16:01:43.839944673 +0000 UTC m=+1231.052543333" Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.898877 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-slznn"] Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.910433 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-slznn"] Feb 26 16:01:43 crc kubenswrapper[5033]: I0226 16:01:43.915398 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-wvhbl" podStartSLOduration=4.496785837 podStartE2EDuration="10.915375663s" podCreationTimestamp="2026-02-26 16:01:33 +0000 UTC" firstStartedPulling="2026-02-26 16:01:34.110179065 +0000 UTC m=+1221.322777725" lastFinishedPulling="2026-02-26 16:01:40.528768901 +0000 UTC m=+1227.741367551" observedRunningTime="2026-02-26 16:01:43.909378826 +0000 UTC m=+1231.121977516" watchObservedRunningTime="2026-02-26 16:01:43.915375663 +0000 UTC m=+1231.127974323" Feb 26 16:01:44 crc kubenswrapper[5033]: I0226 16:01:44.002063 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:44 crc kubenswrapper[5033]: E0226 16:01:44.002308 5033 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 16:01:44 crc kubenswrapper[5033]: E0226 16:01:44.002352 5033 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 16:01:44 crc kubenswrapper[5033]: E0226 16:01:44.002425 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift podName:8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9 nodeName:}" failed. No retries permitted until 2026-02-26 16:01:46.002406515 +0000 UTC m=+1233.215005165 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift") pod "swift-storage-0" (UID: "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9") : configmap "swift-ring-files" not found Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.647063 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7337880-9125-485d-9767-86a69b5f1391" path="/var/lib/kubelet/pods/c7337880-9125-485d-9767-86a69b5f1391/volumes" Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.760142 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-c9b8h"] Feb 26 16:01:45 crc kubenswrapper[5033]: E0226 16:01:45.761019 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7337880-9125-485d-9767-86a69b5f1391" containerName="init" Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.761045 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7337880-9125-485d-9767-86a69b5f1391" containerName="init" Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.762303 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7337880-9125-485d-9767-86a69b5f1391" containerName="init" Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.765047 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-c9b8h" Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.769016 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.792366 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c9b8h"] Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.846654 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/264942a5-9225-48d0-9a43-27bf495fda2c-operator-scripts\") pod \"root-account-create-update-c9b8h\" (UID: \"264942a5-9225-48d0-9a43-27bf495fda2c\") " pod="openstack/root-account-create-update-c9b8h" Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.846763 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxpc2\" (UniqueName: \"kubernetes.io/projected/264942a5-9225-48d0-9a43-27bf495fda2c-kube-api-access-zxpc2\") pod \"root-account-create-update-c9b8h\" (UID: \"264942a5-9225-48d0-9a43-27bf495fda2c\") " pod="openstack/root-account-create-update-c9b8h" Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.952867 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/264942a5-9225-48d0-9a43-27bf495fda2c-operator-scripts\") pod \"root-account-create-update-c9b8h\" (UID: \"264942a5-9225-48d0-9a43-27bf495fda2c\") " pod="openstack/root-account-create-update-c9b8h" Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.952969 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxpc2\" (UniqueName: \"kubernetes.io/projected/264942a5-9225-48d0-9a43-27bf495fda2c-kube-api-access-zxpc2\") pod \"root-account-create-update-c9b8h\" (UID: 
\"264942a5-9225-48d0-9a43-27bf495fda2c\") " pod="openstack/root-account-create-update-c9b8h" Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.954826 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/264942a5-9225-48d0-9a43-27bf495fda2c-operator-scripts\") pod \"root-account-create-update-c9b8h\" (UID: \"264942a5-9225-48d0-9a43-27bf495fda2c\") " pod="openstack/root-account-create-update-c9b8h" Feb 26 16:01:45 crc kubenswrapper[5033]: I0226 16:01:45.976298 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxpc2\" (UniqueName: \"kubernetes.io/projected/264942a5-9225-48d0-9a43-27bf495fda2c-kube-api-access-zxpc2\") pod \"root-account-create-update-c9b8h\" (UID: \"264942a5-9225-48d0-9a43-27bf495fda2c\") " pod="openstack/root-account-create-update-c9b8h" Feb 26 16:01:46 crc kubenswrapper[5033]: I0226 16:01:46.054923 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:46 crc kubenswrapper[5033]: E0226 16:01:46.055143 5033 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 16:01:46 crc kubenswrapper[5033]: E0226 16:01:46.055180 5033 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 16:01:46 crc kubenswrapper[5033]: E0226 16:01:46.055276 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift podName:8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9 nodeName:}" failed. 
No retries permitted until 2026-02-26 16:01:50.055232781 +0000 UTC m=+1237.267831441 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift") pod "swift-storage-0" (UID: "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9") : configmap "swift-ring-files" not found Feb 26 16:01:46 crc kubenswrapper[5033]: I0226 16:01:46.084355 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c9b8h" Feb 26 16:01:46 crc kubenswrapper[5033]: I0226 16:01:46.856118 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-84cln" event={"ID":"005e3882-a184-4d2d-9c00-a3de51d47b6e","Type":"ContainerStarted","Data":"49468fafa60ce3443569ed80de49ce1efc39535d4d172f5cb2cbc367b94cfb46"} Feb 26 16:01:46 crc kubenswrapper[5033]: I0226 16:01:46.876692 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-84cln" podStartSLOduration=1.885217689 podStartE2EDuration="4.876672729s" podCreationTimestamp="2026-02-26 16:01:42 +0000 UTC" firstStartedPulling="2026-02-26 16:01:43.595730268 +0000 UTC m=+1230.808328958" lastFinishedPulling="2026-02-26 16:01:46.587185348 +0000 UTC m=+1233.799783998" observedRunningTime="2026-02-26 16:01:46.873955847 +0000 UTC m=+1234.086554517" watchObservedRunningTime="2026-02-26 16:01:46.876672729 +0000 UTC m=+1234.089271379" Feb 26 16:01:46 crc kubenswrapper[5033]: I0226 16:01:46.931505 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c9b8h"] Feb 26 16:01:46 crc kubenswrapper[5033]: W0226 16:01:46.938570 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod264942a5_9225_48d0_9a43_27bf495fda2c.slice/crio-2d73af71e721ae24198dd533f1fa26a328dee7580ac8428c657a481ff9aad231 WatchSource:0}: Error finding 
container 2d73af71e721ae24198dd533f1fa26a328dee7580ac8428c657a481ff9aad231: Status 404 returned error can't find the container with id 2d73af71e721ae24198dd533f1fa26a328dee7580ac8428c657a481ff9aad231 Feb 26 16:01:47 crc kubenswrapper[5033]: I0226 16:01:47.865233 5033 generic.go:334] "Generic (PLEG): container finished" podID="264942a5-9225-48d0-9a43-27bf495fda2c" containerID="20b248e0fb1d9960be2a68c40ac5b87b789f7097644b42650194a80f3d646412" exitCode=0 Feb 26 16:01:47 crc kubenswrapper[5033]: I0226 16:01:47.865377 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c9b8h" event={"ID":"264942a5-9225-48d0-9a43-27bf495fda2c","Type":"ContainerDied","Data":"20b248e0fb1d9960be2a68c40ac5b87b789f7097644b42650194a80f3d646412"} Feb 26 16:01:47 crc kubenswrapper[5033]: I0226 16:01:47.868934 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c9b8h" event={"ID":"264942a5-9225-48d0-9a43-27bf495fda2c","Type":"ContainerStarted","Data":"2d73af71e721ae24198dd533f1fa26a328dee7580ac8428c657a481ff9aad231"} Feb 26 16:01:48 crc kubenswrapper[5033]: I0226 16:01:48.593498 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:48 crc kubenswrapper[5033]: I0226 16:01:48.858934 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-h9h8w"] Feb 26 16:01:48 crc kubenswrapper[5033]: I0226 16:01:48.859945 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-h9h8w" Feb 26 16:01:48 crc kubenswrapper[5033]: I0226 16:01:48.871907 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-h9h8w"] Feb 26 16:01:48 crc kubenswrapper[5033]: I0226 16:01:48.960594 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3699-account-create-update-4dzzm"] Feb 26 16:01:48 crc kubenswrapper[5033]: I0226 16:01:48.962111 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3699-account-create-update-4dzzm" Feb 26 16:01:48 crc kubenswrapper[5033]: I0226 16:01:48.966371 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 26 16:01:48 crc kubenswrapper[5033]: I0226 16:01:48.972279 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3699-account-create-update-4dzzm"] Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.018179 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdtdb\" (UniqueName: \"kubernetes.io/projected/0558b5db-1cf4-477a-8874-cf2931891f31-kube-api-access-sdtdb\") pod \"glance-db-create-h9h8w\" (UID: \"0558b5db-1cf4-477a-8874-cf2931891f31\") " pod="openstack/glance-db-create-h9h8w" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.018220 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0558b5db-1cf4-477a-8874-cf2931891f31-operator-scripts\") pod \"glance-db-create-h9h8w\" (UID: \"0558b5db-1cf4-477a-8874-cf2931891f31\") " pod="openstack/glance-db-create-h9h8w" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.122845 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdtdb\" (UniqueName: 
\"kubernetes.io/projected/0558b5db-1cf4-477a-8874-cf2931891f31-kube-api-access-sdtdb\") pod \"glance-db-create-h9h8w\" (UID: \"0558b5db-1cf4-477a-8874-cf2931891f31\") " pod="openstack/glance-db-create-h9h8w" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.122904 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0558b5db-1cf4-477a-8874-cf2931891f31-operator-scripts\") pod \"glance-db-create-h9h8w\" (UID: \"0558b5db-1cf4-477a-8874-cf2931891f31\") " pod="openstack/glance-db-create-h9h8w" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.122948 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0528d63-bfdb-4036-af0f-4900569f04c8-operator-scripts\") pod \"glance-3699-account-create-update-4dzzm\" (UID: \"a0528d63-bfdb-4036-af0f-4900569f04c8\") " pod="openstack/glance-3699-account-create-update-4dzzm" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.122990 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdhdt\" (UniqueName: \"kubernetes.io/projected/a0528d63-bfdb-4036-af0f-4900569f04c8-kube-api-access-fdhdt\") pod \"glance-3699-account-create-update-4dzzm\" (UID: \"a0528d63-bfdb-4036-af0f-4900569f04c8\") " pod="openstack/glance-3699-account-create-update-4dzzm" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.124640 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0558b5db-1cf4-477a-8874-cf2931891f31-operator-scripts\") pod \"glance-db-create-h9h8w\" (UID: \"0558b5db-1cf4-477a-8874-cf2931891f31\") " pod="openstack/glance-db-create-h9h8w" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.151843 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sdtdb\" (UniqueName: \"kubernetes.io/projected/0558b5db-1cf4-477a-8874-cf2931891f31-kube-api-access-sdtdb\") pod \"glance-db-create-h9h8w\" (UID: \"0558b5db-1cf4-477a-8874-cf2931891f31\") " pod="openstack/glance-db-create-h9h8w" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.177932 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-h9h8w" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.226511 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdhdt\" (UniqueName: \"kubernetes.io/projected/a0528d63-bfdb-4036-af0f-4900569f04c8-kube-api-access-fdhdt\") pod \"glance-3699-account-create-update-4dzzm\" (UID: \"a0528d63-bfdb-4036-af0f-4900569f04c8\") " pod="openstack/glance-3699-account-create-update-4dzzm" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.228041 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0528d63-bfdb-4036-af0f-4900569f04c8-operator-scripts\") pod \"glance-3699-account-create-update-4dzzm\" (UID: \"a0528d63-bfdb-4036-af0f-4900569f04c8\") " pod="openstack/glance-3699-account-create-update-4dzzm" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.232856 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0528d63-bfdb-4036-af0f-4900569f04c8-operator-scripts\") pod \"glance-3699-account-create-update-4dzzm\" (UID: \"a0528d63-bfdb-4036-af0f-4900569f04c8\") " pod="openstack/glance-3699-account-create-update-4dzzm" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.258679 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdhdt\" (UniqueName: \"kubernetes.io/projected/a0528d63-bfdb-4036-af0f-4900569f04c8-kube-api-access-fdhdt\") pod \"glance-3699-account-create-update-4dzzm\" (UID: 
\"a0528d63-bfdb-4036-af0f-4900569f04c8\") " pod="openstack/glance-3699-account-create-update-4dzzm" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.290124 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3699-account-create-update-4dzzm" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.309346 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c9b8h" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.431007 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxpc2\" (UniqueName: \"kubernetes.io/projected/264942a5-9225-48d0-9a43-27bf495fda2c-kube-api-access-zxpc2\") pod \"264942a5-9225-48d0-9a43-27bf495fda2c\" (UID: \"264942a5-9225-48d0-9a43-27bf495fda2c\") " Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.431241 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/264942a5-9225-48d0-9a43-27bf495fda2c-operator-scripts\") pod \"264942a5-9225-48d0-9a43-27bf495fda2c\" (UID: \"264942a5-9225-48d0-9a43-27bf495fda2c\") " Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.432155 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/264942a5-9225-48d0-9a43-27bf495fda2c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "264942a5-9225-48d0-9a43-27bf495fda2c" (UID: "264942a5-9225-48d0-9a43-27bf495fda2c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.434760 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/264942a5-9225-48d0-9a43-27bf495fda2c-kube-api-access-zxpc2" (OuterVolumeSpecName: "kube-api-access-zxpc2") pod "264942a5-9225-48d0-9a43-27bf495fda2c" (UID: "264942a5-9225-48d0-9a43-27bf495fda2c"). InnerVolumeSpecName "kube-api-access-zxpc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.532890 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/264942a5-9225-48d0-9a43-27bf495fda2c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.532930 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxpc2\" (UniqueName: \"kubernetes.io/projected/264942a5-9225-48d0-9a43-27bf495fda2c-kube-api-access-zxpc2\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.646205 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-zwrrq"] Feb 26 16:01:49 crc kubenswrapper[5033]: E0226 16:01:49.646655 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264942a5-9225-48d0-9a43-27bf495fda2c" containerName="mariadb-account-create-update" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.646687 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="264942a5-9225-48d0-9a43-27bf495fda2c" containerName="mariadb-account-create-update" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.646932 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="264942a5-9225-48d0-9a43-27bf495fda2c" containerName="mariadb-account-create-update" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.647730 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zwrrq" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.648158 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zwrrq"] Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.660917 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-h9h8w"] Feb 26 16:01:49 crc kubenswrapper[5033]: W0226 16:01:49.669133 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0558b5db_1cf4_477a_8874_cf2931891f31.slice/crio-21f72bedb158792fe77ffe9482e0d02989288f753a06b00e8502099ffe70462b WatchSource:0}: Error finding container 21f72bedb158792fe77ffe9482e0d02989288f753a06b00e8502099ffe70462b: Status 404 returned error can't find the container with id 21f72bedb158792fe77ffe9482e0d02989288f753a06b00e8502099ffe70462b Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.739030 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-operator-scripts\") pod \"keystone-db-create-zwrrq\" (UID: \"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f\") " pod="openstack/keystone-db-create-zwrrq" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.739122 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxj4q\" (UniqueName: \"kubernetes.io/projected/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-kube-api-access-nxj4q\") pod \"keystone-db-create-zwrrq\" (UID: \"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f\") " pod="openstack/keystone-db-create-zwrrq" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.752926 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-65a8-account-create-update-md5x4"] Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.753985 5033 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65a8-account-create-update-md5x4" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.762365 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.763276 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-65a8-account-create-update-md5x4"] Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.772428 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3699-account-create-update-4dzzm"] Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.842298 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sd2b\" (UniqueName: \"kubernetes.io/projected/c9de3c3b-afe5-4a88-99bd-28da4fe00092-kube-api-access-9sd2b\") pod \"keystone-65a8-account-create-update-md5x4\" (UID: \"c9de3c3b-afe5-4a88-99bd-28da4fe00092\") " pod="openstack/keystone-65a8-account-create-update-md5x4" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.842575 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-operator-scripts\") pod \"keystone-db-create-zwrrq\" (UID: \"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f\") " pod="openstack/keystone-db-create-zwrrq" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.842733 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxj4q\" (UniqueName: \"kubernetes.io/projected/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-kube-api-access-nxj4q\") pod \"keystone-db-create-zwrrq\" (UID: \"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f\") " pod="openstack/keystone-db-create-zwrrq" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.842890 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9de3c3b-afe5-4a88-99bd-28da4fe00092-operator-scripts\") pod \"keystone-65a8-account-create-update-md5x4\" (UID: \"c9de3c3b-afe5-4a88-99bd-28da4fe00092\") " pod="openstack/keystone-65a8-account-create-update-md5x4" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.845582 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-operator-scripts\") pod \"keystone-db-create-zwrrq\" (UID: \"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f\") " pod="openstack/keystone-db-create-zwrrq" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.861912 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxj4q\" (UniqueName: \"kubernetes.io/projected/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-kube-api-access-nxj4q\") pod \"keystone-db-create-zwrrq\" (UID: \"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f\") " pod="openstack/keystone-db-create-zwrrq" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.884652 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-h9h8w" event={"ID":"0558b5db-1cf4-477a-8874-cf2931891f31","Type":"ContainerStarted","Data":"d0ff186c2752535cbe0156aa95c8b137562b367feee80bda35bbc94a62258970"} Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.885235 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-h9h8w" event={"ID":"0558b5db-1cf4-477a-8874-cf2931891f31","Type":"ContainerStarted","Data":"21f72bedb158792fe77ffe9482e0d02989288f753a06b00e8502099ffe70462b"} Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.886241 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3699-account-create-update-4dzzm" 
event={"ID":"a0528d63-bfdb-4036-af0f-4900569f04c8","Type":"ContainerStarted","Data":"937fbd5759eaa9acdf1ff21282468bf67549c7b94dd9b8e642a38934341b210e"} Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.890198 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c9b8h" event={"ID":"264942a5-9225-48d0-9a43-27bf495fda2c","Type":"ContainerDied","Data":"2d73af71e721ae24198dd533f1fa26a328dee7580ac8428c657a481ff9aad231"} Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.890250 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d73af71e721ae24198dd533f1fa26a328dee7580ac8428c657a481ff9aad231" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.890779 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c9b8h" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.902546 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-h9h8w" podStartSLOduration=1.9025304090000001 podStartE2EDuration="1.902530409s" podCreationTimestamp="2026-02-26 16:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:01:49.898459217 +0000 UTC m=+1237.111057867" watchObservedRunningTime="2026-02-26 16:01:49.902530409 +0000 UTC m=+1237.115129059" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.935034 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-cgff2"] Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.936110 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cgff2" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.944347 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sd2b\" (UniqueName: \"kubernetes.io/projected/c9de3c3b-afe5-4a88-99bd-28da4fe00092-kube-api-access-9sd2b\") pod \"keystone-65a8-account-create-update-md5x4\" (UID: \"c9de3c3b-afe5-4a88-99bd-28da4fe00092\") " pod="openstack/keystone-65a8-account-create-update-md5x4" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.944432 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9de3c3b-afe5-4a88-99bd-28da4fe00092-operator-scripts\") pod \"keystone-65a8-account-create-update-md5x4\" (UID: \"c9de3c3b-afe5-4a88-99bd-28da4fe00092\") " pod="openstack/keystone-65a8-account-create-update-md5x4" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.945032 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9de3c3b-afe5-4a88-99bd-28da4fe00092-operator-scripts\") pod \"keystone-65a8-account-create-update-md5x4\" (UID: \"c9de3c3b-afe5-4a88-99bd-28da4fe00092\") " pod="openstack/keystone-65a8-account-create-update-md5x4" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.945291 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cgff2"] Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.964070 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sd2b\" (UniqueName: \"kubernetes.io/projected/c9de3c3b-afe5-4a88-99bd-28da4fe00092-kube-api-access-9sd2b\") pod \"keystone-65a8-account-create-update-md5x4\" (UID: \"c9de3c3b-afe5-4a88-99bd-28da4fe00092\") " pod="openstack/keystone-65a8-account-create-update-md5x4" Feb 26 16:01:49 crc kubenswrapper[5033]: I0226 16:01:49.976304 5033 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zwrrq" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.043983 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fdf9-account-create-update-hcvt7"] Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.045235 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fdf9-account-create-update-hcvt7" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.046673 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlh44\" (UniqueName: \"kubernetes.io/projected/771e4d50-f290-4165-99c3-dec5b3582458-kube-api-access-mlh44\") pod \"placement-db-create-cgff2\" (UID: \"771e4d50-f290-4165-99c3-dec5b3582458\") " pod="openstack/placement-db-create-cgff2" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.046718 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/771e4d50-f290-4165-99c3-dec5b3582458-operator-scripts\") pod \"placement-db-create-cgff2\" (UID: \"771e4d50-f290-4165-99c3-dec5b3582458\") " pod="openstack/placement-db-create-cgff2" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.048853 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.059023 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fdf9-account-create-update-hcvt7"] Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.080901 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-65a8-account-create-update-md5x4" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.148845 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nggbt\" (UniqueName: \"kubernetes.io/projected/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-kube-api-access-nggbt\") pod \"placement-fdf9-account-create-update-hcvt7\" (UID: \"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5\") " pod="openstack/placement-fdf9-account-create-update-hcvt7" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.149061 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlh44\" (UniqueName: \"kubernetes.io/projected/771e4d50-f290-4165-99c3-dec5b3582458-kube-api-access-mlh44\") pod \"placement-db-create-cgff2\" (UID: \"771e4d50-f290-4165-99c3-dec5b3582458\") " pod="openstack/placement-db-create-cgff2" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.149101 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-operator-scripts\") pod \"placement-fdf9-account-create-update-hcvt7\" (UID: \"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5\") " pod="openstack/placement-fdf9-account-create-update-hcvt7" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.149154 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/771e4d50-f290-4165-99c3-dec5b3582458-operator-scripts\") pod \"placement-db-create-cgff2\" (UID: \"771e4d50-f290-4165-99c3-dec5b3582458\") " pod="openstack/placement-db-create-cgff2" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.149370 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:50 crc kubenswrapper[5033]: E0226 16:01:50.149574 5033 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 16:01:50 crc kubenswrapper[5033]: E0226 16:01:50.149592 5033 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 16:01:50 crc kubenswrapper[5033]: E0226 16:01:50.149657 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift podName:8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9 nodeName:}" failed. No retries permitted until 2026-02-26 16:01:58.149640549 +0000 UTC m=+1245.362239199 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift") pod "swift-storage-0" (UID: "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9") : configmap "swift-ring-files" not found Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.149963 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/771e4d50-f290-4165-99c3-dec5b3582458-operator-scripts\") pod \"placement-db-create-cgff2\" (UID: \"771e4d50-f290-4165-99c3-dec5b3582458\") " pod="openstack/placement-db-create-cgff2" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.195898 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlh44\" (UniqueName: \"kubernetes.io/projected/771e4d50-f290-4165-99c3-dec5b3582458-kube-api-access-mlh44\") pod \"placement-db-create-cgff2\" (UID: \"771e4d50-f290-4165-99c3-dec5b3582458\") " pod="openstack/placement-db-create-cgff2" Feb 26 16:01:50 crc 
kubenswrapper[5033]: I0226 16:01:50.251804 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-operator-scripts\") pod \"placement-fdf9-account-create-update-hcvt7\" (UID: \"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5\") " pod="openstack/placement-fdf9-account-create-update-hcvt7" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.251991 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nggbt\" (UniqueName: \"kubernetes.io/projected/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-kube-api-access-nggbt\") pod \"placement-fdf9-account-create-update-hcvt7\" (UID: \"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5\") " pod="openstack/placement-fdf9-account-create-update-hcvt7" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.252649 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-operator-scripts\") pod \"placement-fdf9-account-create-update-hcvt7\" (UID: \"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5\") " pod="openstack/placement-fdf9-account-create-update-hcvt7" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.255913 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cgff2" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.267619 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nggbt\" (UniqueName: \"kubernetes.io/projected/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-kube-api-access-nggbt\") pod \"placement-fdf9-account-create-update-hcvt7\" (UID: \"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5\") " pod="openstack/placement-fdf9-account-create-update-hcvt7" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.398809 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fdf9-account-create-update-hcvt7" Feb 26 16:01:50 crc kubenswrapper[5033]: I0226 16:01:50.496271 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zwrrq"] Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.616115 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-65a8-account-create-update-md5x4"] Feb 26 16:01:51 crc kubenswrapper[5033]: W0226 16:01:50.618994 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9de3c3b_afe5_4a88_99bd_28da4fe00092.slice/crio-6fb1207433ec8e4c7fac57dc048b4cc93871229019ff7c9cc4feed5f6c94cb4e WatchSource:0}: Error finding container 6fb1207433ec8e4c7fac57dc048b4cc93871229019ff7c9cc4feed5f6c94cb4e: Status 404 returned error can't find the container with id 6fb1207433ec8e4c7fac57dc048b4cc93871229019ff7c9cc4feed5f6c94cb4e Feb 26 16:01:51 crc kubenswrapper[5033]: W0226 16:01:50.724715 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod771e4d50_f290_4165_99c3_dec5b3582458.slice/crio-701f7c144d744610d3dde68bdf61e1c0fae717fd8b3c9bb11822f575a0d21349 WatchSource:0}: Error finding container 701f7c144d744610d3dde68bdf61e1c0fae717fd8b3c9bb11822f575a0d21349: Status 404 returned error can't find the container with id 701f7c144d744610d3dde68bdf61e1c0fae717fd8b3c9bb11822f575a0d21349 Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.732185 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cgff2"] Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.898712 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65a8-account-create-update-md5x4" event={"ID":"c9de3c3b-afe5-4a88-99bd-28da4fe00092","Type":"ContainerStarted","Data":"d9d43a57c427770c384b76badb61f86c68f296b829ada01856cbd642d294f82c"} Feb 26 
16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.898763 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65a8-account-create-update-md5x4" event={"ID":"c9de3c3b-afe5-4a88-99bd-28da4fe00092","Type":"ContainerStarted","Data":"6fb1207433ec8e4c7fac57dc048b4cc93871229019ff7c9cc4feed5f6c94cb4e"} Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.901654 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-h9h8w" event={"ID":"0558b5db-1cf4-477a-8874-cf2931891f31","Type":"ContainerDied","Data":"d0ff186c2752535cbe0156aa95c8b137562b367feee80bda35bbc94a62258970"} Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.901541 5033 generic.go:334] "Generic (PLEG): container finished" podID="0558b5db-1cf4-477a-8874-cf2931891f31" containerID="d0ff186c2752535cbe0156aa95c8b137562b367feee80bda35bbc94a62258970" exitCode=0 Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.907492 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zwrrq" event={"ID":"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f","Type":"ContainerStarted","Data":"6046c4769194f5929905a2674f35beebb80e8dba1a305d0e97be6b50d6ce657f"} Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.907531 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zwrrq" event={"ID":"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f","Type":"ContainerStarted","Data":"12b4c4f0ea82b695efd0ab59f53eee96f4cd8d0d2d2acc2ed96f40b5284285be"} Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.910209 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cgff2" event={"ID":"771e4d50-f290-4165-99c3-dec5b3582458","Type":"ContainerStarted","Data":"d4df572ff20d78acf262ce21a0a068f8731d9a8a8e278d67a66bcc4ef90cd096"} Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.910245 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cgff2" 
event={"ID":"771e4d50-f290-4165-99c3-dec5b3582458","Type":"ContainerStarted","Data":"701f7c144d744610d3dde68bdf61e1c0fae717fd8b3c9bb11822f575a0d21349"} Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.914518 5033 generic.go:334] "Generic (PLEG): container finished" podID="a0528d63-bfdb-4036-af0f-4900569f04c8" containerID="1c9c4fd07b0b2d14f63fa13680cdf51683f0305829e18896e63fa3f4e39ded84" exitCode=0 Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.914578 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3699-account-create-update-4dzzm" event={"ID":"a0528d63-bfdb-4036-af0f-4900569f04c8","Type":"ContainerDied","Data":"1c9c4fd07b0b2d14f63fa13680cdf51683f0305829e18896e63fa3f4e39ded84"} Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.919904 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-65a8-account-create-update-md5x4" podStartSLOduration=1.9198817259999998 podStartE2EDuration="1.919881726s" podCreationTimestamp="2026-02-26 16:01:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:01:50.914549925 +0000 UTC m=+1238.127148585" watchObservedRunningTime="2026-02-26 16:01:50.919881726 +0000 UTC m=+1238.132480376" Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.948175 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-zwrrq" podStartSLOduration=1.948147327 podStartE2EDuration="1.948147327s" podCreationTimestamp="2026-02-26 16:01:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:01:50.942059479 +0000 UTC m=+1238.154658129" watchObservedRunningTime="2026-02-26 16:01:50.948147327 +0000 UTC m=+1238.160745987" Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:50.975828 5033 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/placement-db-create-cgff2" podStartSLOduration=1.975806414 podStartE2EDuration="1.975806414s" podCreationTimestamp="2026-02-26 16:01:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:01:50.974694679 +0000 UTC m=+1238.187293329" watchObservedRunningTime="2026-02-26 16:01:50.975806414 +0000 UTC m=+1238.188405064" Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.461892 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.520155 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-wvhbl"] Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.520548 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-wvhbl" podUID="281e04fa-65a9-4798-a371-3a644e34ec2d" containerName="dnsmasq-dns" containerID="cri-o://df18a639ac4d63e834cd2e263dc8edde8a49b3fe283c2f89ae1cbdec2f3a0e7d" gracePeriod=10 Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.938639 5033 generic.go:334] "Generic (PLEG): container finished" podID="281e04fa-65a9-4798-a371-3a644e34ec2d" containerID="df18a639ac4d63e834cd2e263dc8edde8a49b3fe283c2f89ae1cbdec2f3a0e7d" exitCode=0 Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.938681 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-wvhbl" event={"ID":"281e04fa-65a9-4798-a371-3a644e34ec2d","Type":"ContainerDied","Data":"df18a639ac4d63e834cd2e263dc8edde8a49b3fe283c2f89ae1cbdec2f3a0e7d"} Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.948325 5033 generic.go:334] "Generic (PLEG): container finished" podID="c9de3c3b-afe5-4a88-99bd-28da4fe00092" containerID="d9d43a57c427770c384b76badb61f86c68f296b829ada01856cbd642d294f82c" exitCode=0 Feb 26 
16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.948597 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65a8-account-create-update-md5x4" event={"ID":"c9de3c3b-afe5-4a88-99bd-28da4fe00092","Type":"ContainerDied","Data":"d9d43a57c427770c384b76badb61f86c68f296b829ada01856cbd642d294f82c"} Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.951543 5033 generic.go:334] "Generic (PLEG): container finished" podID="8ceab3ba-6b5a-4c0d-a81d-a17a9273837f" containerID="6046c4769194f5929905a2674f35beebb80e8dba1a305d0e97be6b50d6ce657f" exitCode=0 Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.951593 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zwrrq" event={"ID":"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f","Type":"ContainerDied","Data":"6046c4769194f5929905a2674f35beebb80e8dba1a305d0e97be6b50d6ce657f"} Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.958740 5033 generic.go:334] "Generic (PLEG): container finished" podID="771e4d50-f290-4165-99c3-dec5b3582458" containerID="d4df572ff20d78acf262ce21a0a068f8731d9a8a8e278d67a66bcc4ef90cd096" exitCode=0 Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.959039 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cgff2" event={"ID":"771e4d50-f290-4165-99c3-dec5b3582458","Type":"ContainerDied","Data":"d4df572ff20d78acf262ce21a0a068f8731d9a8a8e278d67a66bcc4ef90cd096"} Feb 26 16:01:51 crc kubenswrapper[5033]: I0226 16:01:51.963142 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fdf9-account-create-update-hcvt7"] Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.038180 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-c9b8h"] Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.044955 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-c9b8h"] Feb 26 16:01:52 crc 
kubenswrapper[5033]: I0226 16:01:52.121966 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-wvhbl" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.291162 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-sb\") pod \"281e04fa-65a9-4798-a371-3a644e34ec2d\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.291298 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-config\") pod \"281e04fa-65a9-4798-a371-3a644e34ec2d\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.291371 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-nb\") pod \"281e04fa-65a9-4798-a371-3a644e34ec2d\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.291450 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b75cs\" (UniqueName: \"kubernetes.io/projected/281e04fa-65a9-4798-a371-3a644e34ec2d-kube-api-access-b75cs\") pod \"281e04fa-65a9-4798-a371-3a644e34ec2d\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.291480 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-dns-svc\") pod \"281e04fa-65a9-4798-a371-3a644e34ec2d\" (UID: \"281e04fa-65a9-4798-a371-3a644e34ec2d\") " Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.306137 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/281e04fa-65a9-4798-a371-3a644e34ec2d-kube-api-access-b75cs" (OuterVolumeSpecName: "kube-api-access-b75cs") pod "281e04fa-65a9-4798-a371-3a644e34ec2d" (UID: "281e04fa-65a9-4798-a371-3a644e34ec2d"). InnerVolumeSpecName "kube-api-access-b75cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.335049 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "281e04fa-65a9-4798-a371-3a644e34ec2d" (UID: "281e04fa-65a9-4798-a371-3a644e34ec2d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.341925 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-config" (OuterVolumeSpecName: "config") pod "281e04fa-65a9-4798-a371-3a644e34ec2d" (UID: "281e04fa-65a9-4798-a371-3a644e34ec2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.364186 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "281e04fa-65a9-4798-a371-3a644e34ec2d" (UID: "281e04fa-65a9-4798-a371-3a644e34ec2d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.374010 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "281e04fa-65a9-4798-a371-3a644e34ec2d" (UID: "281e04fa-65a9-4798-a371-3a644e34ec2d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.393232 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.393415 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b75cs\" (UniqueName: \"kubernetes.io/projected/281e04fa-65a9-4798-a371-3a644e34ec2d-kube-api-access-b75cs\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.393475 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.393554 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.393604 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281e04fa-65a9-4798-a371-3a644e34ec2d-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.435248 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-h9h8w" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.504635 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3699-account-create-update-4dzzm" Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.597248 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0558b5db-1cf4-477a-8874-cf2931891f31-operator-scripts\") pod \"0558b5db-1cf4-477a-8874-cf2931891f31\" (UID: \"0558b5db-1cf4-477a-8874-cf2931891f31\") " Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.597439 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdhdt\" (UniqueName: \"kubernetes.io/projected/a0528d63-bfdb-4036-af0f-4900569f04c8-kube-api-access-fdhdt\") pod \"a0528d63-bfdb-4036-af0f-4900569f04c8\" (UID: \"a0528d63-bfdb-4036-af0f-4900569f04c8\") " Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.597466 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdtdb\" (UniqueName: \"kubernetes.io/projected/0558b5db-1cf4-477a-8874-cf2931891f31-kube-api-access-sdtdb\") pod \"0558b5db-1cf4-477a-8874-cf2931891f31\" (UID: \"0558b5db-1cf4-477a-8874-cf2931891f31\") " Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.597509 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0528d63-bfdb-4036-af0f-4900569f04c8-operator-scripts\") pod \"a0528d63-bfdb-4036-af0f-4900569f04c8\" (UID: \"a0528d63-bfdb-4036-af0f-4900569f04c8\") " Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.598016 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0558b5db-1cf4-477a-8874-cf2931891f31-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"0558b5db-1cf4-477a-8874-cf2931891f31" (UID: "0558b5db-1cf4-477a-8874-cf2931891f31"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.598293 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0528d63-bfdb-4036-af0f-4900569f04c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0528d63-bfdb-4036-af0f-4900569f04c8" (UID: "a0528d63-bfdb-4036-af0f-4900569f04c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.598450 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0558b5db-1cf4-477a-8874-cf2931891f31-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.598497 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0528d63-bfdb-4036-af0f-4900569f04c8-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.602516 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0528d63-bfdb-4036-af0f-4900569f04c8-kube-api-access-fdhdt" (OuterVolumeSpecName: "kube-api-access-fdhdt") pod "a0528d63-bfdb-4036-af0f-4900569f04c8" (UID: "a0528d63-bfdb-4036-af0f-4900569f04c8"). InnerVolumeSpecName "kube-api-access-fdhdt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.602763 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0558b5db-1cf4-477a-8874-cf2931891f31-kube-api-access-sdtdb" (OuterVolumeSpecName: "kube-api-access-sdtdb") pod "0558b5db-1cf4-477a-8874-cf2931891f31" (UID: "0558b5db-1cf4-477a-8874-cf2931891f31"). InnerVolumeSpecName "kube-api-access-sdtdb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.700187 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdhdt\" (UniqueName: \"kubernetes.io/projected/a0528d63-bfdb-4036-af0f-4900569f04c8-kube-api-access-fdhdt\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.700216 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdtdb\" (UniqueName: \"kubernetes.io/projected/0558b5db-1cf4-477a-8874-cf2931891f31-kube-api-access-sdtdb\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.966450 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-wvhbl" event={"ID":"281e04fa-65a9-4798-a371-3a644e34ec2d","Type":"ContainerDied","Data":"a64eaca8fd089337ae86d5bc42d512343d53f540b76e55b955dffd2e994f0736"}
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.966685 5033 scope.go:117] "RemoveContainer" containerID="df18a639ac4d63e834cd2e263dc8edde8a49b3fe283c2f89ae1cbdec2f3a0e7d"
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.966468 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-wvhbl"
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.972458 5033 generic.go:334] "Generic (PLEG): container finished" podID="ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5" containerID="ca9dacffdf3c3bcb9175574af945576f2cf13f888c591849b03f35e84cfd9fa4" exitCode=0
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.972595 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fdf9-account-create-update-hcvt7" event={"ID":"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5","Type":"ContainerDied","Data":"ca9dacffdf3c3bcb9175574af945576f2cf13f888c591849b03f35e84cfd9fa4"}
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.972648 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fdf9-account-create-update-hcvt7" event={"ID":"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5","Type":"ContainerStarted","Data":"b11107829cc2a6f596291f55d773302ce13d9035e238328ef7e75e9f10f429c8"}
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.975696 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-h9h8w" event={"ID":"0558b5db-1cf4-477a-8874-cf2931891f31","Type":"ContainerDied","Data":"21f72bedb158792fe77ffe9482e0d02989288f753a06b00e8502099ffe70462b"}
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.975727 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f72bedb158792fe77ffe9482e0d02989288f753a06b00e8502099ffe70462b"
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.975824 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-h9h8w"
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.977696 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3699-account-create-update-4dzzm"
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.977855 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3699-account-create-update-4dzzm" event={"ID":"a0528d63-bfdb-4036-af0f-4900569f04c8","Type":"ContainerDied","Data":"937fbd5759eaa9acdf1ff21282468bf67549c7b94dd9b8e642a38934341b210e"}
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.978337 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="937fbd5759eaa9acdf1ff21282468bf67549c7b94dd9b8e642a38934341b210e"
Feb 26 16:01:52 crc kubenswrapper[5033]: I0226 16:01:52.998992 5033 scope.go:117] "RemoveContainer" containerID="4215c2df6c92481f13cb6b9eccd314e68bbcf30cd08a0a26f66574508f7185fc"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.038753 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-wvhbl"]
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.052909 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-wvhbl"]
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.265767 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65a8-account-create-update-md5x4"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.423154 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9de3c3b-afe5-4a88-99bd-28da4fe00092-operator-scripts\") pod \"c9de3c3b-afe5-4a88-99bd-28da4fe00092\" (UID: \"c9de3c3b-afe5-4a88-99bd-28da4fe00092\") "
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.424314 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sd2b\" (UniqueName: \"kubernetes.io/projected/c9de3c3b-afe5-4a88-99bd-28da4fe00092-kube-api-access-9sd2b\") pod \"c9de3c3b-afe5-4a88-99bd-28da4fe00092\" (UID: \"c9de3c3b-afe5-4a88-99bd-28da4fe00092\") "
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.424061 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9de3c3b-afe5-4a88-99bd-28da4fe00092-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9de3c3b-afe5-4a88-99bd-28da4fe00092" (UID: "c9de3c3b-afe5-4a88-99bd-28da4fe00092"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.453025 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9de3c3b-afe5-4a88-99bd-28da4fe00092-kube-api-access-9sd2b" (OuterVolumeSpecName: "kube-api-access-9sd2b") pod "c9de3c3b-afe5-4a88-99bd-28da4fe00092" (UID: "c9de3c3b-afe5-4a88-99bd-28da4fe00092"). InnerVolumeSpecName "kube-api-access-9sd2b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.503279 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zwrrq"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.508144 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cgff2"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.531373 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sd2b\" (UniqueName: \"kubernetes.io/projected/c9de3c3b-afe5-4a88-99bd-28da4fe00092-kube-api-access-9sd2b\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.531410 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9de3c3b-afe5-4a88-99bd-28da4fe00092-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.565048 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.633972 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-operator-scripts\") pod \"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f\" (UID: \"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f\") "
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.634027 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/771e4d50-f290-4165-99c3-dec5b3582458-operator-scripts\") pod \"771e4d50-f290-4165-99c3-dec5b3582458\" (UID: \"771e4d50-f290-4165-99c3-dec5b3582458\") "
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.634115 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlh44\" (UniqueName: \"kubernetes.io/projected/771e4d50-f290-4165-99c3-dec5b3582458-kube-api-access-mlh44\") pod \"771e4d50-f290-4165-99c3-dec5b3582458\" (UID: \"771e4d50-f290-4165-99c3-dec5b3582458\") "
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.634143 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxj4q\" (UniqueName: \"kubernetes.io/projected/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-kube-api-access-nxj4q\") pod \"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f\" (UID: \"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f\") "
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.634731 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ceab3ba-6b5a-4c0d-a81d-a17a9273837f" (UID: "8ceab3ba-6b5a-4c0d-a81d-a17a9273837f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.634893 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.635155 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/771e4d50-f290-4165-99c3-dec5b3582458-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "771e4d50-f290-4165-99c3-dec5b3582458" (UID: "771e4d50-f290-4165-99c3-dec5b3582458"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.637801 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-kube-api-access-nxj4q" (OuterVolumeSpecName: "kube-api-access-nxj4q") pod "8ceab3ba-6b5a-4c0d-a81d-a17a9273837f" (UID: "8ceab3ba-6b5a-4c0d-a81d-a17a9273837f"). InnerVolumeSpecName "kube-api-access-nxj4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.639032 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/771e4d50-f290-4165-99c3-dec5b3582458-kube-api-access-mlh44" (OuterVolumeSpecName: "kube-api-access-mlh44") pod "771e4d50-f290-4165-99c3-dec5b3582458" (UID: "771e4d50-f290-4165-99c3-dec5b3582458"). InnerVolumeSpecName "kube-api-access-mlh44". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.643465 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="264942a5-9225-48d0-9a43-27bf495fda2c" path="/var/lib/kubelet/pods/264942a5-9225-48d0-9a43-27bf495fda2c/volumes"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.644125 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="281e04fa-65a9-4798-a371-3a644e34ec2d" path="/var/lib/kubelet/pods/281e04fa-65a9-4798-a371-3a644e34ec2d/volumes"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.737071 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/771e4d50-f290-4165-99c3-dec5b3582458-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.737106 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlh44\" (UniqueName: \"kubernetes.io/projected/771e4d50-f290-4165-99c3-dec5b3582458-kube-api-access-mlh44\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.737120 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxj4q\" (UniqueName: \"kubernetes.io/projected/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f-kube-api-access-nxj4q\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.988152 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65a8-account-create-update-md5x4" event={"ID":"c9de3c3b-afe5-4a88-99bd-28da4fe00092","Type":"ContainerDied","Data":"6fb1207433ec8e4c7fac57dc048b4cc93871229019ff7c9cc4feed5f6c94cb4e"}
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.988396 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fb1207433ec8e4c7fac57dc048b4cc93871229019ff7c9cc4feed5f6c94cb4e"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.988208 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65a8-account-create-update-md5x4"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.989907 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zwrrq"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.989931 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zwrrq" event={"ID":"8ceab3ba-6b5a-4c0d-a81d-a17a9273837f","Type":"ContainerDied","Data":"12b4c4f0ea82b695efd0ab59f53eee96f4cd8d0d2d2acc2ed96f40b5284285be"}
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.990145 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12b4c4f0ea82b695efd0ab59f53eee96f4cd8d0d2d2acc2ed96f40b5284285be"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.991690 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cgff2" event={"ID":"771e4d50-f290-4165-99c3-dec5b3582458","Type":"ContainerDied","Data":"701f7c144d744610d3dde68bdf61e1c0fae717fd8b3c9bb11822f575a0d21349"}
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.991718 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="701f7c144d744610d3dde68bdf61e1c0fae717fd8b3c9bb11822f575a0d21349"
Feb 26 16:01:53 crc kubenswrapper[5033]: I0226 16:01:53.991719 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cgff2"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109275 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9jqk7"]
Feb 26 16:01:54 crc kubenswrapper[5033]: E0226 16:01:54.109610 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ceab3ba-6b5a-4c0d-a81d-a17a9273837f" containerName="mariadb-database-create"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109628 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ceab3ba-6b5a-4c0d-a81d-a17a9273837f" containerName="mariadb-database-create"
Feb 26 16:01:54 crc kubenswrapper[5033]: E0226 16:01:54.109650 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="771e4d50-f290-4165-99c3-dec5b3582458" containerName="mariadb-database-create"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109657 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="771e4d50-f290-4165-99c3-dec5b3582458" containerName="mariadb-database-create"
Feb 26 16:01:54 crc kubenswrapper[5033]: E0226 16:01:54.109670 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0558b5db-1cf4-477a-8874-cf2931891f31" containerName="mariadb-database-create"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109676 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0558b5db-1cf4-477a-8874-cf2931891f31" containerName="mariadb-database-create"
Feb 26 16:01:54 crc kubenswrapper[5033]: E0226 16:01:54.109698 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0528d63-bfdb-4036-af0f-4900569f04c8" containerName="mariadb-account-create-update"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109704 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0528d63-bfdb-4036-af0f-4900569f04c8" containerName="mariadb-account-create-update"
Feb 26 16:01:54 crc kubenswrapper[5033]: E0226 16:01:54.109721 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281e04fa-65a9-4798-a371-3a644e34ec2d" containerName="init"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109728 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="281e04fa-65a9-4798-a371-3a644e34ec2d" containerName="init"
Feb 26 16:01:54 crc kubenswrapper[5033]: E0226 16:01:54.109743 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281e04fa-65a9-4798-a371-3a644e34ec2d" containerName="dnsmasq-dns"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109750 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="281e04fa-65a9-4798-a371-3a644e34ec2d" containerName="dnsmasq-dns"
Feb 26 16:01:54 crc kubenswrapper[5033]: E0226 16:01:54.109761 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9de3c3b-afe5-4a88-99bd-28da4fe00092" containerName="mariadb-account-create-update"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109767 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9de3c3b-afe5-4a88-99bd-28da4fe00092" containerName="mariadb-account-create-update"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109904 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0558b5db-1cf4-477a-8874-cf2931891f31" containerName="mariadb-database-create"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109916 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="771e4d50-f290-4165-99c3-dec5b3582458" containerName="mariadb-database-create"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109927 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ceab3ba-6b5a-4c0d-a81d-a17a9273837f" containerName="mariadb-database-create"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109938 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0528d63-bfdb-4036-af0f-4900569f04c8" containerName="mariadb-account-create-update"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109947 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="281e04fa-65a9-4798-a371-3a644e34ec2d" containerName="dnsmasq-dns"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.109956 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9de3c3b-afe5-4a88-99bd-28da4fe00092" containerName="mariadb-account-create-update"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.110448 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.118931 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qxt5t"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.124431 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9jqk7"]
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.124596 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.243641 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-combined-ca-bundle\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.244051 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdtqk\" (UniqueName: \"kubernetes.io/projected/40664161-2e8b-4145-8b7a-5ec3723150bf-kube-api-access-bdtqk\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.244095 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-db-sync-config-data\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.244327 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-config-data\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.299494 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fdf9-account-create-update-hcvt7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.346311 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-db-sync-config-data\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.346607 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-config-data\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.346927 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-combined-ca-bundle\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.347144 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdtqk\" (UniqueName: \"kubernetes.io/projected/40664161-2e8b-4145-8b7a-5ec3723150bf-kube-api-access-bdtqk\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.350192 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-config-data\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.354877 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-db-sync-config-data\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.355234 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-combined-ca-bundle\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.367018 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdtqk\" (UniqueName: \"kubernetes.io/projected/40664161-2e8b-4145-8b7a-5ec3723150bf-kube-api-access-bdtqk\") pod \"glance-db-sync-9jqk7\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.433266 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9jqk7"
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.448046 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-operator-scripts\") pod \"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5\" (UID: \"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5\") "
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.448104 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nggbt\" (UniqueName: \"kubernetes.io/projected/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-kube-api-access-nggbt\") pod \"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5\" (UID: \"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5\") "
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.448866 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5" (UID: "ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.455556 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-kube-api-access-nggbt" (OuterVolumeSpecName: "kube-api-access-nggbt") pod "ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5" (UID: "ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5"). InnerVolumeSpecName "kube-api-access-nggbt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.550462 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.550492 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nggbt\" (UniqueName: \"kubernetes.io/projected/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5-kube-api-access-nggbt\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.956754 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9jqk7"]
Feb 26 16:01:54 crc kubenswrapper[5033]: W0226 16:01:54.959280 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40664161_2e8b_4145_8b7a_5ec3723150bf.slice/crio-2566491bfbe81f43a64df927f5e1133e2ad8f7505136dafbd19f2ea99d3ca184 WatchSource:0}: Error finding container 2566491bfbe81f43a64df927f5e1133e2ad8f7505136dafbd19f2ea99d3ca184: Status 404 returned error can't find the container with id 2566491bfbe81f43a64df927f5e1133e2ad8f7505136dafbd19f2ea99d3ca184
Feb 26 16:01:54 crc kubenswrapper[5033]: I0226 16:01:54.999539 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9jqk7" event={"ID":"40664161-2e8b-4145-8b7a-5ec3723150bf","Type":"ContainerStarted","Data":"2566491bfbe81f43a64df927f5e1133e2ad8f7505136dafbd19f2ea99d3ca184"}
Feb 26 16:01:55 crc kubenswrapper[5033]: I0226 16:01:55.001210 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fdf9-account-create-update-hcvt7" event={"ID":"ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5","Type":"ContainerDied","Data":"b11107829cc2a6f596291f55d773302ce13d9035e238328ef7e75e9f10f429c8"}
Feb 26 16:01:55 crc kubenswrapper[5033]: I0226 16:01:55.001223 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fdf9-account-create-update-hcvt7"
Feb 26 16:01:55 crc kubenswrapper[5033]: I0226 16:01:55.001236 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b11107829cc2a6f596291f55d773302ce13d9035e238328ef7e75e9f10f429c8"
Feb 26 16:01:55 crc kubenswrapper[5033]: I0226 16:01:55.007950 5033 generic.go:334] "Generic (PLEG): container finished" podID="005e3882-a184-4d2d-9c00-a3de51d47b6e" containerID="49468fafa60ce3443569ed80de49ce1efc39535d4d172f5cb2cbc367b94cfb46" exitCode=0
Feb 26 16:01:55 crc kubenswrapper[5033]: I0226 16:01:55.008017 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-84cln" event={"ID":"005e3882-a184-4d2d-9c00-a3de51d47b6e","Type":"ContainerDied","Data":"49468fafa60ce3443569ed80de49ce1efc39535d4d172f5cb2cbc367b94cfb46"}
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.015955 5033 generic.go:334] "Generic (PLEG): container finished" podID="94562cbd-1b98-4b30-9a97-f26fb103bb72" containerID="24ed5542d86de71c32fadfea0c9e8d2f9fc6c503f22d6a3bd14b9feec584230c" exitCode=0
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.016056 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"94562cbd-1b98-4b30-9a97-f26fb103bb72","Type":"ContainerDied","Data":"24ed5542d86de71c32fadfea0c9e8d2f9fc6c503f22d6a3bd14b9feec584230c"}
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.291007 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-84cln"
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.382970 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/005e3882-a184-4d2d-9c00-a3de51d47b6e-etc-swift\") pod \"005e3882-a184-4d2d-9c00-a3de51d47b6e\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") "
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.383021 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-dispersionconf\") pod \"005e3882-a184-4d2d-9c00-a3de51d47b6e\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") "
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.383093 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-combined-ca-bundle\") pod \"005e3882-a184-4d2d-9c00-a3de51d47b6e\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") "
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.383151 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-scripts\") pod \"005e3882-a184-4d2d-9c00-a3de51d47b6e\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") "
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.383215 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-swiftconf\") pod \"005e3882-a184-4d2d-9c00-a3de51d47b6e\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") "
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.383247 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5k5l\" (UniqueName: \"kubernetes.io/projected/005e3882-a184-4d2d-9c00-a3de51d47b6e-kube-api-access-w5k5l\") pod \"005e3882-a184-4d2d-9c00-a3de51d47b6e\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") "
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.383281 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-ring-data-devices\") pod \"005e3882-a184-4d2d-9c00-a3de51d47b6e\" (UID: \"005e3882-a184-4d2d-9c00-a3de51d47b6e\") "
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.384137 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "005e3882-a184-4d2d-9c00-a3de51d47b6e" (UID: "005e3882-a184-4d2d-9c00-a3de51d47b6e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.384367 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/005e3882-a184-4d2d-9c00-a3de51d47b6e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "005e3882-a184-4d2d-9c00-a3de51d47b6e" (UID: "005e3882-a184-4d2d-9c00-a3de51d47b6e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.389363 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/005e3882-a184-4d2d-9c00-a3de51d47b6e-kube-api-access-w5k5l" (OuterVolumeSpecName: "kube-api-access-w5k5l") pod "005e3882-a184-4d2d-9c00-a3de51d47b6e" (UID: "005e3882-a184-4d2d-9c00-a3de51d47b6e"). InnerVolumeSpecName "kube-api-access-w5k5l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.406466 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "005e3882-a184-4d2d-9c00-a3de51d47b6e" (UID: "005e3882-a184-4d2d-9c00-a3de51d47b6e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.421845 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "005e3882-a184-4d2d-9c00-a3de51d47b6e" (UID: "005e3882-a184-4d2d-9c00-a3de51d47b6e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.422130 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "005e3882-a184-4d2d-9c00-a3de51d47b6e" (UID: "005e3882-a184-4d2d-9c00-a3de51d47b6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.429322 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-scripts" (OuterVolumeSpecName: "scripts") pod "005e3882-a184-4d2d-9c00-a3de51d47b6e" (UID: "005e3882-a184-4d2d-9c00-a3de51d47b6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.485418 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.485466 5033 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.485480 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5k5l\" (UniqueName: \"kubernetes.io/projected/005e3882-a184-4d2d-9c00-a3de51d47b6e-kube-api-access-w5k5l\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.485495 5033 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/005e3882-a184-4d2d-9c00-a3de51d47b6e-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.485519 5033 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/005e3882-a184-4d2d-9c00-a3de51d47b6e-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.485531 5033 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:56 crc kubenswrapper[5033]: I0226 16:01:56.485729 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005e3882-a184-4d2d-9c00-a3de51d47b6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.025235 5033
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-959s5"] Feb 26 16:01:57 crc kubenswrapper[5033]: E0226 16:01:57.025777 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="005e3882-a184-4d2d-9c00-a3de51d47b6e" containerName="swift-ring-rebalance" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.025789 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="005e3882-a184-4d2d-9c00-a3de51d47b6e" containerName="swift-ring-rebalance" Feb 26 16:01:57 crc kubenswrapper[5033]: E0226 16:01:57.025808 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5" containerName="mariadb-account-create-update" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.025814 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5" containerName="mariadb-account-create-update" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.025966 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5" containerName="mariadb-account-create-update" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.025984 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="005e3882-a184-4d2d-9c00-a3de51d47b6e" containerName="swift-ring-rebalance" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.026480 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-959s5" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.032048 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.034927 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"94562cbd-1b98-4b30-9a97-f26fb103bb72","Type":"ContainerStarted","Data":"4ad1039e51c54abd4ca6cc3d8acfd102de20d99c0d488b4055beccf49a6c1a6d"} Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.035216 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.039975 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-84cln" event={"ID":"005e3882-a184-4d2d-9c00-a3de51d47b6e","Type":"ContainerDied","Data":"e450f69761230896b9a6364295244ea578c7378fcb48f9969ee85627ca9d0f6d"} Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.040020 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e450f69761230896b9a6364295244ea578c7378fcb48f9969ee85627ca9d0f6d" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.040078 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-84cln" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.041540 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-959s5"] Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.089145 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.321165436 podStartE2EDuration="1m4.089126208s" podCreationTimestamp="2026-02-26 16:00:53 +0000 UTC" firstStartedPulling="2026-02-26 16:00:55.882471054 +0000 UTC m=+1183.095069704" lastFinishedPulling="2026-02-26 16:01:20.650431816 +0000 UTC m=+1207.863030476" observedRunningTime="2026-02-26 16:01:57.078243863 +0000 UTC m=+1244.290842523" watchObservedRunningTime="2026-02-26 16:01:57.089126208 +0000 UTC m=+1244.301724858" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.196457 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/143852fb-6ab2-485f-bc1f-e269f407f221-operator-scripts\") pod \"root-account-create-update-959s5\" (UID: \"143852fb-6ab2-485f-bc1f-e269f407f221\") " pod="openstack/root-account-create-update-959s5" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.196614 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2wp\" (UniqueName: \"kubernetes.io/projected/143852fb-6ab2-485f-bc1f-e269f407f221-kube-api-access-dh2wp\") pod \"root-account-create-update-959s5\" (UID: \"143852fb-6ab2-485f-bc1f-e269f407f221\") " pod="openstack/root-account-create-update-959s5" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.298024 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2wp\" (UniqueName: \"kubernetes.io/projected/143852fb-6ab2-485f-bc1f-e269f407f221-kube-api-access-dh2wp\") pod 
\"root-account-create-update-959s5\" (UID: \"143852fb-6ab2-485f-bc1f-e269f407f221\") " pod="openstack/root-account-create-update-959s5" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.298159 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/143852fb-6ab2-485f-bc1f-e269f407f221-operator-scripts\") pod \"root-account-create-update-959s5\" (UID: \"143852fb-6ab2-485f-bc1f-e269f407f221\") " pod="openstack/root-account-create-update-959s5" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.299400 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/143852fb-6ab2-485f-bc1f-e269f407f221-operator-scripts\") pod \"root-account-create-update-959s5\" (UID: \"143852fb-6ab2-485f-bc1f-e269f407f221\") " pod="openstack/root-account-create-update-959s5" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.327614 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh2wp\" (UniqueName: \"kubernetes.io/projected/143852fb-6ab2-485f-bc1f-e269f407f221-kube-api-access-dh2wp\") pod \"root-account-create-update-959s5\" (UID: \"143852fb-6ab2-485f-bc1f-e269f407f221\") " pod="openstack/root-account-create-update-959s5" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.359010 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-959s5" Feb 26 16:01:57 crc kubenswrapper[5033]: I0226 16:01:57.822945 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-959s5"] Feb 26 16:01:57 crc kubenswrapper[5033]: W0226 16:01:57.834754 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod143852fb_6ab2_485f_bc1f_e269f407f221.slice/crio-cf5b5bab78bf97056f83fd81e7df19a2ac23cccbe0f198dd40bd38ce2e3c8679 WatchSource:0}: Error finding container cf5b5bab78bf97056f83fd81e7df19a2ac23cccbe0f198dd40bd38ce2e3c8679: Status 404 returned error can't find the container with id cf5b5bab78bf97056f83fd81e7df19a2ac23cccbe0f198dd40bd38ce2e3c8679 Feb 26 16:01:58 crc kubenswrapper[5033]: I0226 16:01:58.066784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-959s5" event={"ID":"143852fb-6ab2-485f-bc1f-e269f407f221","Type":"ContainerStarted","Data":"8aa3061c2c3213ed0df4860c6654d27a0c25ada76528ffc497cb2e23fc08edc9"} Feb 26 16:01:58 crc kubenswrapper[5033]: I0226 16:01:58.066844 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-959s5" event={"ID":"143852fb-6ab2-485f-bc1f-e269f407f221","Type":"ContainerStarted","Data":"cf5b5bab78bf97056f83fd81e7df19a2ac23cccbe0f198dd40bd38ce2e3c8679"} Feb 26 16:01:58 crc kubenswrapper[5033]: I0226 16:01:58.089387 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-959s5" podStartSLOduration=1.089354359 podStartE2EDuration="1.089354359s" podCreationTimestamp="2026-02-26 16:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:01:58.083045375 +0000 UTC m=+1245.295644055" watchObservedRunningTime="2026-02-26 16:01:58.089354359 +0000 UTC m=+1245.301953009" Feb 
26 16:01:58 crc kubenswrapper[5033]: I0226 16:01:58.213829 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:58 crc kubenswrapper[5033]: I0226 16:01:58.220656 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift\") pod \"swift-storage-0\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " pod="openstack/swift-storage-0" Feb 26 16:01:58 crc kubenswrapper[5033]: I0226 16:01:58.500804 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 26 16:01:59 crc kubenswrapper[5033]: I0226 16:01:59.075964 5033 generic.go:334] "Generic (PLEG): container finished" podID="143852fb-6ab2-485f-bc1f-e269f407f221" containerID="8aa3061c2c3213ed0df4860c6654d27a0c25ada76528ffc497cb2e23fc08edc9" exitCode=0 Feb 26 16:01:59 crc kubenswrapper[5033]: I0226 16:01:59.076130 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-959s5" event={"ID":"143852fb-6ab2-485f-bc1f-e269f407f221","Type":"ContainerDied","Data":"8aa3061c2c3213ed0df4860c6654d27a0c25ada76528ffc497cb2e23fc08edc9"} Feb 26 16:01:59 crc kubenswrapper[5033]: I0226 16:01:59.087752 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.085282 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"b5d6fff8928b120f2104c3bf5317222e4e43e790d6abca6de70f7fcae6502398"} Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.128644 5033 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29535362-5fl8x"] Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.129962 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535362-5fl8x" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.132027 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.132208 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.134170 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.136564 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535362-5fl8x"] Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.238987 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fxrwp" podUID="44171111-32bf-415d-97c9-f966209cd0a0" containerName="ovn-controller" probeResult="failure" output=< Feb 26 16:02:00 crc kubenswrapper[5033]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 26 16:02:00 crc kubenswrapper[5033]: > Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.257014 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.275491 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.276419 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2st9\" (UniqueName: 
\"kubernetes.io/projected/66182ef7-3ffe-4652-bd43-e0d8be6ea7fa-kube-api-access-q2st9\") pod \"auto-csr-approver-29535362-5fl8x\" (UID: \"66182ef7-3ffe-4652-bd43-e0d8be6ea7fa\") " pod="openshift-infra/auto-csr-approver-29535362-5fl8x" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.378034 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2st9\" (UniqueName: \"kubernetes.io/projected/66182ef7-3ffe-4652-bd43-e0d8be6ea7fa-kube-api-access-q2st9\") pod \"auto-csr-approver-29535362-5fl8x\" (UID: \"66182ef7-3ffe-4652-bd43-e0d8be6ea7fa\") " pod="openshift-infra/auto-csr-approver-29535362-5fl8x" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.398965 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2st9\" (UniqueName: \"kubernetes.io/projected/66182ef7-3ffe-4652-bd43-e0d8be6ea7fa-kube-api-access-q2st9\") pod \"auto-csr-approver-29535362-5fl8x\" (UID: \"66182ef7-3ffe-4652-bd43-e0d8be6ea7fa\") " pod="openshift-infra/auto-csr-approver-29535362-5fl8x" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.488585 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535362-5fl8x" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.501917 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fxrwp-config-8svb9"] Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.503299 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.505678 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.521692 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fxrwp-config-8svb9"] Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.583402 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-log-ovn\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.583498 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.583526 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run-ovn\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.583564 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxc95\" (UniqueName: \"kubernetes.io/projected/7e604e64-1aad-49d8-9756-f7a7362bd4a2-kube-api-access-lxc95\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: 
\"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.583582 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-scripts\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.583606 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-additional-scripts\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.684846 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.685032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run-ovn\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.685096 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxc95\" (UniqueName: \"kubernetes.io/projected/7e604e64-1aad-49d8-9756-f7a7362bd4a2-kube-api-access-lxc95\") pod 
\"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.685124 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-scripts\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.685160 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-additional-scripts\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.685165 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run-ovn\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.685118 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.685329 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-log-ovn\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: 
\"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.685485 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-log-ovn\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.685852 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-additional-scripts\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.687236 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-scripts\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.703226 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxc95\" (UniqueName: \"kubernetes.io/projected/7e604e64-1aad-49d8-9756-f7a7362bd4a2-kube-api-access-lxc95\") pod \"ovn-controller-fxrwp-config-8svb9\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:00 crc kubenswrapper[5033]: I0226 16:02:00.830719 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:01 crc kubenswrapper[5033]: I0226 16:02:01.097096 5033 generic.go:334] "Generic (PLEG): container finished" podID="443e562a-aefe-46f3-a86f-59205f5849c7" containerID="247b57a885cc380f0649077b96f76c4e9d410ef7835d763d6902fafb762cc064" exitCode=0 Feb 26 16:02:01 crc kubenswrapper[5033]: I0226 16:02:01.097176 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"443e562a-aefe-46f3-a86f-59205f5849c7","Type":"ContainerDied","Data":"247b57a885cc380f0649077b96f76c4e9d410ef7835d763d6902fafb762cc064"} Feb 26 16:02:05 crc kubenswrapper[5033]: I0226 16:02:05.226076 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fxrwp" podUID="44171111-32bf-415d-97c9-f966209cd0a0" containerName="ovn-controller" probeResult="failure" output=< Feb 26 16:02:05 crc kubenswrapper[5033]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 26 16:02:05 crc kubenswrapper[5033]: > Feb 26 16:02:07 crc kubenswrapper[5033]: I0226 16:02:07.387157 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-959s5" Feb 26 16:02:07 crc kubenswrapper[5033]: I0226 16:02:07.504599 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh2wp\" (UniqueName: \"kubernetes.io/projected/143852fb-6ab2-485f-bc1f-e269f407f221-kube-api-access-dh2wp\") pod \"143852fb-6ab2-485f-bc1f-e269f407f221\" (UID: \"143852fb-6ab2-485f-bc1f-e269f407f221\") " Feb 26 16:02:07 crc kubenswrapper[5033]: I0226 16:02:07.505070 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/143852fb-6ab2-485f-bc1f-e269f407f221-operator-scripts\") pod \"143852fb-6ab2-485f-bc1f-e269f407f221\" (UID: \"143852fb-6ab2-485f-bc1f-e269f407f221\") " Feb 26 16:02:07 crc kubenswrapper[5033]: I0226 16:02:07.508526 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143852fb-6ab2-485f-bc1f-e269f407f221-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "143852fb-6ab2-485f-bc1f-e269f407f221" (UID: "143852fb-6ab2-485f-bc1f-e269f407f221"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:07 crc kubenswrapper[5033]: I0226 16:02:07.509307 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143852fb-6ab2-485f-bc1f-e269f407f221-kube-api-access-dh2wp" (OuterVolumeSpecName: "kube-api-access-dh2wp") pod "143852fb-6ab2-485f-bc1f-e269f407f221" (UID: "143852fb-6ab2-485f-bc1f-e269f407f221"). InnerVolumeSpecName "kube-api-access-dh2wp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:07 crc kubenswrapper[5033]: I0226 16:02:07.607528 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh2wp\" (UniqueName: \"kubernetes.io/projected/143852fb-6ab2-485f-bc1f-e269f407f221-kube-api-access-dh2wp\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:07 crc kubenswrapper[5033]: I0226 16:02:07.607562 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/143852fb-6ab2-485f-bc1f-e269f407f221-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:07 crc kubenswrapper[5033]: I0226 16:02:07.775791 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535362-5fl8x"] Feb 26 16:02:07 crc kubenswrapper[5033]: W0226 16:02:07.793080 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66182ef7_3ffe_4652_bd43_e0d8be6ea7fa.slice/crio-8d797e804bbd4e235f16ca0a6413c08bef9842fc865ea520bc3ec61c5d2e3799 WatchSource:0}: Error finding container 8d797e804bbd4e235f16ca0a6413c08bef9842fc865ea520bc3ec61c5d2e3799: Status 404 returned error can't find the container with id 8d797e804bbd4e235f16ca0a6413c08bef9842fc865ea520bc3ec61c5d2e3799 Feb 26 16:02:07 crc kubenswrapper[5033]: I0226 16:02:07.868312 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fxrwp-config-8svb9"] Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.164328 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fxrwp-config-8svb9" event={"ID":"7e604e64-1aad-49d8-9756-f7a7362bd4a2","Type":"ContainerStarted","Data":"d0c5168432077274ca9e2370366e3d73d4540b196dbba62469d4125951288da7"} Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.166302 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-959s5" 
event={"ID":"143852fb-6ab2-485f-bc1f-e269f407f221","Type":"ContainerDied","Data":"cf5b5bab78bf97056f83fd81e7df19a2ac23cccbe0f198dd40bd38ce2e3c8679"} Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.166342 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf5b5bab78bf97056f83fd81e7df19a2ac23cccbe0f198dd40bd38ce2e3c8679" Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.166386 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-959s5" Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.167204 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535362-5fl8x" event={"ID":"66182ef7-3ffe-4652-bd43-e0d8be6ea7fa","Type":"ContainerStarted","Data":"8d797e804bbd4e235f16ca0a6413c08bef9842fc865ea520bc3ec61c5d2e3799"} Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.169229 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9jqk7" event={"ID":"40664161-2e8b-4145-8b7a-5ec3723150bf","Type":"ContainerStarted","Data":"ff33f6599e5ddd894eee122447342aa510b435cb73d1291495aac3158217c977"} Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.173439 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"443e562a-aefe-46f3-a86f-59205f5849c7","Type":"ContainerStarted","Data":"118f64c4209e08f3dc281ec9601aaee1f783225640fc7b6f23d85a36174417c3"} Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.173719 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.176770 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec"} Feb 26 16:02:08 crc 
kubenswrapper[5033]: I0226 16:02:08.176826 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6"} Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.176843 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2"} Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.187611 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9jqk7" podStartSLOduration=1.784635923 podStartE2EDuration="14.187589128s" podCreationTimestamp="2026-02-26 16:01:54 +0000 UTC" firstStartedPulling="2026-02-26 16:01:54.961562429 +0000 UTC m=+1242.174161079" lastFinishedPulling="2026-02-26 16:02:07.364515624 +0000 UTC m=+1254.577114284" observedRunningTime="2026-02-26 16:02:08.185216274 +0000 UTC m=+1255.397814944" watchObservedRunningTime="2026-02-26 16:02:08.187589128 +0000 UTC m=+1255.400187778" Feb 26 16:02:08 crc kubenswrapper[5033]: I0226 16:02:08.225475 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371962.62933 podStartE2EDuration="1m14.225446096s" podCreationTimestamp="2026-02-26 16:00:54 +0000 UTC" firstStartedPulling="2026-02-26 16:00:56.319982419 +0000 UTC m=+1183.532581069" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:08.219238846 +0000 UTC m=+1255.431837496" watchObservedRunningTime="2026-02-26 16:02:08.225446096 +0000 UTC m=+1255.438044746" Feb 26 16:02:09 crc kubenswrapper[5033]: I0226 16:02:09.189796 5033 generic.go:334] "Generic (PLEG): container finished" podID="7e604e64-1aad-49d8-9756-f7a7362bd4a2" 
containerID="fd7383dc4bc45e9bd455588f2952d88a9cce86734afce891b6ca79bc4eaa00c8" exitCode=0 Feb 26 16:02:09 crc kubenswrapper[5033]: I0226 16:02:09.192361 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fxrwp-config-8svb9" event={"ID":"7e604e64-1aad-49d8-9756-f7a7362bd4a2","Type":"ContainerDied","Data":"fd7383dc4bc45e9bd455588f2952d88a9cce86734afce891b6ca79bc4eaa00c8"} Feb 26 16:02:09 crc kubenswrapper[5033]: I0226 16:02:09.200620 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe"} Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.217357 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f"} Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.217703 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c"} Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.217719 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37"} Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.252852 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-fxrwp" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.600159 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.667337 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-additional-scripts\") pod \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.667728 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxc95\" (UniqueName: \"kubernetes.io/projected/7e604e64-1aad-49d8-9756-f7a7362bd4a2-kube-api-access-lxc95\") pod \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.667795 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-log-ovn\") pod \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.667837 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run-ovn\") pod \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.667908 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run\") pod \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.668016 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-scripts\") pod \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\" (UID: \"7e604e64-1aad-49d8-9756-f7a7362bd4a2\") " Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.668134 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7e604e64-1aad-49d8-9756-f7a7362bd4a2" (UID: "7e604e64-1aad-49d8-9756-f7a7362bd4a2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.668201 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7e604e64-1aad-49d8-9756-f7a7362bd4a2" (UID: "7e604e64-1aad-49d8-9756-f7a7362bd4a2"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.668230 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7e604e64-1aad-49d8-9756-f7a7362bd4a2" (UID: "7e604e64-1aad-49d8-9756-f7a7362bd4a2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.668253 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run" (OuterVolumeSpecName: "var-run") pod "7e604e64-1aad-49d8-9756-f7a7362bd4a2" (UID: "7e604e64-1aad-49d8-9756-f7a7362bd4a2"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.668574 5033 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.668593 5033 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.668606 5033 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e604e64-1aad-49d8-9756-f7a7362bd4a2-var-run\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.668617 5033 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.668971 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-scripts" (OuterVolumeSpecName: "scripts") pod "7e604e64-1aad-49d8-9756-f7a7362bd4a2" (UID: "7e604e64-1aad-49d8-9756-f7a7362bd4a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.682475 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e604e64-1aad-49d8-9756-f7a7362bd4a2-kube-api-access-lxc95" (OuterVolumeSpecName: "kube-api-access-lxc95") pod "7e604e64-1aad-49d8-9756-f7a7362bd4a2" (UID: "7e604e64-1aad-49d8-9756-f7a7362bd4a2"). InnerVolumeSpecName "kube-api-access-lxc95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.770243 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e604e64-1aad-49d8-9756-f7a7362bd4a2-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:10 crc kubenswrapper[5033]: I0226 16:02:10.770292 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxc95\" (UniqueName: \"kubernetes.io/projected/7e604e64-1aad-49d8-9756-f7a7362bd4a2-kube-api-access-lxc95\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:11 crc kubenswrapper[5033]: I0226 16:02:11.238287 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90"} Feb 26 16:02:11 crc kubenswrapper[5033]: I0226 16:02:11.239905 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fxrwp-config-8svb9" event={"ID":"7e604e64-1aad-49d8-9756-f7a7362bd4a2","Type":"ContainerDied","Data":"d0c5168432077274ca9e2370366e3d73d4540b196dbba62469d4125951288da7"} Feb 26 16:02:11 crc kubenswrapper[5033]: I0226 16:02:11.239939 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0c5168432077274ca9e2370366e3d73d4540b196dbba62469d4125951288da7" Feb 26 16:02:11 crc kubenswrapper[5033]: I0226 16:02:11.239983 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fxrwp-config-8svb9" Feb 26 16:02:11 crc kubenswrapper[5033]: I0226 16:02:11.729407 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fxrwp-config-8svb9"] Feb 26 16:02:11 crc kubenswrapper[5033]: I0226 16:02:11.737693 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fxrwp-config-8svb9"] Feb 26 16:02:13 crc kubenswrapper[5033]: I0226 16:02:13.665251 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e604e64-1aad-49d8-9756-f7a7362bd4a2" path="/var/lib/kubelet/pods/7e604e64-1aad-49d8-9756-f7a7362bd4a2/volumes" Feb 26 16:02:14 crc kubenswrapper[5033]: I0226 16:02:14.266282 5033 generic.go:334] "Generic (PLEG): container finished" podID="66182ef7-3ffe-4652-bd43-e0d8be6ea7fa" containerID="842f851e2c6783742a1f73d15c38ae7e35440285fe1efdf4007c5f41253c5d16" exitCode=0 Feb 26 16:02:14 crc kubenswrapper[5033]: I0226 16:02:14.266329 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535362-5fl8x" event={"ID":"66182ef7-3ffe-4652-bd43-e0d8be6ea7fa","Type":"ContainerDied","Data":"842f851e2c6783742a1f73d15c38ae7e35440285fe1efdf4007c5f41253c5d16"} Feb 26 16:02:15 crc kubenswrapper[5033]: I0226 16:02:15.224473 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:15 crc kubenswrapper[5033]: I0226 16:02:15.287857 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa"} Feb 26 16:02:15 crc kubenswrapper[5033]: I0226 16:02:15.287906 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023"} Feb 26 16:02:15 crc kubenswrapper[5033]: I0226 16:02:15.287920 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379"} Feb 26 16:02:15 crc kubenswrapper[5033]: I0226 16:02:15.536617 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535362-5fl8x" Feb 26 16:02:15 crc kubenswrapper[5033]: I0226 16:02:15.651272 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2st9\" (UniqueName: \"kubernetes.io/projected/66182ef7-3ffe-4652-bd43-e0d8be6ea7fa-kube-api-access-q2st9\") pod \"66182ef7-3ffe-4652-bd43-e0d8be6ea7fa\" (UID: \"66182ef7-3ffe-4652-bd43-e0d8be6ea7fa\") " Feb 26 16:02:15 crc kubenswrapper[5033]: I0226 16:02:15.656669 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66182ef7-3ffe-4652-bd43-e0d8be6ea7fa-kube-api-access-q2st9" (OuterVolumeSpecName: "kube-api-access-q2st9") pod "66182ef7-3ffe-4652-bd43-e0d8be6ea7fa" (UID: "66182ef7-3ffe-4652-bd43-e0d8be6ea7fa"). InnerVolumeSpecName "kube-api-access-q2st9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:15 crc kubenswrapper[5033]: I0226 16:02:15.753266 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2st9\" (UniqueName: \"kubernetes.io/projected/66182ef7-3ffe-4652-bd43-e0d8be6ea7fa-kube-api-access-q2st9\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:16 crc kubenswrapper[5033]: I0226 16:02:16.298114 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535362-5fl8x" event={"ID":"66182ef7-3ffe-4652-bd43-e0d8be6ea7fa","Type":"ContainerDied","Data":"8d797e804bbd4e235f16ca0a6413c08bef9842fc865ea520bc3ec61c5d2e3799"} Feb 26 16:02:16 crc kubenswrapper[5033]: I0226 16:02:16.298182 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d797e804bbd4e235f16ca0a6413c08bef9842fc865ea520bc3ec61c5d2e3799" Feb 26 16:02:16 crc kubenswrapper[5033]: I0226 16:02:16.298132 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535362-5fl8x" Feb 26 16:02:16 crc kubenswrapper[5033]: I0226 16:02:16.303599 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1"} Feb 26 16:02:16 crc kubenswrapper[5033]: I0226 16:02:16.623819 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535356-cjwln"] Feb 26 16:02:16 crc kubenswrapper[5033]: I0226 16:02:16.634498 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535356-cjwln"] Feb 26 16:02:17 crc kubenswrapper[5033]: I0226 16:02:17.318492 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085"} Feb 26 16:02:17 crc kubenswrapper[5033]: I0226 16:02:17.638368 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="012bed3c-4f40-4873-86fd-1a97de8df980" path="/var/lib/kubelet/pods/012bed3c-4f40-4873-86fd-1a97de8df980/volumes" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.341697 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2"} Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.342023 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerStarted","Data":"0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0"} Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.385234 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=22.084127011 podStartE2EDuration="37.38520686s" podCreationTimestamp="2026-02-26 16:01:41 +0000 UTC" firstStartedPulling="2026-02-26 16:01:59.110013481 +0000 UTC m=+1246.322612131" lastFinishedPulling="2026-02-26 16:02:14.41109333 +0000 UTC m=+1261.623691980" observedRunningTime="2026-02-26 16:02:18.372495882 +0000 UTC m=+1265.585094552" watchObservedRunningTime="2026-02-26 16:02:18.38520686 +0000 UTC m=+1265.597805550" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.673810 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-vkqqn"] Feb 26 16:02:18 crc kubenswrapper[5033]: E0226 16:02:18.674237 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e604e64-1aad-49d8-9756-f7a7362bd4a2" containerName="ovn-config" Feb 26 16:02:18 crc 
kubenswrapper[5033]: I0226 16:02:18.674273 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e604e64-1aad-49d8-9756-f7a7362bd4a2" containerName="ovn-config" Feb 26 16:02:18 crc kubenswrapper[5033]: E0226 16:02:18.674294 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66182ef7-3ffe-4652-bd43-e0d8be6ea7fa" containerName="oc" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.674304 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="66182ef7-3ffe-4652-bd43-e0d8be6ea7fa" containerName="oc" Feb 26 16:02:18 crc kubenswrapper[5033]: E0226 16:02:18.674320 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143852fb-6ab2-485f-bc1f-e269f407f221" containerName="mariadb-account-create-update" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.674328 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="143852fb-6ab2-485f-bc1f-e269f407f221" containerName="mariadb-account-create-update" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.674537 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="66182ef7-3ffe-4652-bd43-e0d8be6ea7fa" containerName="oc" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.674557 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e604e64-1aad-49d8-9756-f7a7362bd4a2" containerName="ovn-config" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.674569 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="143852fb-6ab2-485f-bc1f-e269f407f221" containerName="mariadb-account-create-update" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.675623 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.691445 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-vkqqn"] Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.691698 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.705123 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.705182 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.705216 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-config\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.705407 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcd9\" (UniqueName: \"kubernetes.io/projected/295a78fa-b18a-44d1-ba84-9c325d83bda9-kube-api-access-jrcd9\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.705440 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.705475 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.807270 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcd9\" (UniqueName: \"kubernetes.io/projected/295a78fa-b18a-44d1-ba84-9c325d83bda9-kube-api-access-jrcd9\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.807323 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.807348 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: 
\"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.807380 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.807418 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.807451 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-config\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.808503 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.808511 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-config\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc 
kubenswrapper[5033]: I0226 16:02:18.808543 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.809138 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.809396 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:18 crc kubenswrapper[5033]: I0226 16:02:18.842087 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcd9\" (UniqueName: \"kubernetes.io/projected/295a78fa-b18a-44d1-ba84-9c325d83bda9-kube-api-access-jrcd9\") pod \"dnsmasq-dns-5c79d794d7-vkqqn\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:19 crc kubenswrapper[5033]: I0226 16:02:19.005073 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:19 crc kubenswrapper[5033]: I0226 16:02:19.488131 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-vkqqn"] Feb 26 16:02:19 crc kubenswrapper[5033]: W0226 16:02:19.492886 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod295a78fa_b18a_44d1_ba84_9c325d83bda9.slice/crio-a454475a8c53ae93a1a66b83acb7fca3deccc0f59d9e0ac7409c7899cb87c858 WatchSource:0}: Error finding container a454475a8c53ae93a1a66b83acb7fca3deccc0f59d9e0ac7409c7899cb87c858: Status 404 returned error can't find the container with id a454475a8c53ae93a1a66b83acb7fca3deccc0f59d9e0ac7409c7899cb87c858 Feb 26 16:02:20 crc kubenswrapper[5033]: I0226 16:02:20.360044 5033 generic.go:334] "Generic (PLEG): container finished" podID="295a78fa-b18a-44d1-ba84-9c325d83bda9" containerID="98708fa1df8c223588f0d564d0bda8f30f7950dcb996257462125b57ce97c4e1" exitCode=0 Feb 26 16:02:20 crc kubenswrapper[5033]: I0226 16:02:20.360101 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" event={"ID":"295a78fa-b18a-44d1-ba84-9c325d83bda9","Type":"ContainerDied","Data":"98708fa1df8c223588f0d564d0bda8f30f7950dcb996257462125b57ce97c4e1"} Feb 26 16:02:20 crc kubenswrapper[5033]: I0226 16:02:20.360625 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" event={"ID":"295a78fa-b18a-44d1-ba84-9c325d83bda9","Type":"ContainerStarted","Data":"a454475a8c53ae93a1a66b83acb7fca3deccc0f59d9e0ac7409c7899cb87c858"} Feb 26 16:02:21 crc kubenswrapper[5033]: I0226 16:02:21.381409 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" event={"ID":"295a78fa-b18a-44d1-ba84-9c325d83bda9","Type":"ContainerStarted","Data":"6a7d7dbe02afd322b3e2d289d323bed25dba88e717f698d8dc49df491eab87d0"} Feb 26 16:02:21 crc 
kubenswrapper[5033]: I0226 16:02:21.385194 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:21 crc kubenswrapper[5033]: I0226 16:02:21.411234 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" podStartSLOduration=3.411213053 podStartE2EDuration="3.411213053s" podCreationTimestamp="2026-02-26 16:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:21.406883125 +0000 UTC m=+1268.619481875" watchObservedRunningTime="2026-02-26 16:02:21.411213053 +0000 UTC m=+1268.623811713" Feb 26 16:02:21 crc kubenswrapper[5033]: I0226 16:02:21.667324 5033 scope.go:117] "RemoveContainer" containerID="708a5969434e503948662e31899db8e9de9238ce44ad3d2abada45c3bb90752c" Feb 26 16:02:22 crc kubenswrapper[5033]: I0226 16:02:22.390963 5033 generic.go:334] "Generic (PLEG): container finished" podID="40664161-2e8b-4145-8b7a-5ec3723150bf" containerID="ff33f6599e5ddd894eee122447342aa510b435cb73d1291495aac3158217c977" exitCode=0 Feb 26 16:02:22 crc kubenswrapper[5033]: I0226 16:02:22.391060 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9jqk7" event={"ID":"40664161-2e8b-4145-8b7a-5ec3723150bf","Type":"ContainerDied","Data":"ff33f6599e5ddd894eee122447342aa510b435cb73d1291495aac3158217c977"} Feb 26 16:02:23 crc kubenswrapper[5033]: I0226 16:02:23.840737 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9jqk7" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.005867 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-config-data\") pod \"40664161-2e8b-4145-8b7a-5ec3723150bf\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.006006 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-combined-ca-bundle\") pod \"40664161-2e8b-4145-8b7a-5ec3723150bf\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.006119 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdtqk\" (UniqueName: \"kubernetes.io/projected/40664161-2e8b-4145-8b7a-5ec3723150bf-kube-api-access-bdtqk\") pod \"40664161-2e8b-4145-8b7a-5ec3723150bf\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.006221 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-db-sync-config-data\") pod \"40664161-2e8b-4145-8b7a-5ec3723150bf\" (UID: \"40664161-2e8b-4145-8b7a-5ec3723150bf\") " Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.011349 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "40664161-2e8b-4145-8b7a-5ec3723150bf" (UID: "40664161-2e8b-4145-8b7a-5ec3723150bf"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.011651 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40664161-2e8b-4145-8b7a-5ec3723150bf-kube-api-access-bdtqk" (OuterVolumeSpecName: "kube-api-access-bdtqk") pod "40664161-2e8b-4145-8b7a-5ec3723150bf" (UID: "40664161-2e8b-4145-8b7a-5ec3723150bf"). InnerVolumeSpecName "kube-api-access-bdtqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.028055 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40664161-2e8b-4145-8b7a-5ec3723150bf" (UID: "40664161-2e8b-4145-8b7a-5ec3723150bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.045843 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-config-data" (OuterVolumeSpecName: "config-data") pod "40664161-2e8b-4145-8b7a-5ec3723150bf" (UID: "40664161-2e8b-4145-8b7a-5ec3723150bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.107958 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdtqk\" (UniqueName: \"kubernetes.io/projected/40664161-2e8b-4145-8b7a-5ec3723150bf-kube-api-access-bdtqk\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.108005 5033 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.108017 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.108030 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40664161-2e8b-4145-8b7a-5ec3723150bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.411654 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9jqk7" event={"ID":"40664161-2e8b-4145-8b7a-5ec3723150bf","Type":"ContainerDied","Data":"2566491bfbe81f43a64df927f5e1133e2ad8f7505136dafbd19f2ea99d3ca184"} Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.411713 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2566491bfbe81f43a64df927f5e1133e2ad8f7505136dafbd19f2ea99d3ca184" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.411770 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9jqk7" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.944985 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-vkqqn"] Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.948939 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" podUID="295a78fa-b18a-44d1-ba84-9c325d83bda9" containerName="dnsmasq-dns" containerID="cri-o://6a7d7dbe02afd322b3e2d289d323bed25dba88e717f698d8dc49df491eab87d0" gracePeriod=10 Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.993375 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-mwdpf"] Feb 26 16:02:24 crc kubenswrapper[5033]: E0226 16:02:24.994021 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40664161-2e8b-4145-8b7a-5ec3723150bf" containerName="glance-db-sync" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.994043 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="40664161-2e8b-4145-8b7a-5ec3723150bf" containerName="glance-db-sync" Feb 26 16:02:24 crc kubenswrapper[5033]: I0226 16:02:24.994216 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="40664161-2e8b-4145-8b7a-5ec3723150bf" containerName="glance-db-sync" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.021673 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.029442 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-mwdpf"] Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.126811 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-config\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.127022 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.127103 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdscf\" (UniqueName: \"kubernetes.io/projected/b76c988f-7123-4c0c-84a2-44ae8ffb7898-kube-api-access-mdscf\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.128050 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.128203 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.128319 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.229476 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-config\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.229520 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.229546 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdscf\" (UniqueName: \"kubernetes.io/projected/b76c988f-7123-4c0c-84a2-44ae8ffb7898-kube-api-access-mdscf\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.229571 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.229604 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.229631 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.230553 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.231077 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-config\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.231591 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.231760 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.231787 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.267233 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdscf\" (UniqueName: \"kubernetes.io/projected/b76c988f-7123-4c0c-84a2-44ae8ffb7898-kube-api-access-mdscf\") pod \"dnsmasq-dns-5f59b8f679-mwdpf\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.363338 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.425944 5033 generic.go:334] "Generic (PLEG): container finished" podID="295a78fa-b18a-44d1-ba84-9c325d83bda9" containerID="6a7d7dbe02afd322b3e2d289d323bed25dba88e717f698d8dc49df491eab87d0" exitCode=0 Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.425997 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" event={"ID":"295a78fa-b18a-44d1-ba84-9c325d83bda9","Type":"ContainerDied","Data":"6a7d7dbe02afd322b3e2d289d323bed25dba88e717f698d8dc49df491eab87d0"} Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.426035 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" event={"ID":"295a78fa-b18a-44d1-ba84-9c325d83bda9","Type":"ContainerDied","Data":"a454475a8c53ae93a1a66b83acb7fca3deccc0f59d9e0ac7409c7899cb87c858"} Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.426050 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a454475a8c53ae93a1a66b83acb7fca3deccc0f59d9e0ac7409c7899cb87c858" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.471505 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.547693 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-swift-storage-0\") pod \"295a78fa-b18a-44d1-ba84-9c325d83bda9\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.547768 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-svc\") pod \"295a78fa-b18a-44d1-ba84-9c325d83bda9\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.547795 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrcd9\" (UniqueName: \"kubernetes.io/projected/295a78fa-b18a-44d1-ba84-9c325d83bda9-kube-api-access-jrcd9\") pod \"295a78fa-b18a-44d1-ba84-9c325d83bda9\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.547831 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-sb\") pod \"295a78fa-b18a-44d1-ba84-9c325d83bda9\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.547882 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-nb\") pod \"295a78fa-b18a-44d1-ba84-9c325d83bda9\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.547944 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-config\") pod \"295a78fa-b18a-44d1-ba84-9c325d83bda9\" (UID: \"295a78fa-b18a-44d1-ba84-9c325d83bda9\") " Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.571128 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295a78fa-b18a-44d1-ba84-9c325d83bda9-kube-api-access-jrcd9" (OuterVolumeSpecName: "kube-api-access-jrcd9") pod "295a78fa-b18a-44d1-ba84-9c325d83bda9" (UID: "295a78fa-b18a-44d1-ba84-9c325d83bda9"). InnerVolumeSpecName "kube-api-access-jrcd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.574424 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.610812 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "295a78fa-b18a-44d1-ba84-9c325d83bda9" (UID: "295a78fa-b18a-44d1-ba84-9c325d83bda9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.614463 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "295a78fa-b18a-44d1-ba84-9c325d83bda9" (UID: "295a78fa-b18a-44d1-ba84-9c325d83bda9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.639367 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-config" (OuterVolumeSpecName: "config") pod "295a78fa-b18a-44d1-ba84-9c325d83bda9" (UID: "295a78fa-b18a-44d1-ba84-9c325d83bda9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.641870 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "295a78fa-b18a-44d1-ba84-9c325d83bda9" (UID: "295a78fa-b18a-44d1-ba84-9c325d83bda9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.650505 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.650544 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrcd9\" (UniqueName: \"kubernetes.io/projected/295a78fa-b18a-44d1-ba84-9c325d83bda9-kube-api-access-jrcd9\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.650558 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.650572 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:25 crc 
kubenswrapper[5033]: I0226 16:02:25.650585 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.746130 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "295a78fa-b18a-44d1-ba84-9c325d83bda9" (UID: "295a78fa-b18a-44d1-ba84-9c325d83bda9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.755596 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/295a78fa-b18a-44d1-ba84-9c325d83bda9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.958708 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6kvjv"] Feb 26 16:02:25 crc kubenswrapper[5033]: E0226 16:02:25.959454 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295a78fa-b18a-44d1-ba84-9c325d83bda9" containerName="dnsmasq-dns" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.959471 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="295a78fa-b18a-44d1-ba84-9c325d83bda9" containerName="dnsmasq-dns" Feb 26 16:02:25 crc kubenswrapper[5033]: E0226 16:02:25.959488 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295a78fa-b18a-44d1-ba84-9c325d83bda9" containerName="init" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.959496 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="295a78fa-b18a-44d1-ba84-9c325d83bda9" containerName="init" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.959689 5033 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="295a78fa-b18a-44d1-ba84-9c325d83bda9" containerName="dnsmasq-dns" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.960376 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6kvjv" Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.988331 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6kvjv"] Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.995825 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-bb44-account-create-update-xkl5l"] Feb 26 16:02:25 crc kubenswrapper[5033]: I0226 16:02:25.997159 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bb44-account-create-update-xkl5l" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.006499 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.026330 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bb44-account-create-update-xkl5l"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.059334 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/494e1101-248e-4991-82f0-03bbec4fb3c2-operator-scripts\") pod \"cinder-db-create-6kvjv\" (UID: \"494e1101-248e-4991-82f0-03bbec4fb3c2\") " pod="openstack/cinder-db-create-6kvjv" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.059421 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzbqf\" (UniqueName: \"kubernetes.io/projected/494e1101-248e-4991-82f0-03bbec4fb3c2-kube-api-access-bzbqf\") pod \"cinder-db-create-6kvjv\" (UID: \"494e1101-248e-4991-82f0-03bbec4fb3c2\") " pod="openstack/cinder-db-create-6kvjv" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 
16:02:26.074031 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-mwdpf"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.160798 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzbqf\" (UniqueName: \"kubernetes.io/projected/494e1101-248e-4991-82f0-03bbec4fb3c2-kube-api-access-bzbqf\") pod \"cinder-db-create-6kvjv\" (UID: \"494e1101-248e-4991-82f0-03bbec4fb3c2\") " pod="openstack/cinder-db-create-6kvjv" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.161070 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtxc9\" (UniqueName: \"kubernetes.io/projected/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-kube-api-access-gtxc9\") pod \"cinder-bb44-account-create-update-xkl5l\" (UID: \"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d\") " pod="openstack/cinder-bb44-account-create-update-xkl5l" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.161161 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-operator-scripts\") pod \"cinder-bb44-account-create-update-xkl5l\" (UID: \"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d\") " pod="openstack/cinder-bb44-account-create-update-xkl5l" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.161282 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/494e1101-248e-4991-82f0-03bbec4fb3c2-operator-scripts\") pod \"cinder-db-create-6kvjv\" (UID: \"494e1101-248e-4991-82f0-03bbec4fb3c2\") " pod="openstack/cinder-db-create-6kvjv" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.161990 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/494e1101-248e-4991-82f0-03bbec4fb3c2-operator-scripts\") pod \"cinder-db-create-6kvjv\" (UID: \"494e1101-248e-4991-82f0-03bbec4fb3c2\") " pod="openstack/cinder-db-create-6kvjv" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.204236 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzbqf\" (UniqueName: \"kubernetes.io/projected/494e1101-248e-4991-82f0-03bbec4fb3c2-kube-api-access-bzbqf\") pod \"cinder-db-create-6kvjv\" (UID: \"494e1101-248e-4991-82f0-03bbec4fb3c2\") " pod="openstack/cinder-db-create-6kvjv" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.205756 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9702-account-create-update-rfpn9"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.206814 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9702-account-create-update-rfpn9" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.212698 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.249897 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ckn6b"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.252168 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ckn6b" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.262978 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9702-account-create-update-rfpn9"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.263042 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtxc9\" (UniqueName: \"kubernetes.io/projected/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-kube-api-access-gtxc9\") pod \"cinder-bb44-account-create-update-xkl5l\" (UID: \"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d\") " pod="openstack/cinder-bb44-account-create-update-xkl5l" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.263083 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-operator-scripts\") pod \"cinder-bb44-account-create-update-xkl5l\" (UID: \"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d\") " pod="openstack/cinder-bb44-account-create-update-xkl5l" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.263827 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-operator-scripts\") pod \"cinder-bb44-account-create-update-xkl5l\" (UID: \"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d\") " pod="openstack/cinder-bb44-account-create-update-xkl5l" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.273302 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ckn6b"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.283767 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtxc9\" (UniqueName: \"kubernetes.io/projected/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-kube-api-access-gtxc9\") pod \"cinder-bb44-account-create-update-xkl5l\" (UID: 
\"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d\") " pod="openstack/cinder-bb44-account-create-update-xkl5l" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.286605 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6kvjv" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.296036 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-9xg9v"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.297212 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.299491 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.299665 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.299892 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.299930 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pg9lf" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.317208 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9xg9v"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.325604 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bb44-account-create-update-xkl5l" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.365716 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-5ntp4"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.367067 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5ntp4" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.369190 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tc94\" (UniqueName: \"kubernetes.io/projected/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-kube-api-access-4tc94\") pod \"barbican-db-create-ckn6b\" (UID: \"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1\") " pod="openstack/barbican-db-create-ckn6b" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.369264 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sdlt\" (UniqueName: \"kubernetes.io/projected/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-kube-api-access-7sdlt\") pod \"barbican-9702-account-create-update-rfpn9\" (UID: \"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6\") " pod="openstack/barbican-9702-account-create-update-rfpn9" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.369304 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-operator-scripts\") pod \"barbican-db-create-ckn6b\" (UID: \"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1\") " pod="openstack/barbican-db-create-ckn6b" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.369361 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-operator-scripts\") pod \"barbican-9702-account-create-update-rfpn9\" (UID: \"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6\") " pod="openstack/barbican-9702-account-create-update-rfpn9" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.381558 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5ntp4"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.388110 5033 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6a31-account-create-update-5x6jh"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.389247 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6a31-account-create-update-5x6jh" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.398381 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6a31-account-create-update-5x6jh"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.398595 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.469573 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" event={"ID":"b76c988f-7123-4c0c-84a2-44ae8ffb7898","Type":"ContainerStarted","Data":"bf34709205899c4b71c7f60e1812d42a081fcf7607295820a8425d6399b3492c"} Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.469680 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-vkqqn" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.471559 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-config-data\") pod \"keystone-db-sync-9xg9v\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.471606 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/644c07f3-9ac5-48e9-92e3-641647c2e147-operator-scripts\") pod \"neutron-db-create-5ntp4\" (UID: \"644c07f3-9ac5-48e9-92e3-641647c2e147\") " pod="openstack/neutron-db-create-5ntp4" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.471651 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tc94\" (UniqueName: \"kubernetes.io/projected/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-kube-api-access-4tc94\") pod \"barbican-db-create-ckn6b\" (UID: \"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1\") " pod="openstack/barbican-db-create-ckn6b" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.471677 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sdlt\" (UniqueName: \"kubernetes.io/projected/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-kube-api-access-7sdlt\") pod \"barbican-9702-account-create-update-rfpn9\" (UID: \"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6\") " pod="openstack/barbican-9702-account-create-update-rfpn9" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.471711 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-operator-scripts\") pod \"barbican-db-create-ckn6b\" (UID: 
\"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1\") " pod="openstack/barbican-db-create-ckn6b" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.471750 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crgmq\" (UniqueName: \"kubernetes.io/projected/41652b04-a199-4ceb-a87d-5734867a31b5-kube-api-access-crgmq\") pod \"keystone-db-sync-9xg9v\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.471777 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-combined-ca-bundle\") pod \"keystone-db-sync-9xg9v\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.471798 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-operator-scripts\") pod \"barbican-9702-account-create-update-rfpn9\" (UID: \"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6\") " pod="openstack/barbican-9702-account-create-update-rfpn9" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.471822 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxtnd\" (UniqueName: \"kubernetes.io/projected/644c07f3-9ac5-48e9-92e3-641647c2e147-kube-api-access-sxtnd\") pod \"neutron-db-create-5ntp4\" (UID: \"644c07f3-9ac5-48e9-92e3-641647c2e147\") " pod="openstack/neutron-db-create-5ntp4" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.473786 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-operator-scripts\") pod 
\"barbican-db-create-ckn6b\" (UID: \"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1\") " pod="openstack/barbican-db-create-ckn6b" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.474228 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-operator-scripts\") pod \"barbican-9702-account-create-update-rfpn9\" (UID: \"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6\") " pod="openstack/barbican-9702-account-create-update-rfpn9" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.494594 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sdlt\" (UniqueName: \"kubernetes.io/projected/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-kube-api-access-7sdlt\") pod \"barbican-9702-account-create-update-rfpn9\" (UID: \"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6\") " pod="openstack/barbican-9702-account-create-update-rfpn9" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.494938 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tc94\" (UniqueName: \"kubernetes.io/projected/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-kube-api-access-4tc94\") pod \"barbican-db-create-ckn6b\" (UID: \"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1\") " pod="openstack/barbican-db-create-ckn6b" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.518591 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-vkqqn"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.537826 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-vkqqn"] Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.573736 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sntwv\" (UniqueName: \"kubernetes.io/projected/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-kube-api-access-sntwv\") pod 
\"neutron-6a31-account-create-update-5x6jh\" (UID: \"8e6df4ba-4af0-46df-a43a-ff8737d7de9c\") " pod="openstack/neutron-6a31-account-create-update-5x6jh" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.573782 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-operator-scripts\") pod \"neutron-6a31-account-create-update-5x6jh\" (UID: \"8e6df4ba-4af0-46df-a43a-ff8737d7de9c\") " pod="openstack/neutron-6a31-account-create-update-5x6jh" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.573824 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crgmq\" (UniqueName: \"kubernetes.io/projected/41652b04-a199-4ceb-a87d-5734867a31b5-kube-api-access-crgmq\") pod \"keystone-db-sync-9xg9v\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.573855 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-combined-ca-bundle\") pod \"keystone-db-sync-9xg9v\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.573899 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxtnd\" (UniqueName: \"kubernetes.io/projected/644c07f3-9ac5-48e9-92e3-641647c2e147-kube-api-access-sxtnd\") pod \"neutron-db-create-5ntp4\" (UID: \"644c07f3-9ac5-48e9-92e3-641647c2e147\") " pod="openstack/neutron-db-create-5ntp4" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.573935 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-config-data\") pod \"keystone-db-sync-9xg9v\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.573954 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/644c07f3-9ac5-48e9-92e3-641647c2e147-operator-scripts\") pod \"neutron-db-create-5ntp4\" (UID: \"644c07f3-9ac5-48e9-92e3-641647c2e147\") " pod="openstack/neutron-db-create-5ntp4" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.574664 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/644c07f3-9ac5-48e9-92e3-641647c2e147-operator-scripts\") pod \"neutron-db-create-5ntp4\" (UID: \"644c07f3-9ac5-48e9-92e3-641647c2e147\") " pod="openstack/neutron-db-create-5ntp4" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.580014 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-config-data\") pod \"keystone-db-sync-9xg9v\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.583702 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-combined-ca-bundle\") pod \"keystone-db-sync-9xg9v\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.596026 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxtnd\" (UniqueName: \"kubernetes.io/projected/644c07f3-9ac5-48e9-92e3-641647c2e147-kube-api-access-sxtnd\") pod \"neutron-db-create-5ntp4\" (UID: 
\"644c07f3-9ac5-48e9-92e3-641647c2e147\") " pod="openstack/neutron-db-create-5ntp4" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.596469 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crgmq\" (UniqueName: \"kubernetes.io/projected/41652b04-a199-4ceb-a87d-5734867a31b5-kube-api-access-crgmq\") pod \"keystone-db-sync-9xg9v\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.647516 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9702-account-create-update-rfpn9" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.663308 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ckn6b" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.674838 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.675586 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sntwv\" (UniqueName: \"kubernetes.io/projected/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-kube-api-access-sntwv\") pod \"neutron-6a31-account-create-update-5x6jh\" (UID: \"8e6df4ba-4af0-46df-a43a-ff8737d7de9c\") " pod="openstack/neutron-6a31-account-create-update-5x6jh" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.675630 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-operator-scripts\") pod \"neutron-6a31-account-create-update-5x6jh\" (UID: \"8e6df4ba-4af0-46df-a43a-ff8737d7de9c\") " pod="openstack/neutron-6a31-account-create-update-5x6jh" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.676494 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-operator-scripts\") pod \"neutron-6a31-account-create-update-5x6jh\" (UID: \"8e6df4ba-4af0-46df-a43a-ff8737d7de9c\") " pod="openstack/neutron-6a31-account-create-update-5x6jh" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.685607 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5ntp4" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.694778 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sntwv\" (UniqueName: \"kubernetes.io/projected/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-kube-api-access-sntwv\") pod \"neutron-6a31-account-create-update-5x6jh\" (UID: \"8e6df4ba-4af0-46df-a43a-ff8737d7de9c\") " pod="openstack/neutron-6a31-account-create-update-5x6jh" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.743614 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6a31-account-create-update-5x6jh" Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.922302 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bb44-account-create-update-xkl5l"] Feb 26 16:02:26 crc kubenswrapper[5033]: W0226 16:02:26.936557 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d1d3eb9_d4fb_45e6_a12f_2a4823dee95d.slice/crio-c46709c5d4d6e6b2c830c302ea89dc5637b6b65445a2353df4541f552403bd25 WatchSource:0}: Error finding container c46709c5d4d6e6b2c830c302ea89dc5637b6b65445a2353df4541f552403bd25: Status 404 returned error can't find the container with id c46709c5d4d6e6b2c830c302ea89dc5637b6b65445a2353df4541f552403bd25 Feb 26 16:02:26 crc kubenswrapper[5033]: I0226 16:02:26.999140 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6kvjv"] Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 
16:02:27.220374 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9702-account-create-update-rfpn9"] Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.354738 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9xg9v"] Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.458515 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ckn6b"] Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.482363 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xg9v" event={"ID":"41652b04-a199-4ceb-a87d-5734867a31b5","Type":"ContainerStarted","Data":"8d9b8ca6e7dc6f79c1b915953143796980ab8246c4535c666e28979c2b2f5058"} Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.495869 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6kvjv" event={"ID":"494e1101-248e-4991-82f0-03bbec4fb3c2","Type":"ContainerStarted","Data":"172da1142b99165336a234020b93a79cc90ba02e0907932c242923af1b6dd5cf"} Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.495942 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6kvjv" event={"ID":"494e1101-248e-4991-82f0-03bbec4fb3c2","Type":"ContainerStarted","Data":"2d6c6ef90d46fb4a9f1069885c9fdf9167ee83297864641f9ad438254a16e723"} Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.498746 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bb44-account-create-update-xkl5l" event={"ID":"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d","Type":"ContainerStarted","Data":"67e9048d9f60698463f280ec768071214417ebd7eeb7838d27122da3960e1adf"} Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.498785 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bb44-account-create-update-xkl5l" 
event={"ID":"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d","Type":"ContainerStarted","Data":"c46709c5d4d6e6b2c830c302ea89dc5637b6b65445a2353df4541f552403bd25"} Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.504783 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6a31-account-create-update-5x6jh"] Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.505437 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9702-account-create-update-rfpn9" event={"ID":"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6","Type":"ContainerStarted","Data":"8048b99d6102184bc815ed0caa1b92fdffca1a1e35dc184d3381ce81ff22a083"} Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.508615 5033 generic.go:334] "Generic (PLEG): container finished" podID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" containerID="56eb026c665ccccb999987a07c3068aa61f0d3ee6e4cefc01258c36598b9f051" exitCode=0 Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.508655 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" event={"ID":"b76c988f-7123-4c0c-84a2-44ae8ffb7898","Type":"ContainerDied","Data":"56eb026c665ccccb999987a07c3068aa61f0d3ee6e4cefc01258c36598b9f051"} Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.521248 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5ntp4"] Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.525686 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-6kvjv" podStartSLOduration=2.525661524 podStartE2EDuration="2.525661524s" podCreationTimestamp="2026-02-26 16:02:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:27.51978852 +0000 UTC m=+1274.732387170" watchObservedRunningTime="2026-02-26 16:02:27.525661524 +0000 UTC m=+1274.738260174" Feb 26 16:02:27 crc kubenswrapper[5033]: W0226 
16:02:27.538402 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod644c07f3_9ac5_48e9_92e3_641647c2e147.slice/crio-0fb3b05762cb9a3db01b1ddc2b1f6f703ef736d9d02fba3477049a8a4d5064de WatchSource:0}: Error finding container 0fb3b05762cb9a3db01b1ddc2b1f6f703ef736d9d02fba3477049a8a4d5064de: Status 404 returned error can't find the container with id 0fb3b05762cb9a3db01b1ddc2b1f6f703ef736d9d02fba3477049a8a4d5064de Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.548382 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-bb44-account-create-update-xkl5l" podStartSLOduration=2.548360058 podStartE2EDuration="2.548360058s" podCreationTimestamp="2026-02-26 16:02:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:27.547100589 +0000 UTC m=+1274.759699239" watchObservedRunningTime="2026-02-26 16:02:27.548360058 +0000 UTC m=+1274.760958708" Feb 26 16:02:27 crc kubenswrapper[5033]: I0226 16:02:27.641607 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295a78fa-b18a-44d1-ba84-9c325d83bda9" path="/var/lib/kubelet/pods/295a78fa-b18a-44d1-ba84-9c325d83bda9/volumes" Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.517910 5033 generic.go:334] "Generic (PLEG): container finished" podID="644c07f3-9ac5-48e9-92e3-641647c2e147" containerID="b14df90635c8f9f683dd06ebec2c2537674122a3b91f3dd889cb1e109a2874d9" exitCode=0 Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.517963 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5ntp4" event={"ID":"644c07f3-9ac5-48e9-92e3-641647c2e147","Type":"ContainerDied","Data":"b14df90635c8f9f683dd06ebec2c2537674122a3b91f3dd889cb1e109a2874d9"} Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.518008 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/neutron-db-create-5ntp4" event={"ID":"644c07f3-9ac5-48e9-92e3-641647c2e147","Type":"ContainerStarted","Data":"0fb3b05762cb9a3db01b1ddc2b1f6f703ef736d9d02fba3477049a8a4d5064de"} Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.520244 5033 generic.go:334] "Generic (PLEG): container finished" podID="a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6" containerID="ba8083c3d660d1cba09283dbdbdc8e9d22c0f6c25c7dc930e73006ee1f1e2a16" exitCode=0 Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.520339 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9702-account-create-update-rfpn9" event={"ID":"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6","Type":"ContainerDied","Data":"ba8083c3d660d1cba09283dbdbdc8e9d22c0f6c25c7dc930e73006ee1f1e2a16"} Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.522812 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" event={"ID":"b76c988f-7123-4c0c-84a2-44ae8ffb7898","Type":"ContainerStarted","Data":"1257f20356492fc3f99a4c76c10244dac99e5fd97cd925d186c71d4e21948a5d"} Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.523654 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.525596 5033 generic.go:334] "Generic (PLEG): container finished" podID="8e6df4ba-4af0-46df-a43a-ff8737d7de9c" containerID="7afaf1da9f235b64e0b8813df9a957f0a8325406367387aae716204d0ae91726" exitCode=0 Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.525648 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6a31-account-create-update-5x6jh" event={"ID":"8e6df4ba-4af0-46df-a43a-ff8737d7de9c","Type":"ContainerDied","Data":"7afaf1da9f235b64e0b8813df9a957f0a8325406367387aae716204d0ae91726"} Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.525668 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-6a31-account-create-update-5x6jh" event={"ID":"8e6df4ba-4af0-46df-a43a-ff8737d7de9c","Type":"ContainerStarted","Data":"2c770d825ebc53f6d8a90c56e3d4dee004c046dbbb35d7abc52a7cdc9cacb7ff"} Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.528905 5033 generic.go:334] "Generic (PLEG): container finished" podID="cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1" containerID="a305a66c7d239ac14f07cf5803873ef4155ad10317dad1eda014aca0549ea266" exitCode=0 Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.528965 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ckn6b" event={"ID":"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1","Type":"ContainerDied","Data":"a305a66c7d239ac14f07cf5803873ef4155ad10317dad1eda014aca0549ea266"} Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.529038 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ckn6b" event={"ID":"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1","Type":"ContainerStarted","Data":"f89e894dff7998c10da760258fc7672f5b2f34d7a8e014f0658a94fb29b35615"} Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.530910 5033 generic.go:334] "Generic (PLEG): container finished" podID="494e1101-248e-4991-82f0-03bbec4fb3c2" containerID="172da1142b99165336a234020b93a79cc90ba02e0907932c242923af1b6dd5cf" exitCode=0 Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.530956 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6kvjv" event={"ID":"494e1101-248e-4991-82f0-03bbec4fb3c2","Type":"ContainerDied","Data":"172da1142b99165336a234020b93a79cc90ba02e0907932c242923af1b6dd5cf"} Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.533157 5033 generic.go:334] "Generic (PLEG): container finished" podID="6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d" containerID="67e9048d9f60698463f280ec768071214417ebd7eeb7838d27122da3960e1adf" exitCode=0 Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.533194 5033 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-bb44-account-create-update-xkl5l" event={"ID":"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d","Type":"ContainerDied","Data":"67e9048d9f60698463f280ec768071214417ebd7eeb7838d27122da3960e1adf"} Feb 26 16:02:28 crc kubenswrapper[5033]: I0226 16:02:28.612854 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" podStartSLOduration=4.612825563 podStartE2EDuration="4.612825563s" podCreationTimestamp="2026-02-26 16:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:28.603031561 +0000 UTC m=+1275.815630211" watchObservedRunningTime="2026-02-26 16:02:28.612825563 +0000 UTC m=+1275.825424213" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.074217 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6a31-account-create-update-5x6jh" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.083570 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9702-account-create-update-rfpn9" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.102076 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5ntp4" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.115500 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bb44-account-create-update-xkl5l" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.116155 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6kvjv" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.116622 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ckn6b" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.186399 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sdlt\" (UniqueName: \"kubernetes.io/projected/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-kube-api-access-7sdlt\") pod \"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6\" (UID: \"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.186605 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sntwv\" (UniqueName: \"kubernetes.io/projected/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-kube-api-access-sntwv\") pod \"8e6df4ba-4af0-46df-a43a-ff8737d7de9c\" (UID: \"8e6df4ba-4af0-46df-a43a-ff8737d7de9c\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.186664 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-operator-scripts\") pod \"8e6df4ba-4af0-46df-a43a-ff8737d7de9c\" (UID: \"8e6df4ba-4af0-46df-a43a-ff8737d7de9c\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.186782 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-operator-scripts\") pod \"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6\" (UID: \"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.188147 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6" (UID: "a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.188703 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e6df4ba-4af0-46df-a43a-ff8737d7de9c" (UID: "8e6df4ba-4af0-46df-a43a-ff8737d7de9c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.193565 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-kube-api-access-7sdlt" (OuterVolumeSpecName: "kube-api-access-7sdlt") pod "a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6" (UID: "a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6"). InnerVolumeSpecName "kube-api-access-7sdlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.196963 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-kube-api-access-sntwv" (OuterVolumeSpecName: "kube-api-access-sntwv") pod "8e6df4ba-4af0-46df-a43a-ff8737d7de9c" (UID: "8e6df4ba-4af0-46df-a43a-ff8737d7de9c"). InnerVolumeSpecName "kube-api-access-sntwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.287814 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzbqf\" (UniqueName: \"kubernetes.io/projected/494e1101-248e-4991-82f0-03bbec4fb3c2-kube-api-access-bzbqf\") pod \"494e1101-248e-4991-82f0-03bbec4fb3c2\" (UID: \"494e1101-248e-4991-82f0-03bbec4fb3c2\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.287868 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxtnd\" (UniqueName: \"kubernetes.io/projected/644c07f3-9ac5-48e9-92e3-641647c2e147-kube-api-access-sxtnd\") pod \"644c07f3-9ac5-48e9-92e3-641647c2e147\" (UID: \"644c07f3-9ac5-48e9-92e3-641647c2e147\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.287912 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/644c07f3-9ac5-48e9-92e3-641647c2e147-operator-scripts\") pod \"644c07f3-9ac5-48e9-92e3-641647c2e147\" (UID: \"644c07f3-9ac5-48e9-92e3-641647c2e147\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.287942 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tc94\" (UniqueName: \"kubernetes.io/projected/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-kube-api-access-4tc94\") pod \"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1\" (UID: \"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.287983 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-operator-scripts\") pod \"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d\" (UID: \"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.288095 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gtxc9\" (UniqueName: \"kubernetes.io/projected/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-kube-api-access-gtxc9\") pod \"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d\" (UID: \"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.288194 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-operator-scripts\") pod \"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1\" (UID: \"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.288244 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/494e1101-248e-4991-82f0-03bbec4fb3c2-operator-scripts\") pod \"494e1101-248e-4991-82f0-03bbec4fb3c2\" (UID: \"494e1101-248e-4991-82f0-03bbec4fb3c2\") " Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.288689 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sntwv\" (UniqueName: \"kubernetes.io/projected/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-kube-api-access-sntwv\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.288574 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/644c07f3-9ac5-48e9-92e3-641647c2e147-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "644c07f3-9ac5-48e9-92e3-641647c2e147" (UID: "644c07f3-9ac5-48e9-92e3-641647c2e147"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.288708 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e6df4ba-4af0-46df-a43a-ff8737d7de9c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.288753 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d" (UID: "6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.288769 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.288843 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sdlt\" (UniqueName: \"kubernetes.io/projected/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6-kube-api-access-7sdlt\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.289140 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/494e1101-248e-4991-82f0-03bbec4fb3c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "494e1101-248e-4991-82f0-03bbec4fb3c2" (UID: "494e1101-248e-4991-82f0-03bbec4fb3c2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.289796 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1" (UID: "cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.302329 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/494e1101-248e-4991-82f0-03bbec4fb3c2-kube-api-access-bzbqf" (OuterVolumeSpecName: "kube-api-access-bzbqf") pod "494e1101-248e-4991-82f0-03bbec4fb3c2" (UID: "494e1101-248e-4991-82f0-03bbec4fb3c2"). InnerVolumeSpecName "kube-api-access-bzbqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.302454 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-kube-api-access-gtxc9" (OuterVolumeSpecName: "kube-api-access-gtxc9") pod "6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d" (UID: "6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d"). InnerVolumeSpecName "kube-api-access-gtxc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.302484 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644c07f3-9ac5-48e9-92e3-641647c2e147-kube-api-access-sxtnd" (OuterVolumeSpecName: "kube-api-access-sxtnd") pod "644c07f3-9ac5-48e9-92e3-641647c2e147" (UID: "644c07f3-9ac5-48e9-92e3-641647c2e147"). InnerVolumeSpecName "kube-api-access-sxtnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.303837 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-kube-api-access-4tc94" (OuterVolumeSpecName: "kube-api-access-4tc94") pod "cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1" (UID: "cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1"). InnerVolumeSpecName "kube-api-access-4tc94". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.390953 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzbqf\" (UniqueName: \"kubernetes.io/projected/494e1101-248e-4991-82f0-03bbec4fb3c2-kube-api-access-bzbqf\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.391006 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxtnd\" (UniqueName: \"kubernetes.io/projected/644c07f3-9ac5-48e9-92e3-641647c2e147-kube-api-access-sxtnd\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.391024 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/644c07f3-9ac5-48e9-92e3-641647c2e147-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.391040 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tc94\" (UniqueName: \"kubernetes.io/projected/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-kube-api-access-4tc94\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.391055 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.391067 5033 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtxc9\" (UniqueName: \"kubernetes.io/projected/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d-kube-api-access-gtxc9\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.391079 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.391090 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/494e1101-248e-4991-82f0-03bbec4fb3c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.569088 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5ntp4" event={"ID":"644c07f3-9ac5-48e9-92e3-641647c2e147","Type":"ContainerDied","Data":"0fb3b05762cb9a3db01b1ddc2b1f6f703ef736d9d02fba3477049a8a4d5064de"} Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.569512 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fb3b05762cb9a3db01b1ddc2b1f6f703ef736d9d02fba3477049a8a4d5064de" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.569144 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5ntp4" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.571459 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9702-account-create-update-rfpn9" event={"ID":"a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6","Type":"ContainerDied","Data":"8048b99d6102184bc815ed0caa1b92fdffca1a1e35dc184d3381ce81ff22a083"} Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.571524 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8048b99d6102184bc815ed0caa1b92fdffca1a1e35dc184d3381ce81ff22a083" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.571482 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9702-account-create-update-rfpn9" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.573916 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xg9v" event={"ID":"41652b04-a199-4ceb-a87d-5734867a31b5","Type":"ContainerStarted","Data":"27390461cfcd9e9921cea3efef5e382089d375c8f6c002d59c240bf21f61222b"} Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.578158 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6a31-account-create-update-5x6jh" event={"ID":"8e6df4ba-4af0-46df-a43a-ff8737d7de9c","Type":"ContainerDied","Data":"2c770d825ebc53f6d8a90c56e3d4dee004c046dbbb35d7abc52a7cdc9cacb7ff"} Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.578202 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c770d825ebc53f6d8a90c56e3d4dee004c046dbbb35d7abc52a7cdc9cacb7ff" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.578299 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6a31-account-create-update-5x6jh" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.586841 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ckn6b" event={"ID":"cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1","Type":"ContainerDied","Data":"f89e894dff7998c10da760258fc7672f5b2f34d7a8e014f0658a94fb29b35615"} Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.586884 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f89e894dff7998c10da760258fc7672f5b2f34d7a8e014f0658a94fb29b35615" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.586967 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ckn6b" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.588944 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6kvjv" event={"ID":"494e1101-248e-4991-82f0-03bbec4fb3c2","Type":"ContainerDied","Data":"2d6c6ef90d46fb4a9f1069885c9fdf9167ee83297864641f9ad438254a16e723"} Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.588980 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6kvjv" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.588987 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d6c6ef90d46fb4a9f1069885c9fdf9167ee83297864641f9ad438254a16e723" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.590270 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bb44-account-create-update-xkl5l" event={"ID":"6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d","Type":"ContainerDied","Data":"c46709c5d4d6e6b2c830c302ea89dc5637b6b65445a2353df4541f552403bd25"} Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.590297 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c46709c5d4d6e6b2c830c302ea89dc5637b6b65445a2353df4541f552403bd25" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.590363 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bb44-account-create-update-xkl5l" Feb 26 16:02:32 crc kubenswrapper[5033]: I0226 16:02:32.597948 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-9xg9v" podStartSLOduration=1.792162833 podStartE2EDuration="6.597919502s" podCreationTimestamp="2026-02-26 16:02:26 +0000 UTC" firstStartedPulling="2026-02-26 16:02:27.365215527 +0000 UTC m=+1274.577814167" lastFinishedPulling="2026-02-26 16:02:32.170972186 +0000 UTC m=+1279.383570836" observedRunningTime="2026-02-26 16:02:32.594907374 +0000 UTC m=+1279.807506024" watchObservedRunningTime="2026-02-26 16:02:32.597919502 +0000 UTC m=+1279.810518152" Feb 26 16:02:35 crc kubenswrapper[5033]: I0226 16:02:35.365232 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:02:35 crc kubenswrapper[5033]: I0226 16:02:35.428694 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9hpw2"] Feb 26 
16:02:35 crc kubenswrapper[5033]: I0226 16:02:35.428976 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" podUID="04084062-c67c-4433-bc9c-76203d0a326e" containerName="dnsmasq-dns" containerID="cri-o://508a044c3f1d584f9c821bc289398bcdb7799b7fe7c2e60db6edf5ef87d59c85" gracePeriod=10 Feb 26 16:02:35 crc kubenswrapper[5033]: I0226 16:02:35.675806 5033 generic.go:334] "Generic (PLEG): container finished" podID="04084062-c67c-4433-bc9c-76203d0a326e" containerID="508a044c3f1d584f9c821bc289398bcdb7799b7fe7c2e60db6edf5ef87d59c85" exitCode=0 Feb 26 16:02:35 crc kubenswrapper[5033]: I0226 16:02:35.675899 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" event={"ID":"04084062-c67c-4433-bc9c-76203d0a326e","Type":"ContainerDied","Data":"508a044c3f1d584f9c821bc289398bcdb7799b7fe7c2e60db6edf5ef87d59c85"} Feb 26 16:02:35 crc kubenswrapper[5033]: I0226 16:02:35.928416 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.078867 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-config\") pod \"04084062-c67c-4433-bc9c-76203d0a326e\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.078973 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-dns-svc\") pod \"04084062-c67c-4433-bc9c-76203d0a326e\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.079102 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-sb\") pod \"04084062-c67c-4433-bc9c-76203d0a326e\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.079139 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-nb\") pod \"04084062-c67c-4433-bc9c-76203d0a326e\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.079173 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzdx7\" (UniqueName: \"kubernetes.io/projected/04084062-c67c-4433-bc9c-76203d0a326e-kube-api-access-lzdx7\") pod \"04084062-c67c-4433-bc9c-76203d0a326e\" (UID: \"04084062-c67c-4433-bc9c-76203d0a326e\") " Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.084886 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/04084062-c67c-4433-bc9c-76203d0a326e-kube-api-access-lzdx7" (OuterVolumeSpecName: "kube-api-access-lzdx7") pod "04084062-c67c-4433-bc9c-76203d0a326e" (UID: "04084062-c67c-4433-bc9c-76203d0a326e"). InnerVolumeSpecName "kube-api-access-lzdx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.124910 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-config" (OuterVolumeSpecName: "config") pod "04084062-c67c-4433-bc9c-76203d0a326e" (UID: "04084062-c67c-4433-bc9c-76203d0a326e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.126026 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04084062-c67c-4433-bc9c-76203d0a326e" (UID: "04084062-c67c-4433-bc9c-76203d0a326e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.128982 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04084062-c67c-4433-bc9c-76203d0a326e" (UID: "04084062-c67c-4433-bc9c-76203d0a326e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.133742 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04084062-c67c-4433-bc9c-76203d0a326e" (UID: "04084062-c67c-4433-bc9c-76203d0a326e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.182052 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.182097 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.182110 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.182120 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzdx7\" (UniqueName: \"kubernetes.io/projected/04084062-c67c-4433-bc9c-76203d0a326e-kube-api-access-lzdx7\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.182129 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04084062-c67c-4433-bc9c-76203d0a326e-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.692623 5033 generic.go:334] "Generic (PLEG): container finished" podID="41652b04-a199-4ceb-a87d-5734867a31b5" containerID="27390461cfcd9e9921cea3efef5e382089d375c8f6c002d59c240bf21f61222b" exitCode=0 Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.692754 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xg9v" event={"ID":"41652b04-a199-4ceb-a87d-5734867a31b5","Type":"ContainerDied","Data":"27390461cfcd9e9921cea3efef5e382089d375c8f6c002d59c240bf21f61222b"} Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 
16:02:36.696605 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" event={"ID":"04084062-c67c-4433-bc9c-76203d0a326e","Type":"ContainerDied","Data":"4428b605e4cfe2ed9a3d36e1d8ca1c01f3852b34aa96816796aeafe8965bf22d"} Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.696660 5033 scope.go:117] "RemoveContainer" containerID="508a044c3f1d584f9c821bc289398bcdb7799b7fe7c2e60db6edf5ef87d59c85" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.696807 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-9hpw2" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.721116 5033 scope.go:117] "RemoveContainer" containerID="b984b0734b8df81526f145dcf503a31e10508a24ee58e3c5357cfa3ee5c427e7" Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.738657 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9hpw2"] Feb 26 16:02:36 crc kubenswrapper[5033]: I0226 16:02:36.747329 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-9hpw2"] Feb 26 16:02:37 crc kubenswrapper[5033]: I0226 16:02:37.638761 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04084062-c67c-4433-bc9c-76203d0a326e" path="/var/lib/kubelet/pods/04084062-c67c-4433-bc9c-76203d0a326e/volumes" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.045955 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.216689 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-config-data\") pod \"41652b04-a199-4ceb-a87d-5734867a31b5\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.216747 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-combined-ca-bundle\") pod \"41652b04-a199-4ceb-a87d-5734867a31b5\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.216870 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crgmq\" (UniqueName: \"kubernetes.io/projected/41652b04-a199-4ceb-a87d-5734867a31b5-kube-api-access-crgmq\") pod \"41652b04-a199-4ceb-a87d-5734867a31b5\" (UID: \"41652b04-a199-4ceb-a87d-5734867a31b5\") " Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.221967 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41652b04-a199-4ceb-a87d-5734867a31b5-kube-api-access-crgmq" (OuterVolumeSpecName: "kube-api-access-crgmq") pod "41652b04-a199-4ceb-a87d-5734867a31b5" (UID: "41652b04-a199-4ceb-a87d-5734867a31b5"). InnerVolumeSpecName "kube-api-access-crgmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.242925 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41652b04-a199-4ceb-a87d-5734867a31b5" (UID: "41652b04-a199-4ceb-a87d-5734867a31b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.262525 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-config-data" (OuterVolumeSpecName: "config-data") pod "41652b04-a199-4ceb-a87d-5734867a31b5" (UID: "41652b04-a199-4ceb-a87d-5734867a31b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.318413 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crgmq\" (UniqueName: \"kubernetes.io/projected/41652b04-a199-4ceb-a87d-5734867a31b5-kube-api-access-crgmq\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.318451 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.318460 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41652b04-a199-4ceb-a87d-5734867a31b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.730117 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xg9v" event={"ID":"41652b04-a199-4ceb-a87d-5734867a31b5","Type":"ContainerDied","Data":"8d9b8ca6e7dc6f79c1b915953143796980ab8246c4535c666e28979c2b2f5058"} Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.730166 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d9b8ca6e7dc6f79c1b915953143796980ab8246c4535c666e28979c2b2f5058" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.730218 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9xg9v" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.945878 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-4jwcx"] Feb 26 16:02:38 crc kubenswrapper[5033]: E0226 16:02:38.946620 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41652b04-a199-4ceb-a87d-5734867a31b5" containerName="keystone-db-sync" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.946644 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="41652b04-a199-4ceb-a87d-5734867a31b5" containerName="keystone-db-sync" Feb 26 16:02:38 crc kubenswrapper[5033]: E0226 16:02:38.946663 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04084062-c67c-4433-bc9c-76203d0a326e" containerName="init" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.946671 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="04084062-c67c-4433-bc9c-76203d0a326e" containerName="init" Feb 26 16:02:38 crc kubenswrapper[5033]: E0226 16:02:38.946692 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6" containerName="mariadb-account-create-update" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.946701 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6" containerName="mariadb-account-create-update" Feb 26 16:02:38 crc kubenswrapper[5033]: E0226 16:02:38.946715 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04084062-c67c-4433-bc9c-76203d0a326e" containerName="dnsmasq-dns" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.946725 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="04084062-c67c-4433-bc9c-76203d0a326e" containerName="dnsmasq-dns" Feb 26 16:02:38 crc kubenswrapper[5033]: E0226 16:02:38.946737 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="494e1101-248e-4991-82f0-03bbec4fb3c2" 
containerName="mariadb-database-create" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.946744 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="494e1101-248e-4991-82f0-03bbec4fb3c2" containerName="mariadb-database-create" Feb 26 16:02:38 crc kubenswrapper[5033]: E0226 16:02:38.946759 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6df4ba-4af0-46df-a43a-ff8737d7de9c" containerName="mariadb-account-create-update" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.946767 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6df4ba-4af0-46df-a43a-ff8737d7de9c" containerName="mariadb-account-create-update" Feb 26 16:02:38 crc kubenswrapper[5033]: E0226 16:02:38.946782 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1" containerName="mariadb-database-create" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.946789 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1" containerName="mariadb-database-create" Feb 26 16:02:38 crc kubenswrapper[5033]: E0226 16:02:38.946801 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="644c07f3-9ac5-48e9-92e3-641647c2e147" containerName="mariadb-database-create" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.946809 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="644c07f3-9ac5-48e9-92e3-641647c2e147" containerName="mariadb-database-create" Feb 26 16:02:38 crc kubenswrapper[5033]: E0226 16:02:38.946831 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d" containerName="mariadb-account-create-update" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.946839 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d" containerName="mariadb-account-create-update" Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.947038 5033 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6" containerName="mariadb-account-create-update"
Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.947059 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="04084062-c67c-4433-bc9c-76203d0a326e" containerName="dnsmasq-dns"
Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.947073 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="644c07f3-9ac5-48e9-92e3-641647c2e147" containerName="mariadb-database-create"
Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.947090 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e6df4ba-4af0-46df-a43a-ff8737d7de9c" containerName="mariadb-account-create-update"
Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.947110 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="41652b04-a199-4ceb-a87d-5734867a31b5" containerName="keystone-db-sync"
Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.947123 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="494e1101-248e-4991-82f0-03bbec4fb3c2" containerName="mariadb-database-create"
Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.947132 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1" containerName="mariadb-database-create"
Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.947141 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d" containerName="mariadb-account-create-update"
Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.949195 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:38 crc kubenswrapper[5033]: I0226 16:02:38.999270 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-4jwcx"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.015876 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fv28z"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.017121 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.024292 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.024371 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.024548 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.024567 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pg9lf"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.024775 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.026490 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fv28z"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.140504 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.140575 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftzmv\" (UniqueName: \"kubernetes.io/projected/ac6811eb-fca9-4942-996c-16807de04775-kube-api-access-ftzmv\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.140599 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-fernet-keys\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.140732 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-config-data\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.140815 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-combined-ca-bundle\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.141077 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7d4p\" (UniqueName: \"kubernetes.io/projected/7dfe9d70-c541-4d56-9d1d-0d1debb11025-kube-api-access-l7d4p\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.141178 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.141199 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.141231 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-scripts\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.141269 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-credential-keys\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.141318 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-config\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.141353 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242289 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242527 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftzmv\" (UniqueName: \"kubernetes.io/projected/ac6811eb-fca9-4942-996c-16807de04775-kube-api-access-ftzmv\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242571 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-fernet-keys\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242598 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-config-data\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242624 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-combined-ca-bundle\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242682 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7d4p\" (UniqueName: \"kubernetes.io/projected/7dfe9d70-c541-4d56-9d1d-0d1debb11025-kube-api-access-l7d4p\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242716 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242733 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242750 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-scripts\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242765 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-credential-keys\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242783 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-config\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242799 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.242836 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.243682 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.244641 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.245082 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.248446 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-fernet-keys\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.249048 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-config\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.249092 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.249599 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.250501 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-credential-keys\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.250682 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-config-data\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.253974 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-combined-ca-bundle\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.260629 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-scripts\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.260962 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.261072 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.276686 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.289454 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-lkf9s"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.293086 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7d4p\" (UniqueName: \"kubernetes.io/projected/7dfe9d70-c541-4d56-9d1d-0d1debb11025-kube-api-access-l7d4p\") pod \"keystone-bootstrap-fv28z\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.295100 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftzmv\" (UniqueName: \"kubernetes.io/projected/ac6811eb-fca9-4942-996c-16807de04775-kube-api-access-ftzmv\") pod \"dnsmasq-dns-bbf5cc879-4jwcx\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.306588 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.312467 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-794tw"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.312643 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.312969 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.326972 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9bknn"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.328021 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9bknn"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.342604 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.342798 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gbh42"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.342910 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.343491 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.343867 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.343917 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-scripts\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.343945 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-config-data\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.344043 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcm4l\" (UniqueName: \"kubernetes.io/projected/689edd77-629b-4ce9-9392-f52fb34a7a1e-kube-api-access-bcm4l\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.344063 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-log-httpd\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.344099 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.344116 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-run-httpd\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.351909 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lkf9s"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.440637 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9bknn"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.446703 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-db-sync-config-data\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.446764 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-config\") pod \"neutron-db-sync-9bknn\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " pod="openstack/neutron-db-sync-9bknn"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.446799 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-config-data\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.446834 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.446858 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b86b\" (UniqueName: \"kubernetes.io/projected/cf479687-fa77-4c8b-95ee-23b30c858b35-kube-api-access-9b86b\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.446887 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-run-httpd\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.446931 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.446970 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-scripts\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.446998 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-config-data\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.447037 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-combined-ca-bundle\") pod \"neutron-db-sync-9bknn\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " pod="openstack/neutron-db-sync-9bknn"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.447061 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf479687-fa77-4c8b-95ee-23b30c858b35-etc-machine-id\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.447100 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5jhv\" (UniqueName: \"kubernetes.io/projected/6410a07e-efb2-4773-8007-688038a9898e-kube-api-access-s5jhv\") pod \"neutron-db-sync-9bknn\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " pod="openstack/neutron-db-sync-9bknn"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.447125 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-scripts\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.447166 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-combined-ca-bundle\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.447193 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcm4l\" (UniqueName: \"kubernetes.io/projected/689edd77-629b-4ce9-9392-f52fb34a7a1e-kube-api-access-bcm4l\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.447217 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-log-httpd\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.449190 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-log-httpd\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.457793 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-run-httpd\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.458409 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.458552 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-scripts\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.461760 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.475340 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-config-data\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.487958 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcm4l\" (UniqueName: \"kubernetes.io/projected/689edd77-629b-4ce9-9392-f52fb34a7a1e-kube-api-access-bcm4l\") pod \"ceilometer-0\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.497752 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.502109 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-hb64j"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.503283 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hb64j"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.508420 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hb64j"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.510494 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.510788 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dv7jp"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.519278 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.539092 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-4jwcx"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.539831 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.551127 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-combined-ca-bundle\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.551180 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-db-sync-config-data\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.551210 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-config\") pod \"neutron-db-sync-9bknn\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " pod="openstack/neutron-db-sync-9bknn"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.551234 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-config-data\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.551251 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b86b\" (UniqueName: \"kubernetes.io/projected/cf479687-fa77-4c8b-95ee-23b30c858b35-kube-api-access-9b86b\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.551335 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-combined-ca-bundle\") pod \"neutron-db-sync-9bknn\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " pod="openstack/neutron-db-sync-9bknn"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.551351 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf479687-fa77-4c8b-95ee-23b30c858b35-etc-machine-id\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.551379 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5jhv\" (UniqueName: \"kubernetes.io/projected/6410a07e-efb2-4773-8007-688038a9898e-kube-api-access-s5jhv\") pod \"neutron-db-sync-9bknn\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " pod="openstack/neutron-db-sync-9bknn"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.551399 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-scripts\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.553362 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d7qv5"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.555391 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.555906 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-scripts\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.555975 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf479687-fa77-4c8b-95ee-23b30c858b35-etc-machine-id\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.557296 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-combined-ca-bundle\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.559051 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-db-sync-config-data\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.560169 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-combined-ca-bundle\") pod \"neutron-db-sync-9bknn\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " pod="openstack/neutron-db-sync-9bknn"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.561175 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-config-data\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.568718 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-config\") pod \"neutron-db-sync-9bknn\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " pod="openstack/neutron-db-sync-9bknn"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.589637 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b86b\" (UniqueName: \"kubernetes.io/projected/cf479687-fa77-4c8b-95ee-23b30c858b35-kube-api-access-9b86b\") pod \"cinder-db-sync-lkf9s\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " pod="openstack/cinder-db-sync-lkf9s"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.593966 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5jhv\" (UniqueName: \"kubernetes.io/projected/6410a07e-efb2-4773-8007-688038a9898e-kube-api-access-s5jhv\") pod \"neutron-db-sync-9bknn\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " pod="openstack/neutron-db-sync-9bknn"
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.610871 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d7qv5"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.650086 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-kptqh"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.650973 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kptqh"]
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.651047 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kptqh" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.653795 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.653996 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4mxvv" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.654587 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-logs\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.654672 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-scripts\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.654711 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9nxq\" (UniqueName: \"kubernetes.io/projected/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-kube-api-access-l9nxq\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.654751 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxcbg\" (UniqueName: \"kubernetes.io/projected/9776d976-a39c-4d6a-ac1b-d2a03164c21e-kube-api-access-sxcbg\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" 
Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.654791 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-config-data\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.654841 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-combined-ca-bundle\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.654893 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.654923 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.654977 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " 
pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.655006 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-config\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.655038 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.801786 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.801956 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-combined-ca-bundle\") pod \"barbican-db-sync-kptqh\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") " pod="openstack/barbican-db-sync-kptqh" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802085 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-logs\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 
26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802191 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-scripts\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802217 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-db-sync-config-data\") pod \"barbican-db-sync-kptqh\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") " pod="openstack/barbican-db-sync-kptqh" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802280 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9nxq\" (UniqueName: \"kubernetes.io/projected/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-kube-api-access-l9nxq\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802307 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxcbg\" (UniqueName: \"kubernetes.io/projected/9776d976-a39c-4d6a-ac1b-d2a03164c21e-kube-api-access-sxcbg\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802358 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-config-data\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802403 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-combined-ca-bundle\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802447 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb4dj\" (UniqueName: \"kubernetes.io/projected/bc977b82-241c-4c92-9755-877433f3e4de-kube-api-access-hb4dj\") pod \"barbican-db-sync-kptqh\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") " pod="openstack/barbican-db-sync-kptqh" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802483 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802513 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802546 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.802581 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-config\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.804569 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-config\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.805247 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.807369 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.807938 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.808634 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.810214 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lkf9s" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.812015 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-scripts\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.812157 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-combined-ca-bundle\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.815570 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-logs\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.819005 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-config-data\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.821642 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxcbg\" (UniqueName: 
\"kubernetes.io/projected/9776d976-a39c-4d6a-ac1b-d2a03164c21e-kube-api-access-sxcbg\") pod \"dnsmasq-dns-56df8fb6b7-d7qv5\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.821906 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9nxq\" (UniqueName: \"kubernetes.io/projected/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-kube-api-access-l9nxq\") pod \"placement-db-sync-hb64j\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.822586 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9bknn" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.834737 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hb64j" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.905112 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-combined-ca-bundle\") pod \"barbican-db-sync-kptqh\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") " pod="openstack/barbican-db-sync-kptqh" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.905228 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-db-sync-config-data\") pod \"barbican-db-sync-kptqh\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") " pod="openstack/barbican-db-sync-kptqh" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.910033 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb4dj\" (UniqueName: \"kubernetes.io/projected/bc977b82-241c-4c92-9755-877433f3e4de-kube-api-access-hb4dj\") 
pod \"barbican-db-sync-kptqh\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") " pod="openstack/barbican-db-sync-kptqh" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.910908 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-db-sync-config-data\") pod \"barbican-db-sync-kptqh\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") " pod="openstack/barbican-db-sync-kptqh" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.911494 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-combined-ca-bundle\") pod \"barbican-db-sync-kptqh\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") " pod="openstack/barbican-db-sync-kptqh" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.913498 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.928096 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb4dj\" (UniqueName: \"kubernetes.io/projected/bc977b82-241c-4c92-9755-877433f3e4de-kube-api-access-hb4dj\") pod \"barbican-db-sync-kptqh\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") " pod="openstack/barbican-db-sync-kptqh" Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.971327 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fv28z"] Feb 26 16:02:39 crc kubenswrapper[5033]: I0226 16:02:39.974324 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kptqh" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.145916 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.147752 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.157127 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.158046 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qxt5t" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.158334 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.158452 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.158541 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.180401 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.217203 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-4jwcx"] Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.218348 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc 
kubenswrapper[5033]: I0226 16:02:40.218376 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-scripts\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.218396 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-logs\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.218426 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5r9q\" (UniqueName: \"kubernetes.io/projected/7e523d1f-9816-4bc2-9264-74fd43fba986-kube-api-access-m5r9q\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.218522 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.218546 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " 
pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.218586 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-config-data\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.218610 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.225175 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.226532 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.234410 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.234667 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.241216 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:02:40 crc kubenswrapper[5033]: W0226 16:02:40.287284 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac6811eb_fca9_4942_996c_16807de04775.slice/crio-29317aa10585cbb492a6c5a90357a4473f8c04678ba9f54a015ac5b8256844bf WatchSource:0}: Error finding container 29317aa10585cbb492a6c5a90357a4473f8c04678ba9f54a015ac5b8256844bf: Status 404 returned error can't find the container with id 29317aa10585cbb492a6c5a90357a4473f8c04678ba9f54a015ac5b8256844bf Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.319928 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.319973 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 
16:02:40.319998 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320026 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320058 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-logs\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320092 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-config-data\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320113 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320144 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320162 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-scripts\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320180 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-logs\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320198 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf6mw\" (UniqueName: \"kubernetes.io/projected/674a25b4-95f2-47c6-9c86-a002b11c214a-kube-api-access-kf6mw\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320217 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320241 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320276 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5r9q\" (UniqueName: \"kubernetes.io/projected/7e523d1f-9816-4bc2-9264-74fd43fba986-kube-api-access-m5r9q\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320328 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.320356 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.321816 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-logs\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.323151 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.323200 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.327476 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.328763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-scripts\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.329818 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-config-data\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.330715 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.339379 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5r9q\" (UniqueName: \"kubernetes.io/projected/7e523d1f-9816-4bc2-9264-74fd43fba986-kube-api-access-m5r9q\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.377418 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.423890 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.423969 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.424007 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.424031 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.424069 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-logs\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.424143 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf6mw\" (UniqueName: \"kubernetes.io/projected/674a25b4-95f2-47c6-9c86-a002b11c214a-kube-api-access-kf6mw\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.424200 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.424428 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " 
pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.425531 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.425591 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.425794 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-logs\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.430774 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.431535 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc 
kubenswrapper[5033]: I0226 16:02:40.434907 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.449905 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.451110 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf6mw\" (UniqueName: \"kubernetes.io/projected/674a25b4-95f2-47c6-9c86-a002b11c214a-kube-api-access-kf6mw\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.481116 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.488957 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.605103 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.809395 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d7qv5"] Feb 26 16:02:40 crc kubenswrapper[5033]: I0226 16:02:40.820909 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9bknn"] Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.050114 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kptqh"] Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.056566 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hb64j"] Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.154633 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fv28z" event={"ID":"7dfe9d70-c541-4d56-9d1d-0d1debb11025","Type":"ContainerStarted","Data":"821f636cc982e0875bd9d953abc8ae92ee1bf3e1a4ab78c45eca149715e8ab3b"} Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.155062 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fv28z" event={"ID":"7dfe9d70-c541-4d56-9d1d-0d1debb11025","Type":"ContainerStarted","Data":"e6d21bcdbe4667edf958d4b8f5f41f0fc57059cdd37be9f0cb236e0327a5b4d6"} Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.165485 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"689edd77-629b-4ce9-9392-f52fb34a7a1e","Type":"ContainerStarted","Data":"ecd1ae6d4c3b131b815493b82ab1b62271db20ca910cfce2408e0a644e9a8395"} Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.167750 5033 generic.go:334] "Generic (PLEG): container finished" podID="ac6811eb-fca9-4942-996c-16807de04775" containerID="714009bed495882e8a8e9344e22baae4b2f2d38f6b30a604636ee7e4e2ab7bb3" exitCode=0 Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.167803 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx" event={"ID":"ac6811eb-fca9-4942-996c-16807de04775","Type":"ContainerDied","Data":"714009bed495882e8a8e9344e22baae4b2f2d38f6b30a604636ee7e4e2ab7bb3"} Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.167823 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx" event={"ID":"ac6811eb-fca9-4942-996c-16807de04775","Type":"ContainerStarted","Data":"29317aa10585cbb492a6c5a90357a4473f8c04678ba9f54a015ac5b8256844bf"} Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.186354 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lkf9s"] Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.194003 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kptqh" event={"ID":"bc977b82-241c-4c92-9755-877433f3e4de","Type":"ContainerStarted","Data":"bc41c83925eaa20d4cd44e9c2cdc485aa5383c9d81798fa8c472bf48b122bbcb"} Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.207838 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fv28z" podStartSLOduration=3.207817422 podStartE2EDuration="3.207817422s" podCreationTimestamp="2026-02-26 16:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:41.178800644 +0000 UTC m=+1288.391399294" watchObservedRunningTime="2026-02-26 16:02:41.207817422 +0000 UTC m=+1288.420416092" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.212394 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hb64j" event={"ID":"ded3d8f6-4b47-4d1d-af78-b295f7b729a7","Type":"ContainerStarted","Data":"01943182c1e1c0840f50b25a28564679c030c4151cc1177cfdf00f3f1a76bc81"} Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.213504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" event={"ID":"9776d976-a39c-4d6a-ac1b-d2a03164c21e","Type":"ContainerStarted","Data":"0036f483e7374a4e38d9e168992b5d12633f95871b5298cca55be5606cf80319"} Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.214402 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9bknn" event={"ID":"6410a07e-efb2-4773-8007-688038a9898e","Type":"ContainerStarted","Data":"201156ef11bb9c2bf2914374469531624d7b07a0c48cd39edb333f22dc3ea498"} Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.272386 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.474804 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:02:41 crc kubenswrapper[5033]: W0226 16:02:41.501598 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod674a25b4_95f2_47c6_9c86_a002b11c214a.slice/crio-5589fd795ad853171b855d68e4c87ffe5c1a27874ecea548f56a625f99661c8b WatchSource:0}: Error finding container 5589fd795ad853171b855d68e4c87ffe5c1a27874ecea548f56a625f99661c8b: Status 404 returned error can't find the container with id 5589fd795ad853171b855d68e4c87ffe5c1a27874ecea548f56a625f99661c8b Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.609667 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.678155 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftzmv\" (UniqueName: \"kubernetes.io/projected/ac6811eb-fca9-4942-996c-16807de04775-kube-api-access-ftzmv\") pod \"ac6811eb-fca9-4942-996c-16807de04775\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.678237 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-config\") pod \"ac6811eb-fca9-4942-996c-16807de04775\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.678320 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-sb\") pod \"ac6811eb-fca9-4942-996c-16807de04775\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.678401 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-svc\") pod \"ac6811eb-fca9-4942-996c-16807de04775\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.678452 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-swift-storage-0\") pod \"ac6811eb-fca9-4942-996c-16807de04775\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.678493 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-nb\") pod \"ac6811eb-fca9-4942-996c-16807de04775\" (UID: \"ac6811eb-fca9-4942-996c-16807de04775\") " Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.701958 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac6811eb-fca9-4942-996c-16807de04775" (UID: "ac6811eb-fca9-4942-996c-16807de04775"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.707712 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac6811eb-fca9-4942-996c-16807de04775-kube-api-access-ftzmv" (OuterVolumeSpecName: "kube-api-access-ftzmv") pod "ac6811eb-fca9-4942-996c-16807de04775" (UID: "ac6811eb-fca9-4942-996c-16807de04775"). InnerVolumeSpecName "kube-api-access-ftzmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:41 crc kubenswrapper[5033]: E0226 16:02:41.709083 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9776d976_a39c_4d6a_ac1b_d2a03164c21e.slice/crio-conmon-feea146a8e1448f993add04f8c1fcee6a819a21e21a884498fe86641c0f717a8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9776d976_a39c_4d6a_ac1b_d2a03164c21e.slice/crio-feea146a8e1448f993add04f8c1fcee6a819a21e21a884498fe86641c0f717a8.scope\": RecentStats: unable to find data in memory cache]" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.712017 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac6811eb-fca9-4942-996c-16807de04775" (UID: "ac6811eb-fca9-4942-996c-16807de04775"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.721853 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-config" (OuterVolumeSpecName: "config") pod "ac6811eb-fca9-4942-996c-16807de04775" (UID: "ac6811eb-fca9-4942-996c-16807de04775"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.729721 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ac6811eb-fca9-4942-996c-16807de04775" (UID: "ac6811eb-fca9-4942-996c-16807de04775"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.734192 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac6811eb-fca9-4942-996c-16807de04775" (UID: "ac6811eb-fca9-4942-996c-16807de04775"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.781122 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.781162 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.781174 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftzmv\" (UniqueName: \"kubernetes.io/projected/ac6811eb-fca9-4942-996c-16807de04775-kube-api-access-ftzmv\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.781187 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.781200 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:41 crc kubenswrapper[5033]: I0226 16:02:41.781279 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ac6811eb-fca9-4942-996c-16807de04775-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.235692 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674a25b4-95f2-47c6-9c86-a002b11c214a","Type":"ContainerStarted","Data":"a6889d37766bd69e29ef8bfe47fbbe988a6f1fc717d97b0a66287c9595940d9e"} Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.236118 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674a25b4-95f2-47c6-9c86-a002b11c214a","Type":"ContainerStarted","Data":"5589fd795ad853171b855d68e4c87ffe5c1a27874ecea548f56a625f99661c8b"} Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.236985 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lkf9s" event={"ID":"cf479687-fa77-4c8b-95ee-23b30c858b35","Type":"ContainerStarted","Data":"1cb52df17c55d2c2af4e4f86e59be92ac91587f0a35694028b2d1e3bfbcfa576"} Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.239385 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e523d1f-9816-4bc2-9264-74fd43fba986","Type":"ContainerStarted","Data":"356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2"} Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.239432 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e523d1f-9816-4bc2-9264-74fd43fba986","Type":"ContainerStarted","Data":"4f609b3e3bc7724fdd44b866afb08fb424fdf4c68a18ce523b934efd26ef6f08"} Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.241462 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx" event={"ID":"ac6811eb-fca9-4942-996c-16807de04775","Type":"ContainerDied","Data":"29317aa10585cbb492a6c5a90357a4473f8c04678ba9f54a015ac5b8256844bf"} Feb 26 
16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.241506 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-4jwcx" Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.241515 5033 scope.go:117] "RemoveContainer" containerID="714009bed495882e8a8e9344e22baae4b2f2d38f6b30a604636ee7e4e2ab7bb3" Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.245023 5033 generic.go:334] "Generic (PLEG): container finished" podID="9776d976-a39c-4d6a-ac1b-d2a03164c21e" containerID="feea146a8e1448f993add04f8c1fcee6a819a21e21a884498fe86641c0f717a8" exitCode=0 Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.245519 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" event={"ID":"9776d976-a39c-4d6a-ac1b-d2a03164c21e","Type":"ContainerDied","Data":"feea146a8e1448f993add04f8c1fcee6a819a21e21a884498fe86641c0f717a8"} Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.249430 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9bknn" event={"ID":"6410a07e-efb2-4773-8007-688038a9898e","Type":"ContainerStarted","Data":"88846ecdfcf0ca846f3ca53a24d1fe2b4ad8f940ade0f867c6bb6c964a133abd"} Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.307589 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9bknn" podStartSLOduration=3.307546186 podStartE2EDuration="3.307546186s" podCreationTimestamp="2026-02-26 16:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:42.302754618 +0000 UTC m=+1289.515353268" watchObservedRunningTime="2026-02-26 16:02:42.307546186 +0000 UTC m=+1289.520144836" Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.423485 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-4jwcx"] Feb 26 16:02:42 crc 
kubenswrapper[5033]: I0226 16:02:42.434812 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-4jwcx"] Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.940862 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:02:42 crc kubenswrapper[5033]: I0226 16:02:42.971902 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:02:43 crc kubenswrapper[5033]: I0226 16:02:43.039795 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:02:43 crc kubenswrapper[5033]: I0226 16:02:43.276553 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674a25b4-95f2-47c6-9c86-a002b11c214a","Type":"ContainerStarted","Data":"c77e40eb31aa925a22033e521ff5b2294d613b4bc01111cfa9b5d44bf7aa1406"} Feb 26 16:02:43 crc kubenswrapper[5033]: I0226 16:02:43.283823 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e523d1f-9816-4bc2-9264-74fd43fba986","Type":"ContainerStarted","Data":"8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065"} Feb 26 16:02:43 crc kubenswrapper[5033]: I0226 16:02:43.283984 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7e523d1f-9816-4bc2-9264-74fd43fba986" containerName="glance-log" containerID="cri-o://356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2" gracePeriod=30 Feb 26 16:02:43 crc kubenswrapper[5033]: I0226 16:02:43.284213 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7e523d1f-9816-4bc2-9264-74fd43fba986" containerName="glance-httpd" containerID="cri-o://8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065" gracePeriod=30 Feb 26 16:02:43 crc 
kubenswrapper[5033]: I0226 16:02:43.312071 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" event={"ID":"9776d976-a39c-4d6a-ac1b-d2a03164c21e","Type":"ContainerStarted","Data":"ef7eeb6aa813ec1d7fbf7209df2db453639f39a057ab6e1aa34fd3c66fe79cb1"} Feb 26 16:02:43 crc kubenswrapper[5033]: I0226 16:02:43.312965 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:43 crc kubenswrapper[5033]: I0226 16:02:43.313541 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.313493546 podStartE2EDuration="4.313493546s" podCreationTimestamp="2026-02-26 16:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:43.300747446 +0000 UTC m=+1290.513346096" watchObservedRunningTime="2026-02-26 16:02:43.313493546 +0000 UTC m=+1290.526092196" Feb 26 16:02:43 crc kubenswrapper[5033]: I0226 16:02:43.359575 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.359552579 podStartE2EDuration="4.359552579s" podCreationTimestamp="2026-02-26 16:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:43.334895551 +0000 UTC m=+1290.547494201" watchObservedRunningTime="2026-02-26 16:02:43.359552579 +0000 UTC m=+1290.572151229" Feb 26 16:02:43 crc kubenswrapper[5033]: I0226 16:02:43.370071 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" podStartSLOduration=4.3700523669999995 podStartE2EDuration="4.370052367s" podCreationTimestamp="2026-02-26 16:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:43.367034039 +0000 UTC m=+1290.579632699" watchObservedRunningTime="2026-02-26 16:02:43.370052367 +0000 UTC m=+1290.582651017" Feb 26 16:02:43 crc kubenswrapper[5033]: I0226 16:02:43.655416 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac6811eb-fca9-4942-996c-16807de04775" path="/var/lib/kubelet/pods/ac6811eb-fca9-4942-996c-16807de04775/volumes" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.091868 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.145910 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-combined-ca-bundle\") pod \"7e523d1f-9816-4bc2-9264-74fd43fba986\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.146149 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-logs\") pod \"7e523d1f-9816-4bc2-9264-74fd43fba986\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.146179 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5r9q\" (UniqueName: \"kubernetes.io/projected/7e523d1f-9816-4bc2-9264-74fd43fba986-kube-api-access-m5r9q\") pod \"7e523d1f-9816-4bc2-9264-74fd43fba986\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.146199 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-public-tls-certs\") pod 
\"7e523d1f-9816-4bc2-9264-74fd43fba986\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.146232 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-config-data\") pod \"7e523d1f-9816-4bc2-9264-74fd43fba986\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.146248 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-scripts\") pod \"7e523d1f-9816-4bc2-9264-74fd43fba986\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.146309 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-httpd-run\") pod \"7e523d1f-9816-4bc2-9264-74fd43fba986\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.146344 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7e523d1f-9816-4bc2-9264-74fd43fba986\" (UID: \"7e523d1f-9816-4bc2-9264-74fd43fba986\") " Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.146752 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-logs" (OuterVolumeSpecName: "logs") pod "7e523d1f-9816-4bc2-9264-74fd43fba986" (UID: "7e523d1f-9816-4bc2-9264-74fd43fba986"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.146903 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7e523d1f-9816-4bc2-9264-74fd43fba986" (UID: "7e523d1f-9816-4bc2-9264-74fd43fba986"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.147096 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.155022 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "7e523d1f-9816-4bc2-9264-74fd43fba986" (UID: "7e523d1f-9816-4bc2-9264-74fd43fba986"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.155521 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e523d1f-9816-4bc2-9264-74fd43fba986-kube-api-access-m5r9q" (OuterVolumeSpecName: "kube-api-access-m5r9q") pod "7e523d1f-9816-4bc2-9264-74fd43fba986" (UID: "7e523d1f-9816-4bc2-9264-74fd43fba986"). InnerVolumeSpecName "kube-api-access-m5r9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.157227 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-scripts" (OuterVolumeSpecName: "scripts") pod "7e523d1f-9816-4bc2-9264-74fd43fba986" (UID: "7e523d1f-9816-4bc2-9264-74fd43fba986"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.178408 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e523d1f-9816-4bc2-9264-74fd43fba986" (UID: "7e523d1f-9816-4bc2-9264-74fd43fba986"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.194977 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-config-data" (OuterVolumeSpecName: "config-data") pod "7e523d1f-9816-4bc2-9264-74fd43fba986" (UID: "7e523d1f-9816-4bc2-9264-74fd43fba986"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.218734 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7e523d1f-9816-4bc2-9264-74fd43fba986" (UID: "7e523d1f-9816-4bc2-9264-74fd43fba986"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.248286 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5r9q\" (UniqueName: \"kubernetes.io/projected/7e523d1f-9816-4bc2-9264-74fd43fba986-kube-api-access-m5r9q\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.248322 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.248335 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.248346 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.248360 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e523d1f-9816-4bc2-9264-74fd43fba986-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.248390 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.248402 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e523d1f-9816-4bc2-9264-74fd43fba986-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.264502 5033 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.327564 5033 generic.go:334] "Generic (PLEG): container finished" podID="7e523d1f-9816-4bc2-9264-74fd43fba986" containerID="8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065" exitCode=143 Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.327596 5033 generic.go:334] "Generic (PLEG): container finished" podID="7e523d1f-9816-4bc2-9264-74fd43fba986" containerID="356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2" exitCode=143 Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.327770 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="674a25b4-95f2-47c6-9c86-a002b11c214a" containerName="glance-log" containerID="cri-o://a6889d37766bd69e29ef8bfe47fbbe988a6f1fc717d97b0a66287c9595940d9e" gracePeriod=30 Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.328078 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.328212 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e523d1f-9816-4bc2-9264-74fd43fba986","Type":"ContainerDied","Data":"8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065"} Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.328306 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e523d1f-9816-4bc2-9264-74fd43fba986","Type":"ContainerDied","Data":"356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2"} Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.328320 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e523d1f-9816-4bc2-9264-74fd43fba986","Type":"ContainerDied","Data":"4f609b3e3bc7724fdd44b866afb08fb424fdf4c68a18ce523b934efd26ef6f08"} Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.328341 5033 scope.go:117] "RemoveContainer" containerID="8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.328518 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="674a25b4-95f2-47c6-9c86-a002b11c214a" containerName="glance-httpd" containerID="cri-o://c77e40eb31aa925a22033e521ff5b2294d613b4bc01111cfa9b5d44bf7aa1406" gracePeriod=30 Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.350601 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.369820 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:02:44 crc kubenswrapper[5033]: 
I0226 16:02:44.380955 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.406126 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:02:44 crc kubenswrapper[5033]: E0226 16:02:44.406946 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e523d1f-9816-4bc2-9264-74fd43fba986" containerName="glance-httpd" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.407038 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e523d1f-9816-4bc2-9264-74fd43fba986" containerName="glance-httpd" Feb 26 16:02:44 crc kubenswrapper[5033]: E0226 16:02:44.407128 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e523d1f-9816-4bc2-9264-74fd43fba986" containerName="glance-log" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.407195 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e523d1f-9816-4bc2-9264-74fd43fba986" containerName="glance-log" Feb 26 16:02:44 crc kubenswrapper[5033]: E0226 16:02:44.407385 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6811eb-fca9-4942-996c-16807de04775" containerName="init" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.407491 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6811eb-fca9-4942-996c-16807de04775" containerName="init" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.407827 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e523d1f-9816-4bc2-9264-74fd43fba986" containerName="glance-httpd" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.407939 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6811eb-fca9-4942-996c-16807de04775" containerName="init" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.408020 5033 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7e523d1f-9816-4bc2-9264-74fd43fba986" containerName="glance-log" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.409342 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.411685 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.411863 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.417492 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.554806 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.554868 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.554932 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-scripts\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 
crc kubenswrapper[5033]: I0226 16:02:44.554958 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsjl9\" (UniqueName: \"kubernetes.io/projected/19e1a391-4e54-4cee-9c88-34c9e99d71fa-kube-api-access-bsjl9\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.555504 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-logs\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.555591 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.555633 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-config-data\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.555680 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" 
Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.657672 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.657735 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.657790 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-scripts\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.657818 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsjl9\" (UniqueName: \"kubernetes.io/projected/19e1a391-4e54-4cee-9c88-34c9e99d71fa-kube-api-access-bsjl9\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.657841 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-logs\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.657869 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.657902 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-config-data\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.658197 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.658425 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.660318 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-logs\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.660774 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.666353 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.668896 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.669334 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-config-data\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.674860 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-scripts\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.678198 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsjl9\" (UniqueName: 
\"kubernetes.io/projected/19e1a391-4e54-4cee-9c88-34c9e99d71fa-kube-api-access-bsjl9\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.692709 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " pod="openstack/glance-default-external-api-0" Feb 26 16:02:44 crc kubenswrapper[5033]: I0226 16:02:44.727302 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:02:45 crc kubenswrapper[5033]: I0226 16:02:45.344715 5033 generic.go:334] "Generic (PLEG): container finished" podID="674a25b4-95f2-47c6-9c86-a002b11c214a" containerID="c77e40eb31aa925a22033e521ff5b2294d613b4bc01111cfa9b5d44bf7aa1406" exitCode=0 Feb 26 16:02:45 crc kubenswrapper[5033]: I0226 16:02:45.344749 5033 generic.go:334] "Generic (PLEG): container finished" podID="674a25b4-95f2-47c6-9c86-a002b11c214a" containerID="a6889d37766bd69e29ef8bfe47fbbe988a6f1fc717d97b0a66287c9595940d9e" exitCode=143 Feb 26 16:02:45 crc kubenswrapper[5033]: I0226 16:02:45.344815 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674a25b4-95f2-47c6-9c86-a002b11c214a","Type":"ContainerDied","Data":"c77e40eb31aa925a22033e521ff5b2294d613b4bc01111cfa9b5d44bf7aa1406"} Feb 26 16:02:45 crc kubenswrapper[5033]: I0226 16:02:45.345069 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674a25b4-95f2-47c6-9c86-a002b11c214a","Type":"ContainerDied","Data":"a6889d37766bd69e29ef8bfe47fbbe988a6f1fc717d97b0a66287c9595940d9e"} Feb 26 16:02:45 crc kubenswrapper[5033]: I0226 16:02:45.348777 5033 generic.go:334] "Generic 
(PLEG): container finished" podID="7dfe9d70-c541-4d56-9d1d-0d1debb11025" containerID="821f636cc982e0875bd9d953abc8ae92ee1bf3e1a4ab78c45eca149715e8ab3b" exitCode=0 Feb 26 16:02:45 crc kubenswrapper[5033]: I0226 16:02:45.348838 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fv28z" event={"ID":"7dfe9d70-c541-4d56-9d1d-0d1debb11025","Type":"ContainerDied","Data":"821f636cc982e0875bd9d953abc8ae92ee1bf3e1a4ab78c45eca149715e8ab3b"} Feb 26 16:02:45 crc kubenswrapper[5033]: I0226 16:02:45.640485 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e523d1f-9816-4bc2-9264-74fd43fba986" path="/var/lib/kubelet/pods/7e523d1f-9816-4bc2-9264-74fd43fba986/volumes" Feb 26 16:02:49 crc kubenswrapper[5033]: I0226 16:02:49.915497 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:02:49 crc kubenswrapper[5033]: I0226 16:02:49.994940 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-mwdpf"] Feb 26 16:02:49 crc kubenswrapper[5033]: I0226 16:02:49.996361 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" podUID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" containerName="dnsmasq-dns" containerID="cri-o://1257f20356492fc3f99a4c76c10244dac99e5fd97cd925d186c71d4e21948a5d" gracePeriod=10 Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.364852 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" podUID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: connect: connection refused" Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.398385 5033 generic.go:334] "Generic (PLEG): container finished" podID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" containerID="1257f20356492fc3f99a4c76c10244dac99e5fd97cd925d186c71d4e21948a5d" 
exitCode=0 Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.398428 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" event={"ID":"b76c988f-7123-4c0c-84a2-44ae8ffb7898","Type":"ContainerDied","Data":"1257f20356492fc3f99a4c76c10244dac99e5fd97cd925d186c71d4e21948a5d"} Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.718020 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fv28z" Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.786907 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-scripts\") pod \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.787366 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7d4p\" (UniqueName: \"kubernetes.io/projected/7dfe9d70-c541-4d56-9d1d-0d1debb11025-kube-api-access-l7d4p\") pod \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.787398 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-config-data\") pod \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.787432 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-combined-ca-bundle\") pod \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") " Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.787459 5033 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-credential-keys\") pod \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") "
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.787501 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-fernet-keys\") pod \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\" (UID: \"7dfe9d70-c541-4d56-9d1d-0d1debb11025\") "
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.795136 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7dfe9d70-c541-4d56-9d1d-0d1debb11025" (UID: "7dfe9d70-c541-4d56-9d1d-0d1debb11025"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.797838 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7dfe9d70-c541-4d56-9d1d-0d1debb11025" (UID: "7dfe9d70-c541-4d56-9d1d-0d1debb11025"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.799948 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dfe9d70-c541-4d56-9d1d-0d1debb11025-kube-api-access-l7d4p" (OuterVolumeSpecName: "kube-api-access-l7d4p") pod "7dfe9d70-c541-4d56-9d1d-0d1debb11025" (UID: "7dfe9d70-c541-4d56-9d1d-0d1debb11025"). InnerVolumeSpecName "kube-api-access-l7d4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.804698 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-scripts" (OuterVolumeSpecName: "scripts") pod "7dfe9d70-c541-4d56-9d1d-0d1debb11025" (UID: "7dfe9d70-c541-4d56-9d1d-0d1debb11025"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.816070 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dfe9d70-c541-4d56-9d1d-0d1debb11025" (UID: "7dfe9d70-c541-4d56-9d1d-0d1debb11025"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.823906 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-config-data" (OuterVolumeSpecName: "config-data") pod "7dfe9d70-c541-4d56-9d1d-0d1debb11025" (UID: "7dfe9d70-c541-4d56-9d1d-0d1debb11025"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.888662 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.888697 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.888709 5033 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.888727 5033 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.888741 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dfe9d70-c541-4d56-9d1d-0d1debb11025-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:02:50 crc kubenswrapper[5033]: I0226 16:02:50.888752 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7d4p\" (UniqueName: \"kubernetes.io/projected/7dfe9d70-c541-4d56-9d1d-0d1debb11025-kube-api-access-l7d4p\") on node \"crc\" DevicePath \"\""
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.423161 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fv28z" event={"ID":"7dfe9d70-c541-4d56-9d1d-0d1debb11025","Type":"ContainerDied","Data":"e6d21bcdbe4667edf958d4b8f5f41f0fc57059cdd37be9f0cb236e0327a5b4d6"}
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.423204 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6d21bcdbe4667edf958d4b8f5f41f0fc57059cdd37be9f0cb236e0327a5b4d6"
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.423296 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fv28z"
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.830189 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fv28z"]
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.838993 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fv28z"]
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.937427 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mnxgk"]
Feb 26 16:02:51 crc kubenswrapper[5033]: E0226 16:02:51.937825 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dfe9d70-c541-4d56-9d1d-0d1debb11025" containerName="keystone-bootstrap"
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.937844 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfe9d70-c541-4d56-9d1d-0d1debb11025" containerName="keystone-bootstrap"
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.938077 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dfe9d70-c541-4d56-9d1d-0d1debb11025" containerName="keystone-bootstrap"
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.938698 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.941915 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.942083 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.942180 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.942836 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pg9lf"
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.943410 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 26 16:02:51 crc kubenswrapper[5033]: I0226 16:02:51.949015 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mnxgk"]
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.007918 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-fernet-keys\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.008011 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nmpx\" (UniqueName: \"kubernetes.io/projected/e8fd420c-3146-49a5-a9c5-69e3f01e8848-kube-api-access-4nmpx\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.008036 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-config-data\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.008063 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-credential-keys\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.008081 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-scripts\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.008266 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-combined-ca-bundle\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.110188 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-scripts\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.110274 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-combined-ca-bundle\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.110379 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-fernet-keys\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.110451 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nmpx\" (UniqueName: \"kubernetes.io/projected/e8fd420c-3146-49a5-a9c5-69e3f01e8848-kube-api-access-4nmpx\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.110480 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-config-data\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.110511 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-credential-keys\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.116283 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-credential-keys\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.116897 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-config-data\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.117073 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-combined-ca-bundle\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.117610 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-fernet-keys\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.118741 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-scripts\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.131032 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nmpx\" (UniqueName: \"kubernetes.io/projected/e8fd420c-3146-49a5-a9c5-69e3f01e8848-kube-api-access-4nmpx\") pod \"keystone-bootstrap-mnxgk\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:52 crc kubenswrapper[5033]: I0226 16:02:52.259776 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mnxgk"
Feb 26 16:02:53 crc kubenswrapper[5033]: I0226 16:02:53.639491 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dfe9d70-c541-4d56-9d1d-0d1debb11025" path="/var/lib/kubelet/pods/7dfe9d70-c541-4d56-9d1d-0d1debb11025/volumes"
Feb 26 16:02:55 crc kubenswrapper[5033]: I0226 16:02:55.365226 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" podUID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: connect: connection refused"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.148905 5033 scope.go:117] "RemoveContainer" containerID="356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.244560 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.359767 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"674a25b4-95f2-47c6-9c86-a002b11c214a\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") "
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.359848 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-internal-tls-certs\") pod \"674a25b4-95f2-47c6-9c86-a002b11c214a\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") "
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.359883 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-config-data\") pod \"674a25b4-95f2-47c6-9c86-a002b11c214a\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") "
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.359910 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-scripts\") pod \"674a25b4-95f2-47c6-9c86-a002b11c214a\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") "
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.359958 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-httpd-run\") pod \"674a25b4-95f2-47c6-9c86-a002b11c214a\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") "
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.360017 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-combined-ca-bundle\") pod \"674a25b4-95f2-47c6-9c86-a002b11c214a\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") "
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.360095 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf6mw\" (UniqueName: \"kubernetes.io/projected/674a25b4-95f2-47c6-9c86-a002b11c214a-kube-api-access-kf6mw\") pod \"674a25b4-95f2-47c6-9c86-a002b11c214a\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") "
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.360159 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-logs\") pod \"674a25b4-95f2-47c6-9c86-a002b11c214a\" (UID: \"674a25b4-95f2-47c6-9c86-a002b11c214a\") "
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.360592 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "674a25b4-95f2-47c6-9c86-a002b11c214a" (UID: "674a25b4-95f2-47c6-9c86-a002b11c214a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.361147 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.361318 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-logs" (OuterVolumeSpecName: "logs") pod "674a25b4-95f2-47c6-9c86-a002b11c214a" (UID: "674a25b4-95f2-47c6-9c86-a002b11c214a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.368935 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674a25b4-95f2-47c6-9c86-a002b11c214a-kube-api-access-kf6mw" (OuterVolumeSpecName: "kube-api-access-kf6mw") pod "674a25b4-95f2-47c6-9c86-a002b11c214a" (UID: "674a25b4-95f2-47c6-9c86-a002b11c214a"). InnerVolumeSpecName "kube-api-access-kf6mw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.369950 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "674a25b4-95f2-47c6-9c86-a002b11c214a" (UID: "674a25b4-95f2-47c6-9c86-a002b11c214a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.399006 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-scripts" (OuterVolumeSpecName: "scripts") pod "674a25b4-95f2-47c6-9c86-a002b11c214a" (UID: "674a25b4-95f2-47c6-9c86-a002b11c214a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.402621 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "674a25b4-95f2-47c6-9c86-a002b11c214a" (UID: "674a25b4-95f2-47c6-9c86-a002b11c214a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.425056 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "674a25b4-95f2-47c6-9c86-a002b11c214a" (UID: "674a25b4-95f2-47c6-9c86-a002b11c214a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.440032 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-config-data" (OuterVolumeSpecName: "config-data") pod "674a25b4-95f2-47c6-9c86-a002b11c214a" (UID: "674a25b4-95f2-47c6-9c86-a002b11c214a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.462868 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.462916 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.462932 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.462943 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.462954 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674a25b4-95f2-47c6-9c86-a002b11c214a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.462965 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf6mw\" (UniqueName: \"kubernetes.io/projected/674a25b4-95f2-47c6-9c86-a002b11c214a-kube-api-access-kf6mw\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.462978 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674a25b4-95f2-47c6-9c86-a002b11c214a-logs\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.484822 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.524246 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.526248 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"674a25b4-95f2-47c6-9c86-a002b11c214a","Type":"ContainerDied","Data":"5589fd795ad853171b855d68e4c87ffe5c1a27874ecea548f56a625f99661c8b"}
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.561399 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.564939 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.576469 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.584133 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 16:03:00 crc kubenswrapper[5033]: E0226 16:03:00.584559 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674a25b4-95f2-47c6-9c86-a002b11c214a" containerName="glance-httpd"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.584581 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="674a25b4-95f2-47c6-9c86-a002b11c214a" containerName="glance-httpd"
Feb 26 16:03:00 crc kubenswrapper[5033]: E0226 16:03:00.584602 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674a25b4-95f2-47c6-9c86-a002b11c214a" containerName="glance-log"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.584611 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="674a25b4-95f2-47c6-9c86-a002b11c214a" containerName="glance-log"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.584799 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="674a25b4-95f2-47c6-9c86-a002b11c214a" containerName="glance-log"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.584824 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="674a25b4-95f2-47c6-9c86-a002b11c214a" containerName="glance-httpd"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.595598 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.595705 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.603742 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.604239 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.767627 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-logs\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.767679 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.767707 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.767741 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.767764 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.767834 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.768078 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.768201 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2ph8\" (UniqueName: \"kubernetes.io/projected/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-kube-api-access-r2ph8\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.870335 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-logs\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.870401 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.870444 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.870490 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.870530 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.870662 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.870741 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.870804 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2ph8\" (UniqueName: \"kubernetes.io/projected/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-kube-api-access-r2ph8\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.870935 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-logs\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.871158 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.871244 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.875141 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.878538 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.879862 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.881885 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.887413 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2ph8\" (UniqueName: \"kubernetes.io/projected/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-kube-api-access-r2ph8\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.899857 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:00 crc kubenswrapper[5033]: I0226 16:03:00.916904 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.598081 5033 scope.go:117] "RemoveContainer" containerID="8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065"
Feb 26 16:03:01 crc kubenswrapper[5033]: E0226 16:03:01.602778 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065\": container with ID starting with 8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065 not found: ID does not exist" containerID="8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065"
Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.602832 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065"} err="failed to get container status \"8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065\": rpc error: code = NotFound desc = could not find container \"8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065\": container with ID starting with 8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065 not found: ID does not exist"
Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.602865 5033 scope.go:117] "RemoveContainer" containerID="356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2"
Feb 26 16:03:01 crc kubenswrapper[5033]: E0226 16:03:01.603229 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2\": container with ID starting with 356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2 not found: ID does not exist" containerID="356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2"
Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.603286 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2"} err="failed to get container status \"356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2\": rpc error: code = NotFound desc = could not find container \"356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2\": container with ID starting with 356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2 not found: ID does not exist"
Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.603315 5033 scope.go:117] "RemoveContainer" containerID="8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065"
Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.603713 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065"} err="failed to get
container status \"8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065\": rpc error: code = NotFound desc = could not find container \"8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065\": container with ID starting with 8e7f1b55525467231d6e4bffc2fbc203b3fad36f0d226f282a47ab849d72d065 not found: ID does not exist" Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.603791 5033 scope.go:117] "RemoveContainer" containerID="356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2" Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.605017 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2"} err="failed to get container status \"356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2\": rpc error: code = NotFound desc = could not find container \"356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2\": container with ID starting with 356d095a424c99d35db4c1f6ce696d6e612d6f79d4ac76dfe7ee46629fce97c2 not found: ID does not exist" Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.605041 5033 scope.go:117] "RemoveContainer" containerID="c77e40eb31aa925a22033e521ff5b2294d613b4bc01111cfa9b5d44bf7aa1406" Feb 26 16:03:01 crc kubenswrapper[5033]: E0226 16:03:01.609450 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 26 16:03:01 crc kubenswrapper[5033]: E0226 16:03:01.609587 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9b86b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-lkf9s_openstack(cf479687-fa77-4c8b-95ee-23b30c858b35): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:03:01 crc kubenswrapper[5033]: E0226 16:03:01.610926 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-lkf9s" podUID="cf479687-fa77-4c8b-95ee-23b30c858b35" Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.645728 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="674a25b4-95f2-47c6-9c86-a002b11c214a" path="/var/lib/kubelet/pods/674a25b4-95f2-47c6-9c86-a002b11c214a/volumes" Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.745679 5033 scope.go:117] "RemoveContainer" containerID="a6889d37766bd69e29ef8bfe47fbbe988a6f1fc717d97b0a66287c9595940d9e" Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.779850 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.890240 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-sb\") pod \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.890429 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdscf\" (UniqueName: \"kubernetes.io/projected/b76c988f-7123-4c0c-84a2-44ae8ffb7898-kube-api-access-mdscf\") pod \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.890481 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-svc\") pod \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.890536 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-config\") pod \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.890576 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-swift-storage-0\") pod \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.890641 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-nb\") pod \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\" (UID: \"b76c988f-7123-4c0c-84a2-44ae8ffb7898\") " Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.900744 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76c988f-7123-4c0c-84a2-44ae8ffb7898-kube-api-access-mdscf" (OuterVolumeSpecName: "kube-api-access-mdscf") pod "b76c988f-7123-4c0c-84a2-44ae8ffb7898" (UID: "b76c988f-7123-4c0c-84a2-44ae8ffb7898"). InnerVolumeSpecName "kube-api-access-mdscf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:01 crc kubenswrapper[5033]: I0226 16:03:01.993465 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdscf\" (UniqueName: \"kubernetes.io/projected/b76c988f-7123-4c0c-84a2-44ae8ffb7898-kube-api-access-mdscf\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.024823 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-config" (OuterVolumeSpecName: "config") pod "b76c988f-7123-4c0c-84a2-44ae8ffb7898" (UID: "b76c988f-7123-4c0c-84a2-44ae8ffb7898"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.026139 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b76c988f-7123-4c0c-84a2-44ae8ffb7898" (UID: "b76c988f-7123-4c0c-84a2-44ae8ffb7898"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.034625 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b76c988f-7123-4c0c-84a2-44ae8ffb7898" (UID: "b76c988f-7123-4c0c-84a2-44ae8ffb7898"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.034835 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b76c988f-7123-4c0c-84a2-44ae8ffb7898" (UID: "b76c988f-7123-4c0c-84a2-44ae8ffb7898"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.035989 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b76c988f-7123-4c0c-84a2-44ae8ffb7898" (UID: "b76c988f-7123-4c0c-84a2-44ae8ffb7898"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.095312 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.095355 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.095368 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.095382 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.095396 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b76c988f-7123-4c0c-84a2-44ae8ffb7898-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.183986 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mnxgk"] Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.257539 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:03:02 crc kubenswrapper[5033]: W0226 16:03:02.261165 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd382a65_6478_4ecd_ab76_bc1f1fcf2fc1.slice/crio-07f3716167aecd2312843e52a1f3985dfc3430b96be6c6e5df230daf57d12b46 WatchSource:0}: Error finding container 07f3716167aecd2312843e52a1f3985dfc3430b96be6c6e5df230daf57d12b46: Status 404 returned error can't find the container with id 07f3716167aecd2312843e52a1f3985dfc3430b96be6c6e5df230daf57d12b46 Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.550871 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"689edd77-629b-4ce9-9392-f52fb34a7a1e","Type":"ContainerStarted","Data":"34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35"} Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.554093 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kptqh" event={"ID":"bc977b82-241c-4c92-9755-877433f3e4de","Type":"ContainerStarted","Data":"444638d70c14e26c0e7807701010c49966be1b71768de02d5f5fcc407e2851b7"} Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.557768 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hb64j" event={"ID":"ded3d8f6-4b47-4d1d-af78-b295f7b729a7","Type":"ContainerStarted","Data":"8cffddd5fbb4d925155c313ad8810b6f0ee25c013102a3cec9d3a4efd08a23af"} Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.560762 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" event={"ID":"b76c988f-7123-4c0c-84a2-44ae8ffb7898","Type":"ContainerDied","Data":"bf34709205899c4b71c7f60e1812d42a081fcf7607295820a8425d6399b3492c"} Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.560827 5033 scope.go:117] "RemoveContainer" containerID="1257f20356492fc3f99a4c76c10244dac99e5fd97cd925d186c71d4e21948a5d" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.561549 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.574529 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1","Type":"ContainerStarted","Data":"07f3716167aecd2312843e52a1f3985dfc3430b96be6c6e5df230daf57d12b46"} Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.578913 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-kptqh" podStartSLOduration=3.067392334 podStartE2EDuration="23.578897783s" podCreationTimestamp="2026-02-26 16:02:39 +0000 UTC" firstStartedPulling="2026-02-26 16:02:41.077424187 +0000 UTC m=+1288.290022837" lastFinishedPulling="2026-02-26 16:03:01.588929636 +0000 UTC m=+1308.801528286" observedRunningTime="2026-02-26 16:03:02.57347445 +0000 UTC m=+1309.786073120" watchObservedRunningTime="2026-02-26 16:03:02.578897783 +0000 UTC m=+1309.791496433" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.586999 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mnxgk" event={"ID":"e8fd420c-3146-49a5-a9c5-69e3f01e8848","Type":"ContainerStarted","Data":"d97ab34cf5faf76006f9859b79ccdbff315105cdef47777cbd0d6eac5ee0600b"} Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.587045 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mnxgk" event={"ID":"e8fd420c-3146-49a5-a9c5-69e3f01e8848","Type":"ContainerStarted","Data":"affb0ef33c84ee02c59a5f07aa6a514d460ba3b73113769e8184d77ff5cb0366"} Feb 26 16:03:02 crc kubenswrapper[5033]: E0226 16:03:02.595493 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-lkf9s" 
podUID="cf479687-fa77-4c8b-95ee-23b30c858b35" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.606084 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-hb64j" podStartSLOduration=3.103967923 podStartE2EDuration="23.60606213s" podCreationTimestamp="2026-02-26 16:02:39 +0000 UTC" firstStartedPulling="2026-02-26 16:02:41.078716445 +0000 UTC m=+1288.291315085" lastFinishedPulling="2026-02-26 16:03:01.580810632 +0000 UTC m=+1308.793409292" observedRunningTime="2026-02-26 16:03:02.595710674 +0000 UTC m=+1309.808309324" watchObservedRunningTime="2026-02-26 16:03:02.60606213 +0000 UTC m=+1309.818660780" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.610890 5033 scope.go:117] "RemoveContainer" containerID="56eb026c665ccccb999987a07c3068aa61f0d3ee6e4cefc01258c36598b9f051" Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.632978 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-mwdpf"] Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.640860 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-mwdpf"] Feb 26 16:03:02 crc kubenswrapper[5033]: I0226 16:03:02.648028 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mnxgk" podStartSLOduration=11.64800223 podStartE2EDuration="11.64800223s" podCreationTimestamp="2026-02-26 16:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:02.633664314 +0000 UTC m=+1309.846263004" watchObservedRunningTime="2026-02-26 16:03:02.64800223 +0000 UTC m=+1309.860600880" Feb 26 16:03:03 crc kubenswrapper[5033]: I0226 16:03:03.169433 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:03:03 crc kubenswrapper[5033]: I0226 16:03:03.596677 5033 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"19e1a391-4e54-4cee-9c88-34c9e99d71fa","Type":"ContainerStarted","Data":"2d61b32ec1248749b78b8ad80dcf4818c724a02d34739f7f44fcfb315ea4733c"} Feb 26 16:03:03 crc kubenswrapper[5033]: I0226 16:03:03.604377 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1","Type":"ContainerStarted","Data":"1e75b7b42268d9506f3f067be14ed00342281227703735eed095f02cda5dca8e"} Feb 26 16:03:03 crc kubenswrapper[5033]: I0226 16:03:03.604440 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1","Type":"ContainerStarted","Data":"22ddbe7584d58799a625aebf692f002a308be7926f76b849641146aacf642ea8"} Feb 26 16:03:03 crc kubenswrapper[5033]: I0226 16:03:03.636054 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.636028672 podStartE2EDuration="3.636028672s" podCreationTimestamp="2026-02-26 16:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:03.635981351 +0000 UTC m=+1310.848580011" watchObservedRunningTime="2026-02-26 16:03:03.636028672 +0000 UTC m=+1310.848627322" Feb 26 16:03:03 crc kubenswrapper[5033]: I0226 16:03:03.651013 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" path="/var/lib/kubelet/pods/b76c988f-7123-4c0c-84a2-44ae8ffb7898/volumes" Feb 26 16:03:04 crc kubenswrapper[5033]: I0226 16:03:04.627465 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"19e1a391-4e54-4cee-9c88-34c9e99d71fa","Type":"ContainerStarted","Data":"5b44a06a2b3944d4f7000b47c862b2ab7a0f04f864ceac7276e4a15e1ad9a689"} Feb 26 
16:03:04 crc kubenswrapper[5033]: I0226 16:03:04.631350 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"689edd77-629b-4ce9-9392-f52fb34a7a1e","Type":"ContainerStarted","Data":"c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc"} Feb 26 16:03:05 crc kubenswrapper[5033]: I0226 16:03:05.366092 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-mwdpf" podUID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: i/o timeout" Feb 26 16:03:05 crc kubenswrapper[5033]: I0226 16:03:05.640913 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"19e1a391-4e54-4cee-9c88-34c9e99d71fa","Type":"ContainerStarted","Data":"751099a6fd3491807a70d3af4718e1c4d1a7f27761350ddb2b8727da704f7e06"} Feb 26 16:03:10 crc kubenswrapper[5033]: I0226 16:03:10.917412 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 16:03:10 crc kubenswrapper[5033]: I0226 16:03:10.917685 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 16:03:10 crc kubenswrapper[5033]: I0226 16:03:10.951385 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 16:03:10 crc kubenswrapper[5033]: I0226 16:03:10.976512 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=26.976449789 podStartE2EDuration="26.976449789s" podCreationTimestamp="2026-02-26 16:02:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:05.658981972 +0000 UTC m=+1312.871580642" watchObservedRunningTime="2026-02-26 
16:03:10.976449789 +0000 UTC m=+1318.189048449" Feb 26 16:03:10 crc kubenswrapper[5033]: I0226 16:03:10.983798 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 16:03:11 crc kubenswrapper[5033]: I0226 16:03:11.705285 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 16:03:11 crc kubenswrapper[5033]: I0226 16:03:11.705649 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 16:03:13 crc kubenswrapper[5033]: I0226 16:03:13.776089 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 16:03:13 crc kubenswrapper[5033]: I0226 16:03:13.776476 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 16:03:14 crc kubenswrapper[5033]: I0226 16:03:14.728774 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 16:03:14 crc kubenswrapper[5033]: I0226 16:03:14.729569 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 16:03:14 crc kubenswrapper[5033]: I0226 16:03:14.729718 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 16:03:14 crc kubenswrapper[5033]: I0226 16:03:14.729823 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 16:03:14 crc kubenswrapper[5033]: I0226 16:03:14.800083 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 16:03:14 crc kubenswrapper[5033]: I0226 16:03:14.844762 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Feb 26 16:03:16 crc kubenswrapper[5033]: E0226 16:03:16.048883 5033 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Feb 26 16:03:16 crc kubenswrapper[5033]: E0226 16:03:16.049287 5033 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bcm4l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(689edd77-629b-4ce9-9392-f52fb34a7a1e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:03:16 crc kubenswrapper[5033]: I0226 16:03:16.761900 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="e8fd420c-3146-49a5-a9c5-69e3f01e8848" containerID="d97ab34cf5faf76006f9859b79ccdbff315105cdef47777cbd0d6eac5ee0600b" exitCode=0 Feb 26 16:03:16 crc kubenswrapper[5033]: I0226 16:03:16.763744 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mnxgk" event={"ID":"e8fd420c-3146-49a5-a9c5-69e3f01e8848","Type":"ContainerDied","Data":"d97ab34cf5faf76006f9859b79ccdbff315105cdef47777cbd0d6eac5ee0600b"} Feb 26 16:03:17 crc kubenswrapper[5033]: I0226 16:03:17.779562 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 16:03:17 crc kubenswrapper[5033]: I0226 16:03:17.779647 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 16:03:17 crc kubenswrapper[5033]: I0226 16:03:17.790632 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.298720 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mnxgk" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.439537 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-combined-ca-bundle\") pod \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.439613 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-credential-keys\") pod \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.439747 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nmpx\" (UniqueName: \"kubernetes.io/projected/e8fd420c-3146-49a5-a9c5-69e3f01e8848-kube-api-access-4nmpx\") pod \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.439780 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-fernet-keys\") pod \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.440318 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-config-data\") pod \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.440358 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-scripts\") pod \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\" (UID: \"e8fd420c-3146-49a5-a9c5-69e3f01e8848\") " Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.444249 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e8fd420c-3146-49a5-a9c5-69e3f01e8848" (UID: "e8fd420c-3146-49a5-a9c5-69e3f01e8848"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.444779 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8fd420c-3146-49a5-a9c5-69e3f01e8848-kube-api-access-4nmpx" (OuterVolumeSpecName: "kube-api-access-4nmpx") pod "e8fd420c-3146-49a5-a9c5-69e3f01e8848" (UID: "e8fd420c-3146-49a5-a9c5-69e3f01e8848"). InnerVolumeSpecName "kube-api-access-4nmpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.444922 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-scripts" (OuterVolumeSpecName: "scripts") pod "e8fd420c-3146-49a5-a9c5-69e3f01e8848" (UID: "e8fd420c-3146-49a5-a9c5-69e3f01e8848"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.445173 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nmpx\" (UniqueName: \"kubernetes.io/projected/e8fd420c-3146-49a5-a9c5-69e3f01e8848-kube-api-access-4nmpx\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.445190 5033 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.445199 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.457706 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e8fd420c-3146-49a5-a9c5-69e3f01e8848" (UID: "e8fd420c-3146-49a5-a9c5-69e3f01e8848"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.485005 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-config-data" (OuterVolumeSpecName: "config-data") pod "e8fd420c-3146-49a5-a9c5-69e3f01e8848" (UID: "e8fd420c-3146-49a5-a9c5-69e3f01e8848"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.520286 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8fd420c-3146-49a5-a9c5-69e3f01e8848" (UID: "e8fd420c-3146-49a5-a9c5-69e3f01e8848"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.546717 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.546749 5033 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.546758 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8fd420c-3146-49a5-a9c5-69e3f01e8848-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.786861 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mnxgk" event={"ID":"e8fd420c-3146-49a5-a9c5-69e3f01e8848","Type":"ContainerDied","Data":"affb0ef33c84ee02c59a5f07aa6a514d460ba3b73113769e8184d77ff5cb0366"} Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.786905 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="affb0ef33c84ee02c59a5f07aa6a514d460ba3b73113769e8184d77ff5cb0366" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.786903 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mnxgk" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.898052 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7d658b969f-npt2l"] Feb 26 16:03:18 crc kubenswrapper[5033]: E0226 16:03:18.898474 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" containerName="dnsmasq-dns" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.898491 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" containerName="dnsmasq-dns" Feb 26 16:03:18 crc kubenswrapper[5033]: E0226 16:03:18.898506 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" containerName="init" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.898512 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" containerName="init" Feb 26 16:03:18 crc kubenswrapper[5033]: E0226 16:03:18.898520 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fd420c-3146-49a5-a9c5-69e3f01e8848" containerName="keystone-bootstrap" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.898526 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fd420c-3146-49a5-a9c5-69e3f01e8848" containerName="keystone-bootstrap" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.898697 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76c988f-7123-4c0c-84a2-44ae8ffb7898" containerName="dnsmasq-dns" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.898714 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8fd420c-3146-49a5-a9c5-69e3f01e8848" containerName="keystone-bootstrap" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.899309 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.904640 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.904677 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.904636 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.904949 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.905381 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pg9lf" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.905489 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 26 16:03:18 crc kubenswrapper[5033]: I0226 16:03:18.906549 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d658b969f-npt2l"] Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.056526 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-public-tls-certs\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.056958 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-internal-tls-certs\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " 
pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.057030 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-credential-keys\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.057060 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-combined-ca-bundle\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.057123 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-scripts\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.057149 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-fernet-keys\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.057195 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-config-data\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " 
pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.057230 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh72x\" (UniqueName: \"kubernetes.io/projected/8e80c8c9-0f28-488c-9b68-ff4d379f9059-kube-api-access-xh72x\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.158966 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-config-data\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.159021 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh72x\" (UniqueName: \"kubernetes.io/projected/8e80c8c9-0f28-488c-9b68-ff4d379f9059-kube-api-access-xh72x\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.159056 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-public-tls-certs\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.159094 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-internal-tls-certs\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " 
pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.159138 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-credential-keys\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.159155 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-combined-ca-bundle\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.159195 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-scripts\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.159211 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-fernet-keys\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.162894 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-fernet-keys\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.163915 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-credential-keys\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.165240 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-config-data\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.165852 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-scripts\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.167197 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-internal-tls-certs\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.167948 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-combined-ca-bundle\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.171338 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-public-tls-certs\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.181895 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh72x\" (UniqueName: \"kubernetes.io/projected/8e80c8c9-0f28-488c-9b68-ff4d379f9059-kube-api-access-xh72x\") pod \"keystone-7d658b969f-npt2l\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.220977 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.716408 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d658b969f-npt2l"] Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.799712 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d658b969f-npt2l" event={"ID":"8e80c8c9-0f28-488c-9b68-ff4d379f9059","Type":"ContainerStarted","Data":"a94ee81af9cd8002a44ed4bec2c8a1ef53d9b54b037314b582dc30ec28af9619"} Feb 26 16:03:19 crc kubenswrapper[5033]: I0226 16:03:19.801196 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lkf9s" event={"ID":"cf479687-fa77-4c8b-95ee-23b30c858b35","Type":"ContainerStarted","Data":"4fc856374b9e4577d7bc3935a637339263321a668147cce6238a1cda79ff279b"} Feb 26 16:03:20 crc kubenswrapper[5033]: I0226 16:03:20.809145 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d658b969f-npt2l" event={"ID":"8e80c8c9-0f28-488c-9b68-ff4d379f9059","Type":"ContainerStarted","Data":"e93416b1954dbbe264e3883541d6c591523a47b3cbef0c1e3095a85863fc60b8"} Feb 26 16:03:20 crc kubenswrapper[5033]: I0226 16:03:20.809498 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:20 crc kubenswrapper[5033]: I0226 16:03:20.823886 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7d658b969f-npt2l" podStartSLOduration=2.823867544 podStartE2EDuration="2.823867544s" podCreationTimestamp="2026-02-26 16:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:20.823170509 +0000 UTC m=+1328.035769169" watchObservedRunningTime="2026-02-26 16:03:20.823867544 +0000 UTC m=+1328.036466194" Feb 26 16:03:20 crc kubenswrapper[5033]: I0226 16:03:20.853791 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-lkf9s" podStartSLOduration=4.869831774 podStartE2EDuration="41.853775191s" podCreationTimestamp="2026-02-26 16:02:39 +0000 UTC" firstStartedPulling="2026-02-26 16:02:41.203106035 +0000 UTC m=+1288.415704685" lastFinishedPulling="2026-02-26 16:03:18.187049442 +0000 UTC m=+1325.399648102" observedRunningTime="2026-02-26 16:03:20.845645758 +0000 UTC m=+1328.058244418" watchObservedRunningTime="2026-02-26 16:03:20.853775191 +0000 UTC m=+1328.066373841" Feb 26 16:03:26 crc kubenswrapper[5033]: E0226 16:03:26.957085 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" Feb 26 16:03:27 crc kubenswrapper[5033]: I0226 16:03:27.867369 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"689edd77-629b-4ce9-9392-f52fb34a7a1e","Type":"ContainerStarted","Data":"b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e"} Feb 26 16:03:27 crc kubenswrapper[5033]: I0226 16:03:27.867583 5033 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="ceilometer-central-agent" containerID="cri-o://34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35" gracePeriod=30 Feb 26 16:03:27 crc kubenswrapper[5033]: I0226 16:03:27.867650 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="proxy-httpd" containerID="cri-o://b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e" gracePeriod=30 Feb 26 16:03:27 crc kubenswrapper[5033]: I0226 16:03:27.867650 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="ceilometer-notification-agent" containerID="cri-o://c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc" gracePeriod=30 Feb 26 16:03:27 crc kubenswrapper[5033]: I0226 16:03:27.867846 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 16:03:28 crc kubenswrapper[5033]: I0226 16:03:28.876853 5033 generic.go:334] "Generic (PLEG): container finished" podID="ded3d8f6-4b47-4d1d-af78-b295f7b729a7" containerID="8cffddd5fbb4d925155c313ad8810b6f0ee25c013102a3cec9d3a4efd08a23af" exitCode=0 Feb 26 16:03:28 crc kubenswrapper[5033]: I0226 16:03:28.876937 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hb64j" event={"ID":"ded3d8f6-4b47-4d1d-af78-b295f7b729a7","Type":"ContainerDied","Data":"8cffddd5fbb4d925155c313ad8810b6f0ee25c013102a3cec9d3a4efd08a23af"} Feb 26 16:03:28 crc kubenswrapper[5033]: I0226 16:03:28.880061 5033 generic.go:334] "Generic (PLEG): container finished" podID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerID="b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e" exitCode=0 Feb 26 16:03:28 crc kubenswrapper[5033]: I0226 16:03:28.880134 5033 generic.go:334] "Generic 
(PLEG): container finished" podID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerID="34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35" exitCode=0 Feb 26 16:03:28 crc kubenswrapper[5033]: I0226 16:03:28.880140 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"689edd77-629b-4ce9-9392-f52fb34a7a1e","Type":"ContainerDied","Data":"b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e"} Feb 26 16:03:28 crc kubenswrapper[5033]: I0226 16:03:28.880202 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"689edd77-629b-4ce9-9392-f52fb34a7a1e","Type":"ContainerDied","Data":"34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35"} Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.279164 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hb64j" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.357480 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-logs\") pod \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.357534 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-scripts\") pod \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.357597 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-combined-ca-bundle\") pod \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " Feb 26 16:03:30 crc 
kubenswrapper[5033]: I0226 16:03:30.357768 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-config-data\") pod \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.357820 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9nxq\" (UniqueName: \"kubernetes.io/projected/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-kube-api-access-l9nxq\") pod \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\" (UID: \"ded3d8f6-4b47-4d1d-af78-b295f7b729a7\") " Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.358536 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-logs" (OuterVolumeSpecName: "logs") pod "ded3d8f6-4b47-4d1d-af78-b295f7b729a7" (UID: "ded3d8f6-4b47-4d1d-af78-b295f7b729a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.362978 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-kube-api-access-l9nxq" (OuterVolumeSpecName: "kube-api-access-l9nxq") pod "ded3d8f6-4b47-4d1d-af78-b295f7b729a7" (UID: "ded3d8f6-4b47-4d1d-af78-b295f7b729a7"). InnerVolumeSpecName "kube-api-access-l9nxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.368434 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-scripts" (OuterVolumeSpecName: "scripts") pod "ded3d8f6-4b47-4d1d-af78-b295f7b729a7" (UID: "ded3d8f6-4b47-4d1d-af78-b295f7b729a7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.383459 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ded3d8f6-4b47-4d1d-af78-b295f7b729a7" (UID: "ded3d8f6-4b47-4d1d-af78-b295f7b729a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.384525 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-config-data" (OuterVolumeSpecName: "config-data") pod "ded3d8f6-4b47-4d1d-af78-b295f7b729a7" (UID: "ded3d8f6-4b47-4d1d-af78-b295f7b729a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.459759 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.459804 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.459833 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9nxq\" (UniqueName: \"kubernetes.io/projected/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-kube-api-access-l9nxq\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.459853 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-logs\") on node \"crc\" DevicePath \"\"" Feb 26 
16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.459868 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded3d8f6-4b47-4d1d-af78-b295f7b729a7-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.902053 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hb64j" event={"ID":"ded3d8f6-4b47-4d1d-af78-b295f7b729a7","Type":"ContainerDied","Data":"01943182c1e1c0840f50b25a28564679c030c4151cc1177cfdf00f3f1a76bc81"} Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.902097 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01943182c1e1c0840f50b25a28564679c030c4151cc1177cfdf00f3f1a76bc81" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.902120 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hb64j" Feb 26 16:03:30 crc kubenswrapper[5033]: I0226 16:03:30.999639 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d648bbbdd-g6xz5"] Feb 26 16:03:31 crc kubenswrapper[5033]: E0226 16:03:31.000291 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded3d8f6-4b47-4d1d-af78-b295f7b729a7" containerName="placement-db-sync" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.000313 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded3d8f6-4b47-4d1d-af78-b295f7b729a7" containerName="placement-db-sync" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.000503 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded3d8f6-4b47-4d1d-af78-b295f7b729a7" containerName="placement-db-sync" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.001390 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.003043 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.003192 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.003501 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.005852 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.006933 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dv7jp" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.020292 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d648bbbdd-g6xz5"] Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.070710 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-logs\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.070977 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-internal-tls-certs\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.071057 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-config-data\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.071182 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-public-tls-certs\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.071221 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-scripts\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.071247 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-combined-ca-bundle\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.071372 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bchzp\" (UniqueName: \"kubernetes.io/projected/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-kube-api-access-bchzp\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.173578 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-scripts\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.173626 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-public-tls-certs\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.173660 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-combined-ca-bundle\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.173698 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bchzp\" (UniqueName: \"kubernetes.io/projected/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-kube-api-access-bchzp\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.173786 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-logs\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.173867 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-internal-tls-certs\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.173891 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-config-data\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.174446 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-logs\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.178324 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-config-data\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.178550 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-scripts\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.178827 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-combined-ca-bundle\") pod \"placement-6d648bbbdd-g6xz5\" 
(UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.180220 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-internal-tls-certs\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.193478 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-public-tls-certs\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.194153 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bchzp\" (UniqueName: \"kubernetes.io/projected/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-kube-api-access-bchzp\") pod \"placement-6d648bbbdd-g6xz5\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") " pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.318726 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.617634 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d648bbbdd-g6xz5"] Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.618151 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:03:31 crc kubenswrapper[5033]: W0226 16:03:31.619574 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ef30f16_84a1_4809_8ec4_0c36e315c5f3.slice/crio-22427c3ebd79a2624636bc4e51acacd7a681350f748159da033bf50d2e3b1787 WatchSource:0}: Error finding container 22427c3ebd79a2624636bc4e51acacd7a681350f748159da033bf50d2e3b1787: Status 404 returned error can't find the container with id 22427c3ebd79a2624636bc4e51acacd7a681350f748159da033bf50d2e3b1787 Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.687304 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-log-httpd\") pod \"689edd77-629b-4ce9-9392-f52fb34a7a1e\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.687390 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-combined-ca-bundle\") pod \"689edd77-629b-4ce9-9392-f52fb34a7a1e\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.687443 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcm4l\" (UniqueName: \"kubernetes.io/projected/689edd77-629b-4ce9-9392-f52fb34a7a1e-kube-api-access-bcm4l\") pod \"689edd77-629b-4ce9-9392-f52fb34a7a1e\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.687560 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-run-httpd\") pod \"689edd77-629b-4ce9-9392-f52fb34a7a1e\" (UID: 
\"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.687604 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-scripts\") pod \"689edd77-629b-4ce9-9392-f52fb34a7a1e\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.687627 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-config-data\") pod \"689edd77-629b-4ce9-9392-f52fb34a7a1e\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.687644 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-sg-core-conf-yaml\") pod \"689edd77-629b-4ce9-9392-f52fb34a7a1e\" (UID: \"689edd77-629b-4ce9-9392-f52fb34a7a1e\") " Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.687919 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "689edd77-629b-4ce9-9392-f52fb34a7a1e" (UID: "689edd77-629b-4ce9-9392-f52fb34a7a1e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.688173 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.688187 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "689edd77-629b-4ce9-9392-f52fb34a7a1e" (UID: "689edd77-629b-4ce9-9392-f52fb34a7a1e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.691930 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689edd77-629b-4ce9-9392-f52fb34a7a1e-kube-api-access-bcm4l" (OuterVolumeSpecName: "kube-api-access-bcm4l") pod "689edd77-629b-4ce9-9392-f52fb34a7a1e" (UID: "689edd77-629b-4ce9-9392-f52fb34a7a1e"). InnerVolumeSpecName "kube-api-access-bcm4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.692251 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "689edd77-629b-4ce9-9392-f52fb34a7a1e" (UID: "689edd77-629b-4ce9-9392-f52fb34a7a1e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.692935 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-scripts" (OuterVolumeSpecName: "scripts") pod "689edd77-629b-4ce9-9392-f52fb34a7a1e" (UID: "689edd77-629b-4ce9-9392-f52fb34a7a1e"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.770099 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "689edd77-629b-4ce9-9392-f52fb34a7a1e" (UID: "689edd77-629b-4ce9-9392-f52fb34a7a1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.789496 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/689edd77-629b-4ce9-9392-f52fb34a7a1e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.789541 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.789554 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.789567 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.789578 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcm4l\" (UniqueName: \"kubernetes.io/projected/689edd77-629b-4ce9-9392-f52fb34a7a1e-kube-api-access-bcm4l\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.814074 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-config-data" (OuterVolumeSpecName: "config-data") pod "689edd77-629b-4ce9-9392-f52fb34a7a1e" (UID: "689edd77-629b-4ce9-9392-f52fb34a7a1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.890733 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/689edd77-629b-4ce9-9392-f52fb34a7a1e-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.917823 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d648bbbdd-g6xz5" event={"ID":"5ef30f16-84a1-4809-8ec4-0c36e315c5f3","Type":"ContainerStarted","Data":"22427c3ebd79a2624636bc4e51acacd7a681350f748159da033bf50d2e3b1787"} Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.920282 5033 generic.go:334] "Generic (PLEG): container finished" podID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerID="c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc" exitCode=0 Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.920311 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"689edd77-629b-4ce9-9392-f52fb34a7a1e","Type":"ContainerDied","Data":"c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc"} Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.920328 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"689edd77-629b-4ce9-9392-f52fb34a7a1e","Type":"ContainerDied","Data":"ecd1ae6d4c3b131b815493b82ab1b62271db20ca910cfce2408e0a644e9a8395"} Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.920344 5033 scope.go:117] "RemoveContainer" containerID="b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.920409 5033 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.943158 5033 scope.go:117] "RemoveContainer" containerID="c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.979555 5033 scope.go:117] "RemoveContainer" containerID="34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35" Feb 26 16:03:31 crc kubenswrapper[5033]: I0226 16:03:31.986515 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.006343 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.015178 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:03:32 crc kubenswrapper[5033]: E0226 16:03:32.015829 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="ceilometer-notification-agent" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.015860 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="ceilometer-notification-agent" Feb 26 16:03:32 crc kubenswrapper[5033]: E0226 16:03:32.015872 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="proxy-httpd" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.015880 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="proxy-httpd" Feb 26 16:03:32 crc kubenswrapper[5033]: E0226 16:03:32.015923 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="ceilometer-central-agent" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.015932 5033 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="ceilometer-central-agent" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.016658 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="ceilometer-notification-agent" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.016710 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="proxy-httpd" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.016723 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" containerName="ceilometer-central-agent" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.020770 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.024615 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.024882 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.029093 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.032375 5033 scope.go:117] "RemoveContainer" containerID="b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e" Feb 26 16:03:32 crc kubenswrapper[5033]: E0226 16:03:32.032824 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e\": container with ID starting with b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e not found: ID does not exist" 
containerID="b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.032866 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e"} err="failed to get container status \"b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e\": rpc error: code = NotFound desc = could not find container \"b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e\": container with ID starting with b1d682626104948c9cafa008bf857d709b3873a4772e7c00fa54bd67faf2da8e not found: ID does not exist" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.032891 5033 scope.go:117] "RemoveContainer" containerID="c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc" Feb 26 16:03:32 crc kubenswrapper[5033]: E0226 16:03:32.033347 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc\": container with ID starting with c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc not found: ID does not exist" containerID="c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.033372 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc"} err="failed to get container status \"c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc\": rpc error: code = NotFound desc = could not find container \"c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc\": container with ID starting with c4ac720764c111b350d7a363fa17f0e041bdbfa5e8ed5d9a3466965e83bbeacc not found: ID does not exist" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.033385 5033 scope.go:117] 
"RemoveContainer" containerID="34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35" Feb 26 16:03:32 crc kubenswrapper[5033]: E0226 16:03:32.033596 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35\": container with ID starting with 34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35 not found: ID does not exist" containerID="34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.033617 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35"} err="failed to get container status \"34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35\": rpc error: code = NotFound desc = could not find container \"34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35\": container with ID starting with 34a580d30670aea4701d79db6947632d90a1b334fb89d7bf10ea04029d1d8e35 not found: ID does not exist" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.090949 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:03:32 crc kubenswrapper[5033]: E0226 16:03:32.091748 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-8ln5t log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="1e929f18-8111-4dca-952c-567845db9b64" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.094277 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-scripts\") pod \"ceilometer-0\" (UID: 
\"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.094348 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-run-httpd\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.094514 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-config-data\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.094725 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.094787 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.094833 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-log-httpd\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.094877 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ln5t\" (UniqueName: \"kubernetes.io/projected/1e929f18-8111-4dca-952c-567845db9b64-kube-api-access-8ln5t\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.196611 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.196678 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.196704 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-log-httpd\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.196725 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ln5t\" (UniqueName: \"kubernetes.io/projected/1e929f18-8111-4dca-952c-567845db9b64-kube-api-access-8ln5t\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0" Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.196746 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-scripts\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.196780 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-run-httpd\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.196836 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-config-data\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.198359 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-run-httpd\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.199628 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-log-httpd\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.200856 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.201660 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-scripts\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.203048 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.204468 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-config-data\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.214218 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ln5t\" (UniqueName: \"kubernetes.io/projected/1e929f18-8111-4dca-952c-567845db9b64-kube-api-access-8ln5t\") pod \"ceilometer-0\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") " pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.939202 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d648bbbdd-g6xz5" event={"ID":"5ef30f16-84a1-4809-8ec4-0c36e315c5f3","Type":"ContainerStarted","Data":"37b6556400506b1236e7114115d1a4566edf6e5a76d32b1374c1c673791271cf"}
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.939604 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d648bbbdd-g6xz5" event={"ID":"5ef30f16-84a1-4809-8ec4-0c36e315c5f3","Type":"ContainerStarted","Data":"e957c4c3c0035ebbddff5e5a96c325ad85985b3bd7c2e72eaa4ed32e100f9146"}
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.940403 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d648bbbdd-g6xz5"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.940466 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d648bbbdd-g6xz5"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.942123 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.956501 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 16:03:32 crc kubenswrapper[5033]: I0226 16:03:32.979670 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d648bbbdd-g6xz5" podStartSLOduration=2.9796423279999997 podStartE2EDuration="2.979642328s" podCreationTimestamp="2026-02-26 16:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:32.972387563 +0000 UTC m=+1340.184986213" watchObservedRunningTime="2026-02-26 16:03:32.979642328 +0000 UTC m=+1340.192240978"
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.012684 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-scripts\") pod \"1e929f18-8111-4dca-952c-567845db9b64\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") "
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.012754 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-run-httpd\") pod \"1e929f18-8111-4dca-952c-567845db9b64\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") "
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.012812 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-sg-core-conf-yaml\") pod \"1e929f18-8111-4dca-952c-567845db9b64\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") "
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.012833 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-combined-ca-bundle\") pod \"1e929f18-8111-4dca-952c-567845db9b64\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") "
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.012863 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-config-data\") pod \"1e929f18-8111-4dca-952c-567845db9b64\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") "
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.012886 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-log-httpd\") pod \"1e929f18-8111-4dca-952c-567845db9b64\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") "
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.012913 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ln5t\" (UniqueName: \"kubernetes.io/projected/1e929f18-8111-4dca-952c-567845db9b64-kube-api-access-8ln5t\") pod \"1e929f18-8111-4dca-952c-567845db9b64\" (UID: \"1e929f18-8111-4dca-952c-567845db9b64\") "
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.016152 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1e929f18-8111-4dca-952c-567845db9b64" (UID: "1e929f18-8111-4dca-952c-567845db9b64"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.016320 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1e929f18-8111-4dca-952c-567845db9b64" (UID: "1e929f18-8111-4dca-952c-567845db9b64"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.037236 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-config-data" (OuterVolumeSpecName: "config-data") pod "1e929f18-8111-4dca-952c-567845db9b64" (UID: "1e929f18-8111-4dca-952c-567845db9b64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.037353 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1e929f18-8111-4dca-952c-567845db9b64" (UID: "1e929f18-8111-4dca-952c-567845db9b64"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.037441 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e929f18-8111-4dca-952c-567845db9b64-kube-api-access-8ln5t" (OuterVolumeSpecName: "kube-api-access-8ln5t") pod "1e929f18-8111-4dca-952c-567845db9b64" (UID: "1e929f18-8111-4dca-952c-567845db9b64"). InnerVolumeSpecName "kube-api-access-8ln5t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.037686 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e929f18-8111-4dca-952c-567845db9b64" (UID: "1e929f18-8111-4dca-952c-567845db9b64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.041889 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-scripts" (OuterVolumeSpecName: "scripts") pod "1e929f18-8111-4dca-952c-567845db9b64" (UID: "1e929f18-8111-4dca-952c-567845db9b64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.115211 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.115247 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.115273 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.115285 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.115297 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e929f18-8111-4dca-952c-567845db9b64-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.115306 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e929f18-8111-4dca-952c-567845db9b64-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.115316 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ln5t\" (UniqueName: \"kubernetes.io/projected/1e929f18-8111-4dca-952c-567845db9b64-kube-api-access-8ln5t\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.639244 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="689edd77-629b-4ce9-9392-f52fb34a7a1e" path="/var/lib/kubelet/pods/689edd77-629b-4ce9-9392-f52fb34a7a1e/volumes"
Feb 26 16:03:33 crc kubenswrapper[5033]: I0226 16:03:33.950451 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.000796 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.009840 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.038496 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.040842 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.044709 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.045179 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.060574 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.137372 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-log-httpd\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.137497 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.137629 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2jff\" (UniqueName: \"kubernetes.io/projected/d36cde35-f717-4cea-a07c-fbec49bcef15-kube-api-access-c2jff\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.137725 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-config-data\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.137962 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-scripts\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.138022 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-run-httpd\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.138056 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.240049 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2jff\" (UniqueName: \"kubernetes.io/projected/d36cde35-f717-4cea-a07c-fbec49bcef15-kube-api-access-c2jff\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.240209 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-config-data\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.240301 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-scripts\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.240416 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-run-httpd\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.240466 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.240603 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-log-httpd\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.240654 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.241716 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-log-httpd\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.241727 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-run-httpd\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.245135 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.245213 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-config-data\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.246441 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.247993 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-scripts\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.272697 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2jff\" (UniqueName: \"kubernetes.io/projected/d36cde35-f717-4cea-a07c-fbec49bcef15-kube-api-access-c2jff\") pod \"ceilometer-0\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.371742 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.886880 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 16:03:34 crc kubenswrapper[5033]: I0226 16:03:34.963429 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d36cde35-f717-4cea-a07c-fbec49bcef15","Type":"ContainerStarted","Data":"02f32c7357b261818718ce799ef777668e2443c47c12341f93a4ba024d83241a"}
Feb 26 16:03:35 crc kubenswrapper[5033]: I0226 16:03:35.641450 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e929f18-8111-4dca-952c-567845db9b64" path="/var/lib/kubelet/pods/1e929f18-8111-4dca-952c-567845db9b64/volumes"
Feb 26 16:03:35 crc kubenswrapper[5033]: I0226 16:03:35.970470 5033 generic.go:334] "Generic (PLEG): container finished" podID="bc977b82-241c-4c92-9755-877433f3e4de" containerID="444638d70c14e26c0e7807701010c49966be1b71768de02d5f5fcc407e2851b7" exitCode=0
Feb 26 16:03:35 crc kubenswrapper[5033]: I0226 16:03:35.970521 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kptqh" event={"ID":"bc977b82-241c-4c92-9755-877433f3e4de","Type":"ContainerDied","Data":"444638d70c14e26c0e7807701010c49966be1b71768de02d5f5fcc407e2851b7"}
Feb 26 16:03:36 crc kubenswrapper[5033]: I0226 16:03:36.998319 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d36cde35-f717-4cea-a07c-fbec49bcef15","Type":"ContainerStarted","Data":"18b936b2b45c65229294f50f96bb7bfc8e3d7858b5abc87cf9855a5c7bb44e8a"}
Feb 26 16:03:37 crc kubenswrapper[5033]: I0226 16:03:37.309433 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-kptqh"
Feb 26 16:03:37 crc kubenswrapper[5033]: I0226 16:03:37.408963 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-combined-ca-bundle\") pod \"bc977b82-241c-4c92-9755-877433f3e4de\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") "
Feb 26 16:03:37 crc kubenswrapper[5033]: I0226 16:03:37.409020 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-db-sync-config-data\") pod \"bc977b82-241c-4c92-9755-877433f3e4de\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") "
Feb 26 16:03:37 crc kubenswrapper[5033]: I0226 16:03:37.409137 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb4dj\" (UniqueName: \"kubernetes.io/projected/bc977b82-241c-4c92-9755-877433f3e4de-kube-api-access-hb4dj\") pod \"bc977b82-241c-4c92-9755-877433f3e4de\" (UID: \"bc977b82-241c-4c92-9755-877433f3e4de\") "
Feb 26 16:03:37 crc kubenswrapper[5033]: I0226 16:03:37.414444 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc977b82-241c-4c92-9755-877433f3e4de-kube-api-access-hb4dj" (OuterVolumeSpecName: "kube-api-access-hb4dj") pod "bc977b82-241c-4c92-9755-877433f3e4de" (UID: "bc977b82-241c-4c92-9755-877433f3e4de"). InnerVolumeSpecName "kube-api-access-hb4dj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:37 crc kubenswrapper[5033]: I0226 16:03:37.414526 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bc977b82-241c-4c92-9755-877433f3e4de" (UID: "bc977b82-241c-4c92-9755-877433f3e4de"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:37 crc kubenswrapper[5033]: I0226 16:03:37.436297 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc977b82-241c-4c92-9755-877433f3e4de" (UID: "bc977b82-241c-4c92-9755-877433f3e4de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:37 crc kubenswrapper[5033]: I0226 16:03:37.510831 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb4dj\" (UniqueName: \"kubernetes.io/projected/bc977b82-241c-4c92-9755-877433f3e4de-kube-api-access-hb4dj\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:37 crc kubenswrapper[5033]: I0226 16:03:37.510869 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:37 crc kubenswrapper[5033]: I0226 16:03:37.510878 5033 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc977b82-241c-4c92-9755-877433f3e4de-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.012722 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d36cde35-f717-4cea-a07c-fbec49bcef15","Type":"ContainerStarted","Data":"65615cebbcd9363429c94ffd2cd583a0fc62f0931d4c41d25b34b33293acb9e1"}
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.015699 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kptqh" event={"ID":"bc977b82-241c-4c92-9755-877433f3e4de","Type":"ContainerDied","Data":"bc41c83925eaa20d4cd44e9c2cdc485aa5383c9d81798fa8c472bf48b122bbcb"}
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.015782 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc41c83925eaa20d4cd44e9c2cdc485aa5383c9d81798fa8c472bf48b122bbcb"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.015859 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-kptqh"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.262103 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5c49db967-q8hjf"]
Feb 26 16:03:38 crc kubenswrapper[5033]: E0226 16:03:38.262570 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc977b82-241c-4c92-9755-877433f3e4de" containerName="barbican-db-sync"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.262594 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc977b82-241c-4c92-9755-877433f3e4de" containerName="barbican-db-sync"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.262842 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc977b82-241c-4c92-9755-877433f3e4de" containerName="barbican-db-sync"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.264034 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.265997 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4mxvv"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.275047 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.284114 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.302429 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5c49db967-q8hjf"]
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.309737 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-564dc86f78-j9p7h"]
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.311000 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.319399 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.325775 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50dfde4d-180f-4679-b02b-62a9592eb237-logs\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.325830 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data-custom\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.325855 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psxzm\" (UniqueName: \"kubernetes.io/projected/50dfde4d-180f-4679-b02b-62a9592eb237-kube-api-access-psxzm\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.325877 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.325903 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-combined-ca-bundle\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.350322 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-564dc86f78-j9p7h"]
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.382213 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-p5bpd"]
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.384691 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.412411 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-p5bpd"]
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.434973 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50dfde4d-180f-4679-b02b-62a9592eb237-logs\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435052 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data-custom\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435084 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psxzm\" (UniqueName: \"kubernetes.io/projected/50dfde4d-180f-4679-b02b-62a9592eb237-kube-api-access-psxzm\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435116 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435157 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-config\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435193 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435237 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-combined-ca-bundle\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435292 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435329 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435369 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435405 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgzpd\" (UniqueName: \"kubernetes.io/projected/b7246c5a-8508-4f21-ac7a-76d218044d4c-kube-api-access-kgzpd\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435439 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435493 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-combined-ca-bundle\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435527 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szz94\" (UniqueName: \"kubernetes.io/projected/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-kube-api-access-szz94\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435552 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-logs\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.435571 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data-custom\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.436130 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50dfde4d-180f-4679-b02b-62a9592eb237-logs\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.453585 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.456631 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data-custom\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.457923 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-combined-ca-bundle\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.464851 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psxzm\" (UniqueName: \"kubernetes.io/projected/50dfde4d-180f-4679-b02b-62a9592eb237-kube-api-access-psxzm\") pod \"barbican-worker-5c49db967-q8hjf\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.537175 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd"
Feb 26
16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.537217 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-config\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.537247 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.537287 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.537323 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.537357 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgzpd\" (UniqueName: \"kubernetes.io/projected/b7246c5a-8508-4f21-ac7a-76d218044d4c-kube-api-access-kgzpd\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: 
I0226 16:03:38.537384 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.537422 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-combined-ca-bundle\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.537451 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szz94\" (UniqueName: \"kubernetes.io/projected/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-kube-api-access-szz94\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.537471 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-logs\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.537486 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data-custom\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " 
pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.548709 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.549281 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.549766 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-config\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.553402 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-79658967bb-s277d"] Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.555320 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.555767 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.555920 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.556400 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-logs\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.556955 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-combined-ca-bundle\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.557532 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.559849 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.564633 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data-custom\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.575445 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgzpd\" (UniqueName: \"kubernetes.io/projected/b7246c5a-8508-4f21-ac7a-76d218044d4c-kube-api-access-kgzpd\") pod \"dnsmasq-dns-7c67bffd47-p5bpd\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.579436 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79658967bb-s277d"] Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.581325 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5c49db967-q8hjf" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.596344 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szz94\" (UniqueName: \"kubernetes.io/projected/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-kube-api-access-szz94\") pod \"barbican-keystone-listener-564dc86f78-j9p7h\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.629610 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.639562 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-combined-ca-bundle\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.639604 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8tj6\" (UniqueName: \"kubernetes.io/projected/c2821d0f-99b6-4333-a90b-d82302ca28e4-kube-api-access-q8tj6\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.639668 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2821d0f-99b6-4333-a90b-d82302ca28e4-logs\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.639716 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.639773 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data-custom\") pod 
\"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.712954 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.741572 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data-custom\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.741635 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-combined-ca-bundle\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.741671 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8tj6\" (UniqueName: \"kubernetes.io/projected/c2821d0f-99b6-4333-a90b-d82302ca28e4-kube-api-access-q8tj6\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.741773 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2821d0f-99b6-4333-a90b-d82302ca28e4-logs\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.741850 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.743068 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2821d0f-99b6-4333-a90b-d82302ca28e4-logs\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.748586 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.749737 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data-custom\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.755737 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-combined-ca-bundle\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.763935 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8tj6\" (UniqueName: 
\"kubernetes.io/projected/c2821d0f-99b6-4333-a90b-d82302ca28e4-kube-api-access-q8tj6\") pod \"barbican-api-79658967bb-s277d\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:38 crc kubenswrapper[5033]: I0226 16:03:38.958041 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:39 crc kubenswrapper[5033]: I0226 16:03:39.046396 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d36cde35-f717-4cea-a07c-fbec49bcef15","Type":"ContainerStarted","Data":"d785b0a3a4d8ca4d33004dd1cb2d5aa0b2cdde86c1d586f47f28f2eefc320789"} Feb 26 16:03:39 crc kubenswrapper[5033]: I0226 16:03:39.214730 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-564dc86f78-j9p7h"] Feb 26 16:03:39 crc kubenswrapper[5033]: I0226 16:03:39.229558 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-p5bpd"] Feb 26 16:03:39 crc kubenswrapper[5033]: W0226 16:03:39.234526 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb24a7f0_3a2d_4337_83aa_3c244c2bd3c8.slice/crio-e2992f9194b8ed7eeaf8b826192cb625bbf131c35506c9d3572d8183375270b0 WatchSource:0}: Error finding container e2992f9194b8ed7eeaf8b826192cb625bbf131c35506c9d3572d8183375270b0: Status 404 returned error can't find the container with id e2992f9194b8ed7eeaf8b826192cb625bbf131c35506c9d3572d8183375270b0 Feb 26 16:03:39 crc kubenswrapper[5033]: W0226 16:03:39.238354 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7246c5a_8508_4f21_ac7a_76d218044d4c.slice/crio-a5f59e5232250a8dac761019a6c5365ff522ffcae235e4f64d0c84d2376e9b72 WatchSource:0}: Error finding container 
a5f59e5232250a8dac761019a6c5365ff522ffcae235e4f64d0c84d2376e9b72: Status 404 returned error can't find the container with id a5f59e5232250a8dac761019a6c5365ff522ffcae235e4f64d0c84d2376e9b72 Feb 26 16:03:39 crc kubenswrapper[5033]: I0226 16:03:39.371519 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5c49db967-q8hjf"] Feb 26 16:03:39 crc kubenswrapper[5033]: W0226 16:03:39.375154 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50dfde4d_180f_4679_b02b_62a9592eb237.slice/crio-aae91e5df7617b6ed34e30efc4eb81b28d75b8e331eae519cfc2026fb556154d WatchSource:0}: Error finding container aae91e5df7617b6ed34e30efc4eb81b28d75b8e331eae519cfc2026fb556154d: Status 404 returned error can't find the container with id aae91e5df7617b6ed34e30efc4eb81b28d75b8e331eae519cfc2026fb556154d Feb 26 16:03:39 crc kubenswrapper[5033]: I0226 16:03:39.500942 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79658967bb-s277d"] Feb 26 16:03:40 crc kubenswrapper[5033]: I0226 16:03:40.061932 5033 generic.go:334] "Generic (PLEG): container finished" podID="b7246c5a-8508-4f21-ac7a-76d218044d4c" containerID="f904a5ad2956da4d61a1720665d277c271058e99bcae935155400081d797e70a" exitCode=0 Feb 26 16:03:40 crc kubenswrapper[5033]: I0226 16:03:40.062343 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" event={"ID":"b7246c5a-8508-4f21-ac7a-76d218044d4c","Type":"ContainerDied","Data":"f904a5ad2956da4d61a1720665d277c271058e99bcae935155400081d797e70a"} Feb 26 16:03:40 crc kubenswrapper[5033]: I0226 16:03:40.062374 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" event={"ID":"b7246c5a-8508-4f21-ac7a-76d218044d4c","Type":"ContainerStarted","Data":"a5f59e5232250a8dac761019a6c5365ff522ffcae235e4f64d0c84d2376e9b72"} Feb 26 16:03:40 crc kubenswrapper[5033]: I0226 
16:03:40.066875 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79658967bb-s277d" event={"ID":"c2821d0f-99b6-4333-a90b-d82302ca28e4","Type":"ContainerStarted","Data":"f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351"} Feb 26 16:03:40 crc kubenswrapper[5033]: I0226 16:03:40.066915 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79658967bb-s277d" event={"ID":"c2821d0f-99b6-4333-a90b-d82302ca28e4","Type":"ContainerStarted","Data":"f9341db30e3ff3d81bf340b8e9df276c0e11b157b66e3d9893775f68cfb317c8"} Feb 26 16:03:40 crc kubenswrapper[5033]: I0226 16:03:40.070147 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c49db967-q8hjf" event={"ID":"50dfde4d-180f-4679-b02b-62a9592eb237","Type":"ContainerStarted","Data":"aae91e5df7617b6ed34e30efc4eb81b28d75b8e331eae519cfc2026fb556154d"} Feb 26 16:03:40 crc kubenswrapper[5033]: I0226 16:03:40.074273 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" event={"ID":"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8","Type":"ContainerStarted","Data":"e2992f9194b8ed7eeaf8b826192cb625bbf131c35506c9d3572d8183375270b0"} Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.089171 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" event={"ID":"b7246c5a-8508-4f21-ac7a-76d218044d4c","Type":"ContainerStarted","Data":"e1e88b8703395f870140ff3011ae32e83bb5f9661f7c45eb76a45cdcdd83c106"} Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.089838 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.094634 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79658967bb-s277d" 
event={"ID":"c2821d0f-99b6-4333-a90b-d82302ca28e4","Type":"ContainerStarted","Data":"a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4"} Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.095021 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.095068 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.110524 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" podStartSLOduration=3.110509638 podStartE2EDuration="3.110509638s" podCreationTimestamp="2026-02-26 16:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:41.106982369 +0000 UTC m=+1348.319581019" watchObservedRunningTime="2026-02-26 16:03:41.110509638 +0000 UTC m=+1348.323108288" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.134779 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-79658967bb-s277d" podStartSLOduration=3.134759058 podStartE2EDuration="3.134759058s" podCreationTimestamp="2026-02-26 16:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:41.124143668 +0000 UTC m=+1348.336742338" watchObservedRunningTime="2026-02-26 16:03:41.134759058 +0000 UTC m=+1348.347357718" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.591989 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-86984d6b66-tg6tl"] Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.593894 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.600704 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86984d6b66-tg6tl"] Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.600721 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.601419 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.714158 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.714510 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-public-tls-certs\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.714537 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q9gx\" (UniqueName: \"kubernetes.io/projected/9646add7-5da6-4590-9394-b4325e8ebd2d-kube-api-access-4q9gx\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.714593 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9646add7-5da6-4590-9394-b4325e8ebd2d-logs\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.714618 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-internal-tls-certs\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.714635 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data-custom\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.714652 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-combined-ca-bundle\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.816536 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.816609 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-public-tls-certs\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.816634 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q9gx\" (UniqueName: \"kubernetes.io/projected/9646add7-5da6-4590-9394-b4325e8ebd2d-kube-api-access-4q9gx\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.816700 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9646add7-5da6-4590-9394-b4325e8ebd2d-logs\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.816730 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-internal-tls-certs\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.816752 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data-custom\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.816777 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-combined-ca-bundle\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.817710 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9646add7-5da6-4590-9394-b4325e8ebd2d-logs\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.820374 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-combined-ca-bundle\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.821718 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-public-tls-certs\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.823729 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data-custom\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.824630 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-internal-tls-certs\") pod 
\"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.837339 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:41 crc kubenswrapper[5033]: I0226 16:03:41.843821 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q9gx\" (UniqueName: \"kubernetes.io/projected/9646add7-5da6-4590-9394-b4325e8ebd2d-kube-api-access-4q9gx\") pod \"barbican-api-86984d6b66-tg6tl\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:42 crc kubenswrapper[5033]: I0226 16:03:42.032161 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:42 crc kubenswrapper[5033]: I0226 16:03:42.105872 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c49db967-q8hjf" event={"ID":"50dfde4d-180f-4679-b02b-62a9592eb237","Type":"ContainerStarted","Data":"e792002563a7c6234a552b17da91049d9907ca9b6088ef5dc670bcb69af229ae"} Feb 26 16:03:42 crc kubenswrapper[5033]: I0226 16:03:42.105918 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c49db967-q8hjf" event={"ID":"50dfde4d-180f-4679-b02b-62a9592eb237","Type":"ContainerStarted","Data":"f144e91d04c79983ad9d2426e8a65c28484d0f2b51c249d12afc1f6ee1a0c14a"} Feb 26 16:03:42 crc kubenswrapper[5033]: I0226 16:03:42.113716 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d36cde35-f717-4cea-a07c-fbec49bcef15","Type":"ContainerStarted","Data":"1b43d07fa03fbf179ff0065c49a4d5153ac07e4f5e097e96635a86e920a046a4"} Feb 26 16:03:42 crc kubenswrapper[5033]: I0226 16:03:42.113779 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 16:03:42 crc kubenswrapper[5033]: I0226 16:03:42.124954 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" event={"ID":"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8","Type":"ContainerStarted","Data":"21a665deef8b6a56264f1e3d28a3709293482a49500e908c4c8d0cd127477c4b"} Feb 26 16:03:42 crc kubenswrapper[5033]: I0226 16:03:42.125221 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" event={"ID":"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8","Type":"ContainerStarted","Data":"154a195b0bbcd9352b7c0c53282fb8b79ead74d9d7ef469042e66e9b52b0dc36"} Feb 26 16:03:42 crc kubenswrapper[5033]: I0226 16:03:42.142350 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-worker-5c49db967-q8hjf" podStartSLOduration=2.018033769 podStartE2EDuration="4.142324404s" podCreationTimestamp="2026-02-26 16:03:38 +0000 UTC" firstStartedPulling="2026-02-26 16:03:39.377903791 +0000 UTC m=+1346.590502431" lastFinishedPulling="2026-02-26 16:03:41.502194416 +0000 UTC m=+1348.714793066" observedRunningTime="2026-02-26 16:03:42.129312409 +0000 UTC m=+1349.341911079" watchObservedRunningTime="2026-02-26 16:03:42.142324404 +0000 UTC m=+1349.354923084" Feb 26 16:03:42 crc kubenswrapper[5033]: I0226 16:03:42.158872 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" podStartSLOduration=1.9011851210000001 podStartE2EDuration="4.158847579s" podCreationTimestamp="2026-02-26 16:03:38 +0000 UTC" firstStartedPulling="2026-02-26 16:03:39.243609987 +0000 UTC m=+1346.456208637" lastFinishedPulling="2026-02-26 16:03:41.501272445 +0000 UTC m=+1348.713871095" observedRunningTime="2026-02-26 16:03:42.148283499 +0000 UTC m=+1349.360882149" watchObservedRunningTime="2026-02-26 16:03:42.158847579 +0000 UTC m=+1349.371446229" Feb 26 16:03:42 crc kubenswrapper[5033]: I0226 16:03:42.181517 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.579987514 podStartE2EDuration="8.181489202s" podCreationTimestamp="2026-02-26 16:03:34 +0000 UTC" firstStartedPulling="2026-02-26 16:03:34.901651649 +0000 UTC m=+1342.114250309" lastFinishedPulling="2026-02-26 16:03:41.503153347 +0000 UTC m=+1348.715751997" observedRunningTime="2026-02-26 16:03:42.172706723 +0000 UTC m=+1349.385305373" watchObservedRunningTime="2026-02-26 16:03:42.181489202 +0000 UTC m=+1349.394087852" Feb 26 16:03:42 crc kubenswrapper[5033]: I0226 16:03:42.541565 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86984d6b66-tg6tl"] Feb 26 16:03:43 crc kubenswrapper[5033]: I0226 16:03:43.142993 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86984d6b66-tg6tl" event={"ID":"9646add7-5da6-4590-9394-b4325e8ebd2d","Type":"ContainerStarted","Data":"31c65a36bdf508f77701ec37bf337b3a58e6c854126c11d769248ce82adecb01"} Feb 26 16:03:43 crc kubenswrapper[5033]: I0226 16:03:43.143471 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86984d6b66-tg6tl" event={"ID":"9646add7-5da6-4590-9394-b4325e8ebd2d","Type":"ContainerStarted","Data":"09653e26a9486bd66b87542be3842434fcfe613ab05437a4b825e241e39e81f0"} Feb 26 16:03:43 crc kubenswrapper[5033]: I0226 16:03:43.143482 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86984d6b66-tg6tl" event={"ID":"9646add7-5da6-4590-9394-b4325e8ebd2d","Type":"ContainerStarted","Data":"056cff826fc8e45b52030e3b8c00067d7e0a0355ae304f9d8c32c8242e5947f2"} Feb 26 16:03:44 crc kubenswrapper[5033]: I0226 16:03:44.152443 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:44 crc kubenswrapper[5033]: I0226 16:03:44.152858 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:44 crc kubenswrapper[5033]: I0226 16:03:44.195085 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-86984d6b66-tg6tl" podStartSLOduration=3.195062818 podStartE2EDuration="3.195062818s" podCreationTimestamp="2026-02-26 16:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:44.181876949 +0000 UTC m=+1351.394475599" watchObservedRunningTime="2026-02-26 16:03:44.195062818 +0000 UTC m=+1351.407661488" Feb 26 16:03:48 crc kubenswrapper[5033]: I0226 16:03:48.188808 5033 generic.go:334] "Generic (PLEG): container finished" podID="cf479687-fa77-4c8b-95ee-23b30c858b35" 
containerID="4fc856374b9e4577d7bc3935a637339263321a668147cce6238a1cda79ff279b" exitCode=0 Feb 26 16:03:48 crc kubenswrapper[5033]: I0226 16:03:48.188855 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lkf9s" event={"ID":"cf479687-fa77-4c8b-95ee-23b30c858b35","Type":"ContainerDied","Data":"4fc856374b9e4577d7bc3935a637339263321a668147cce6238a1cda79ff279b"} Feb 26 16:03:48 crc kubenswrapper[5033]: I0226 16:03:48.520042 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:48 crc kubenswrapper[5033]: I0226 16:03:48.715341 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:03:48 crc kubenswrapper[5033]: I0226 16:03:48.799297 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d7qv5"] Feb 26 16:03:48 crc kubenswrapper[5033]: I0226 16:03:48.799628 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" podUID="9776d976-a39c-4d6a-ac1b-d2a03164c21e" containerName="dnsmasq-dns" containerID="cri-o://ef7eeb6aa813ec1d7fbf7209df2db453639f39a057ab6e1aa34fd3c66fe79cb1" gracePeriod=10 Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.203543 5033 generic.go:334] "Generic (PLEG): container finished" podID="9776d976-a39c-4d6a-ac1b-d2a03164c21e" containerID="ef7eeb6aa813ec1d7fbf7209df2db453639f39a057ab6e1aa34fd3c66fe79cb1" exitCode=0 Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.203619 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" event={"ID":"9776d976-a39c-4d6a-ac1b-d2a03164c21e","Type":"ContainerDied","Data":"ef7eeb6aa813ec1d7fbf7209df2db453639f39a057ab6e1aa34fd3c66fe79cb1"} Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.385433 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.487524 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-svc\") pod \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.487610 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-nb\") pod \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.487678 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-sb\") pod \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.487741 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-config\") pod \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.487875 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-swift-storage-0\") pod \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.487947 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxcbg\" 
(UniqueName: \"kubernetes.io/projected/9776d976-a39c-4d6a-ac1b-d2a03164c21e-kube-api-access-sxcbg\") pod \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\" (UID: \"9776d976-a39c-4d6a-ac1b-d2a03164c21e\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.495682 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9776d976-a39c-4d6a-ac1b-d2a03164c21e-kube-api-access-sxcbg" (OuterVolumeSpecName: "kube-api-access-sxcbg") pod "9776d976-a39c-4d6a-ac1b-d2a03164c21e" (UID: "9776d976-a39c-4d6a-ac1b-d2a03164c21e"). InnerVolumeSpecName "kube-api-access-sxcbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.559992 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9776d976-a39c-4d6a-ac1b-d2a03164c21e" (UID: "9776d976-a39c-4d6a-ac1b-d2a03164c21e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.591294 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxcbg\" (UniqueName: \"kubernetes.io/projected/9776d976-a39c-4d6a-ac1b-d2a03164c21e-kube-api-access-sxcbg\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.591329 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.594750 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-config" (OuterVolumeSpecName: "config") pod "9776d976-a39c-4d6a-ac1b-d2a03164c21e" (UID: "9776d976-a39c-4d6a-ac1b-d2a03164c21e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.613372 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9776d976-a39c-4d6a-ac1b-d2a03164c21e" (UID: "9776d976-a39c-4d6a-ac1b-d2a03164c21e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.615963 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9776d976-a39c-4d6a-ac1b-d2a03164c21e" (UID: "9776d976-a39c-4d6a-ac1b-d2a03164c21e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.644699 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9776d976-a39c-4d6a-ac1b-d2a03164c21e" (UID: "9776d976-a39c-4d6a-ac1b-d2a03164c21e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.652688 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-lkf9s" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.694034 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.694077 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.694090 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.694102 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9776d976-a39c-4d6a-ac1b-d2a03164c21e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.794894 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b86b\" (UniqueName: \"kubernetes.io/projected/cf479687-fa77-4c8b-95ee-23b30c858b35-kube-api-access-9b86b\") pod \"cf479687-fa77-4c8b-95ee-23b30c858b35\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.795069 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf479687-fa77-4c8b-95ee-23b30c858b35-etc-machine-id\") pod \"cf479687-fa77-4c8b-95ee-23b30c858b35\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.795220 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-config-data\") pod \"cf479687-fa77-4c8b-95ee-23b30c858b35\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.795294 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf479687-fa77-4c8b-95ee-23b30c858b35-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cf479687-fa77-4c8b-95ee-23b30c858b35" (UID: "cf479687-fa77-4c8b-95ee-23b30c858b35"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.795549 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-scripts\") pod \"cf479687-fa77-4c8b-95ee-23b30c858b35\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.796067 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-db-sync-config-data\") pod \"cf479687-fa77-4c8b-95ee-23b30c858b35\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.796395 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-combined-ca-bundle\") pod \"cf479687-fa77-4c8b-95ee-23b30c858b35\" (UID: \"cf479687-fa77-4c8b-95ee-23b30c858b35\") " Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.796992 5033 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf479687-fa77-4c8b-95ee-23b30c858b35-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:49 crc 
kubenswrapper[5033]: I0226 16:03:49.800591 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cf479687-fa77-4c8b-95ee-23b30c858b35" (UID: "cf479687-fa77-4c8b-95ee-23b30c858b35"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.800784 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf479687-fa77-4c8b-95ee-23b30c858b35-kube-api-access-9b86b" (OuterVolumeSpecName: "kube-api-access-9b86b") pod "cf479687-fa77-4c8b-95ee-23b30c858b35" (UID: "cf479687-fa77-4c8b-95ee-23b30c858b35"). InnerVolumeSpecName "kube-api-access-9b86b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.802488 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-scripts" (OuterVolumeSpecName: "scripts") pod "cf479687-fa77-4c8b-95ee-23b30c858b35" (UID: "cf479687-fa77-4c8b-95ee-23b30c858b35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.824029 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf479687-fa77-4c8b-95ee-23b30c858b35" (UID: "cf479687-fa77-4c8b-95ee-23b30c858b35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.898508 5033 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.898553 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.898570 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b86b\" (UniqueName: \"kubernetes.io/projected/cf479687-fa77-4c8b-95ee-23b30c858b35-kube-api-access-9b86b\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.898582 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:49 crc kubenswrapper[5033]: I0226 16:03:49.899742 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-config-data" (OuterVolumeSpecName: "config-data") pod "cf479687-fa77-4c8b-95ee-23b30c858b35" (UID: "cf479687-fa77-4c8b-95ee-23b30c858b35"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.000422 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf479687-fa77-4c8b-95ee-23b30c858b35-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.189314 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.189695 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.247551 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" event={"ID":"9776d976-a39c-4d6a-ac1b-d2a03164c21e","Type":"ContainerDied","Data":"0036f483e7374a4e38d9e168992b5d12633f95871b5298cca55be5606cf80319"} Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.247616 5033 scope.go:117] "RemoveContainer" containerID="ef7eeb6aa813ec1d7fbf7209df2db453639f39a057ab6e1aa34fd3c66fe79cb1" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.247775 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-d7qv5" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.260834 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lkf9s" event={"ID":"cf479687-fa77-4c8b-95ee-23b30c858b35","Type":"ContainerDied","Data":"1cb52df17c55d2c2af4e4f86e59be92ac91587f0a35694028b2d1e3bfbcfa576"} Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.260871 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cb52df17c55d2c2af4e4f86e59be92ac91587f0a35694028b2d1e3bfbcfa576" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.260923 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lkf9s" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.286769 5033 scope.go:117] "RemoveContainer" containerID="feea146a8e1448f993add04f8c1fcee6a819a21e21a884498fe86641c0f717a8" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.294120 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d7qv5"] Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.300733 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d7qv5"] Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.502922 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:03:50 crc kubenswrapper[5033]: E0226 16:03:50.503305 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf479687-fa77-4c8b-95ee-23b30c858b35" containerName="cinder-db-sync" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.503316 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf479687-fa77-4c8b-95ee-23b30c858b35" containerName="cinder-db-sync" Feb 26 16:03:50 crc kubenswrapper[5033]: E0226 16:03:50.503326 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9776d976-a39c-4d6a-ac1b-d2a03164c21e" 
containerName="dnsmasq-dns" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.503333 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9776d976-a39c-4d6a-ac1b-d2a03164c21e" containerName="dnsmasq-dns" Feb 26 16:03:50 crc kubenswrapper[5033]: E0226 16:03:50.503348 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9776d976-a39c-4d6a-ac1b-d2a03164c21e" containerName="init" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.503353 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9776d976-a39c-4d6a-ac1b-d2a03164c21e" containerName="init" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.503541 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9776d976-a39c-4d6a-ac1b-d2a03164c21e" containerName="dnsmasq-dns" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.503562 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf479687-fa77-4c8b-95ee-23b30c858b35" containerName="cinder-db-sync" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.504695 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.506632 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.506868 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-794tw" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.506996 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.511013 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.522848 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.567222 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cc8b5d5c5-txvs4"] Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.570855 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.582019 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc8b5d5c5-txvs4"] Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.611215 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.611295 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmvwg\" (UniqueName: \"kubernetes.io/projected/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-kube-api-access-hmvwg\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.611341 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.611403 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.611436 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.611466 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.676247 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.686249 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.689789 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.705554 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.712813 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-svc\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.712863 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc 
kubenswrapper[5033]: I0226 16:03:50.712887 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.712941 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.712981 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.713011 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.713042 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.713060 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.713099 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.713120 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-config\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.713147 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fgmf\" (UniqueName: \"kubernetes.io/projected/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-kube-api-access-4fgmf\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.713164 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmvwg\" (UniqueName: \"kubernetes.io/projected/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-kube-api-access-hmvwg\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.715331 5033 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.721476 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.725941 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.728828 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.734727 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.746695 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmvwg\" (UniqueName: \"kubernetes.io/projected/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-kube-api-access-hmvwg\") pod \"cinder-scheduler-0\" (UID: 
\"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.814734 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.814777 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-scripts\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.814806 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.814859 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.814881 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-config\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: 
I0226 16:03:50.814916 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fgmf\" (UniqueName: \"kubernetes.io/projected/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-kube-api-access-4fgmf\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.814961 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-svc\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.814987 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.815016 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973e0871-cf11-4259-9af4-05208084bda2-logs\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.815035 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz29s\" (UniqueName: \"kubernetes.io/projected/973e0871-cf11-4259-9af4-05208084bda2-kube-api-access-xz29s\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.815058 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.815081 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/973e0871-cf11-4259-9af4-05208084bda2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.815116 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data-custom\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.815920 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.816231 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.816483 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.816902 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-config\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.817771 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-svc\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.840382 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.842804 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fgmf\" (UniqueName: \"kubernetes.io/projected/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-kube-api-access-4fgmf\") pod \"dnsmasq-dns-5cc8b5d5c5-txvs4\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.859549 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.918494 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data-custom\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.918585 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.918610 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-scripts\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.918671 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.918842 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973e0871-cf11-4259-9af4-05208084bda2-logs\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.918873 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz29s\" (UniqueName: \"kubernetes.io/projected/973e0871-cf11-4259-9af4-05208084bda2-kube-api-access-xz29s\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.918909 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/973e0871-cf11-4259-9af4-05208084bda2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.927891 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data-custom\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.933171 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.934077 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/973e0871-cf11-4259-9af4-05208084bda2-logs\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.934121 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/973e0871-cf11-4259-9af4-05208084bda2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.935333 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.935722 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.948076 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-scripts\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:50 crc kubenswrapper[5033]: I0226 16:03:50.983028 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz29s\" (UniqueName: \"kubernetes.io/projected/973e0871-cf11-4259-9af4-05208084bda2-kube-api-access-xz29s\") pod \"cinder-api-0\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " pod="openstack/cinder-api-0" Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.010141 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79658967bb-s277d"] Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.010539 5033 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79658967bb-s277d" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api-log" containerID="cri-o://f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351" gracePeriod=30 Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.011081 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79658967bb-s277d" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api" containerID="cri-o://a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4" gracePeriod=30 Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.029169 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.086549 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-79658967bb-s277d" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.087066 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-79658967bb-s277d" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.184254 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79658967bb-s277d" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.184956 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79658967bb-s277d" 
podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.644029 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9776d976-a39c-4d6a-ac1b-d2a03164c21e" path="/var/lib/kubelet/pods/9776d976-a39c-4d6a-ac1b-d2a03164c21e/volumes" Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.682660 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.840688 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc8b5d5c5-txvs4"] Feb 26 16:03:51 crc kubenswrapper[5033]: I0226 16:03:51.864940 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:03:52 crc kubenswrapper[5033]: I0226 16:03:52.227883 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:03:52 crc kubenswrapper[5033]: I0226 16:03:52.293796 5033 generic.go:334] "Generic (PLEG): container finished" podID="fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" containerID="d14f201c3b16ca145511df15200634b0e37f62d2255ad7c231401f280430b8ff" exitCode=0 Feb 26 16:03:52 crc kubenswrapper[5033]: I0226 16:03:52.294117 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" event={"ID":"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8","Type":"ContainerDied","Data":"d14f201c3b16ca145511df15200634b0e37f62d2255ad7c231401f280430b8ff"} Feb 26 16:03:52 crc kubenswrapper[5033]: I0226 16:03:52.294160 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" event={"ID":"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8","Type":"ContainerStarted","Data":"0e2e27e4727a8f7148f1bf9ad60a6a7d0afcb17a681874bf1d73bf38a91f3e8e"} Feb 26 16:03:52 crc kubenswrapper[5033]: I0226 16:03:52.315830 
5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0","Type":"ContainerStarted","Data":"ddd88155fc5bcf454cc0588f82a7b837678f5ba4e22ca17821b2b1f703c7f0c8"} Feb 26 16:03:52 crc kubenswrapper[5033]: I0226 16:03:52.317772 5033 generic.go:334] "Generic (PLEG): container finished" podID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerID="f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351" exitCode=143 Feb 26 16:03:52 crc kubenswrapper[5033]: I0226 16:03:52.317829 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79658967bb-s277d" event={"ID":"c2821d0f-99b6-4333-a90b-d82302ca28e4","Type":"ContainerDied","Data":"f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351"} Feb 26 16:03:52 crc kubenswrapper[5033]: I0226 16:03:52.318742 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"973e0871-cf11-4259-9af4-05208084bda2","Type":"ContainerStarted","Data":"229beeaf8f0454e68c76732cc1f104360dd015d35069d0c2147a12a4b6ec931a"} Feb 26 16:03:53 crc kubenswrapper[5033]: I0226 16:03:53.334322 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" event={"ID":"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8","Type":"ContainerStarted","Data":"8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c"} Feb 26 16:03:53 crc kubenswrapper[5033]: I0226 16:03:53.334751 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:53 crc kubenswrapper[5033]: I0226 16:03:53.348566 5033 generic.go:334] "Generic (PLEG): container finished" podID="6410a07e-efb2-4773-8007-688038a9898e" containerID="88846ecdfcf0ca846f3ca53a24d1fe2b4ad8f940ade0f867c6bb6c964a133abd" exitCode=0 Feb 26 16:03:53 crc kubenswrapper[5033]: I0226 16:03:53.348647 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-9bknn" event={"ID":"6410a07e-efb2-4773-8007-688038a9898e","Type":"ContainerDied","Data":"88846ecdfcf0ca846f3ca53a24d1fe2b4ad8f940ade0f867c6bb6c964a133abd"} Feb 26 16:03:53 crc kubenswrapper[5033]: I0226 16:03:53.358167 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"973e0871-cf11-4259-9af4-05208084bda2","Type":"ContainerStarted","Data":"e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348"} Feb 26 16:03:53 crc kubenswrapper[5033]: I0226 16:03:53.371876 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" podStartSLOduration=3.371856723 podStartE2EDuration="3.371856723s" podCreationTimestamp="2026-02-26 16:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:53.371129326 +0000 UTC m=+1360.583727976" watchObservedRunningTime="2026-02-26 16:03:53.371856723 +0000 UTC m=+1360.584455373" Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.407672 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"973e0871-cf11-4259-9af4-05208084bda2","Type":"ContainerStarted","Data":"f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084"} Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.408595 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.418989 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0","Type":"ContainerStarted","Data":"0928a95859008e017abdb4f3911ce683acef72bfa6b55cb0c3fb61218d18e121"} Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.419072 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0","Type":"ContainerStarted","Data":"ae05a73127e8a5569d83520289b9d950307213560a9017a627b1bb097b267616"} Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.442582 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.44256044 podStartE2EDuration="4.44256044s" podCreationTimestamp="2026-02-26 16:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:54.438992981 +0000 UTC m=+1361.651591641" watchObservedRunningTime="2026-02-26 16:03:54.44256044 +0000 UTC m=+1361.655159090" Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.479924 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.504192226 podStartE2EDuration="4.479899443s" podCreationTimestamp="2026-02-26 16:03:50 +0000 UTC" firstStartedPulling="2026-02-26 16:03:51.701120848 +0000 UTC m=+1358.913719498" lastFinishedPulling="2026-02-26 16:03:52.676828065 +0000 UTC m=+1359.889426715" observedRunningTime="2026-02-26 16:03:54.468123478 +0000 UTC m=+1361.680722128" watchObservedRunningTime="2026-02-26 16:03:54.479899443 +0000 UTC m=+1361.692498093" Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.768458 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.815914 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9bknn" Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.937407 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-config\") pod \"6410a07e-efb2-4773-8007-688038a9898e\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.937529 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-combined-ca-bundle\") pod \"6410a07e-efb2-4773-8007-688038a9898e\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.937604 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5jhv\" (UniqueName: \"kubernetes.io/projected/6410a07e-efb2-4773-8007-688038a9898e-kube-api-access-s5jhv\") pod \"6410a07e-efb2-4773-8007-688038a9898e\" (UID: \"6410a07e-efb2-4773-8007-688038a9898e\") " Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.963139 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6410a07e-efb2-4773-8007-688038a9898e-kube-api-access-s5jhv" (OuterVolumeSpecName: "kube-api-access-s5jhv") pod "6410a07e-efb2-4773-8007-688038a9898e" (UID: "6410a07e-efb2-4773-8007-688038a9898e"). InnerVolumeSpecName "kube-api-access-s5jhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.974950 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6410a07e-efb2-4773-8007-688038a9898e" (UID: "6410a07e-efb2-4773-8007-688038a9898e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:54 crc kubenswrapper[5033]: I0226 16:03:54.975574 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-config" (OuterVolumeSpecName: "config") pod "6410a07e-efb2-4773-8007-688038a9898e" (UID: "6410a07e-efb2-4773-8007-688038a9898e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.040865 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.040895 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6410a07e-efb2-4773-8007-688038a9898e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.040905 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5jhv\" (UniqueName: \"kubernetes.io/projected/6410a07e-efb2-4773-8007-688038a9898e-kube-api-access-s5jhv\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.288403 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 26 16:03:55 crc kubenswrapper[5033]: E0226 16:03:55.288781 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6410a07e-efb2-4773-8007-688038a9898e" containerName="neutron-db-sync" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.288797 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6410a07e-efb2-4773-8007-688038a9898e" containerName="neutron-db-sync" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.288949 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6410a07e-efb2-4773-8007-688038a9898e" 
containerName="neutron-db-sync" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.289611 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.291582 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.294468 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.295016 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7jqxx" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.297986 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.345587 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config-secret\") pod \"openstackclient\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.345727 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config\") pod \"openstackclient\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.345812 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.345848 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwqk6\" (UniqueName: \"kubernetes.io/projected/c0f0a198-dc99-4384-8ab0-b978bf9275b5-kube-api-access-bwqk6\") pod \"openstackclient\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.431431 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9bknn" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.433413 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9bknn" event={"ID":"6410a07e-efb2-4773-8007-688038a9898e","Type":"ContainerDied","Data":"201156ef11bb9c2bf2914374469531624d7b07a0c48cd39edb333f22dc3ea498"} Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.433477 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="201156ef11bb9c2bf2914374469531624d7b07a0c48cd39edb333f22dc3ea498" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.447311 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config\") pod \"openstackclient\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.447367 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.447386 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwqk6\" (UniqueName: \"kubernetes.io/projected/c0f0a198-dc99-4384-8ab0-b978bf9275b5-kube-api-access-bwqk6\") pod \"openstackclient\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.447497 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config-secret\") pod \"openstackclient\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.448983 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config\") pod \"openstackclient\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.452738 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config-secret\") pod \"openstackclient\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.462984 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.480883 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwqk6\" (UniqueName: 
\"kubernetes.io/projected/c0f0a198-dc99-4384-8ab0-b978bf9275b5-kube-api-access-bwqk6\") pod \"openstackclient\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.572420 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc8b5d5c5-txvs4"] Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.572663 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" podUID="fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" containerName="dnsmasq-dns" containerID="cri-o://8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c" gracePeriod=10 Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.605582 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.654576 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dnr6c"] Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.657231 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dnr6c"] Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.657875 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.714515 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-69ff95858d-bj8jj"] Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.716507 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.718509 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.718668 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.718868 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.718967 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gbh42" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.750893 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69ff95858d-bj8jj"] Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.755318 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-ovndb-tls-certs\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.755369 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-svc\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.755446 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-config\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: 
\"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.755473 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.755524 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-combined-ca-bundle\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.755589 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-config\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.755627 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.755660 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.755696 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v7gz\" (UniqueName: \"kubernetes.io/projected/e4360f1d-93a6-47c3-84ba-96c8d202f420-kube-api-access-2v7gz\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.755720 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p79l5\" (UniqueName: \"kubernetes.io/projected/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-kube-api-access-p79l5\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.755816 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-httpd-config\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.841147 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.857557 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-ovndb-tls-certs\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.857593 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-svc\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.857642 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-config\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.857663 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.857696 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-combined-ca-bundle\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.857749 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-config\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.857782 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.857808 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.857834 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v7gz\" (UniqueName: \"kubernetes.io/projected/e4360f1d-93a6-47c3-84ba-96c8d202f420-kube-api-access-2v7gz\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.857857 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p79l5\" (UniqueName: \"kubernetes.io/projected/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-kube-api-access-p79l5\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.857921 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-httpd-config\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.861453 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-config\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.862850 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-svc\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.863110 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-httpd-config\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.863245 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.863521 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.863731 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" 
(UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.865057 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-ovndb-tls-certs\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.880164 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-combined-ca-bundle\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.884023 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v7gz\" (UniqueName: \"kubernetes.io/projected/e4360f1d-93a6-47c3-84ba-96c8d202f420-kube-api-access-2v7gz\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.884595 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-config\") pod \"neutron-69ff95858d-bj8jj\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.885645 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p79l5\" (UniqueName: \"kubernetes.io/projected/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-kube-api-access-p79l5\") pod \"dnsmasq-dns-6578955fd5-dnr6c\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 
crc kubenswrapper[5033]: I0226 16:03:55.974945 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:55 crc kubenswrapper[5033]: I0226 16:03:55.991018 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.228696 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79658967bb-s277d" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.247679 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.273730 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fgmf\" (UniqueName: \"kubernetes.io/projected/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-kube-api-access-4fgmf\") pod \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.273903 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-config\") pod \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.273935 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-sb\") pod \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " Feb 
26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.273967 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-svc\") pod \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.274024 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-nb\") pod \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.274067 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-swift-storage-0\") pod \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\" (UID: \"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8\") " Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.275971 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79658967bb-s277d" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.308802 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-kube-api-access-4fgmf" (OuterVolumeSpecName: "kube-api-access-4fgmf") pod "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" (UID: "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8"). InnerVolumeSpecName "kube-api-access-4fgmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.363705 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.370287 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" (UID: "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.371846 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-config" (OuterVolumeSpecName: "config") pod "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" (UID: "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.375629 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fgmf\" (UniqueName: \"kubernetes.io/projected/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-kube-api-access-4fgmf\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.375649 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.375659 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.392895 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" (UID: "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.409138 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" (UID: "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.445803 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" (UID: "fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.460818 5033 generic.go:334] "Generic (PLEG): container finished" podID="fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" containerID="8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c" exitCode=0 Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.460875 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" event={"ID":"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8","Type":"ContainerDied","Data":"8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c"} Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.460901 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" event={"ID":"fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8","Type":"ContainerDied","Data":"0e2e27e4727a8f7148f1bf9ad60a6a7d0afcb17a681874bf1d73bf38a91f3e8e"} Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.460917 5033 scope.go:117] "RemoveContainer" containerID="8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.461025 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc8b5d5c5-txvs4" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.465662 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="973e0871-cf11-4259-9af4-05208084bda2" containerName="cinder-api-log" containerID="cri-o://e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348" gracePeriod=30 Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.465781 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c0f0a198-dc99-4384-8ab0-b978bf9275b5","Type":"ContainerStarted","Data":"92f6158f5168a5a03532d3c852b5632b8d255f7678802be2f94addf3b88a32d4"} Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.466250 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="973e0871-cf11-4259-9af4-05208084bda2" containerName="cinder-api" containerID="cri-o://f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084" gracePeriod=30 Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.477417 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.477447 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.477456 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.509841 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5cc8b5d5c5-txvs4"] Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.511437 5033 scope.go:117] "RemoveContainer" containerID="d14f201c3b16ca145511df15200634b0e37f62d2255ad7c231401f280430b8ff" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.523814 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cc8b5d5c5-txvs4"] Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.568605 5033 scope.go:117] "RemoveContainer" containerID="8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c" Feb 26 16:03:56 crc kubenswrapper[5033]: E0226 16:03:56.569045 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c\": container with ID starting with 8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c not found: ID does not exist" containerID="8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.569092 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c"} err="failed to get container status \"8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c\": rpc error: code = NotFound desc = could not find container \"8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c\": container with ID starting with 8c8c403219fa5d77064cfff5dc84a0d3c969f5bac0961db8f65174f657dd261c not found: ID does not exist" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.569120 5033 scope.go:117] "RemoveContainer" containerID="d14f201c3b16ca145511df15200634b0e37f62d2255ad7c231401f280430b8ff" Feb 26 16:03:56 crc kubenswrapper[5033]: E0226 16:03:56.569371 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d14f201c3b16ca145511df15200634b0e37f62d2255ad7c231401f280430b8ff\": container with ID starting with d14f201c3b16ca145511df15200634b0e37f62d2255ad7c231401f280430b8ff not found: ID does not exist" containerID="d14f201c3b16ca145511df15200634b0e37f62d2255ad7c231401f280430b8ff" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.569393 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14f201c3b16ca145511df15200634b0e37f62d2255ad7c231401f280430b8ff"} err="failed to get container status \"d14f201c3b16ca145511df15200634b0e37f62d2255ad7c231401f280430b8ff\": rpc error: code = NotFound desc = could not find container \"d14f201c3b16ca145511df15200634b0e37f62d2255ad7c231401f280430b8ff\": container with ID starting with d14f201c3b16ca145511df15200634b0e37f62d2255ad7c231401f280430b8ff not found: ID does not exist" Feb 26 16:03:56 crc kubenswrapper[5033]: I0226 16:03:56.591435 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dnr6c"] Feb 26 16:03:56 crc kubenswrapper[5033]: W0226 16:03:56.604080 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22ffc2b4_a76b_48a9_81ff_1afa3bf72bb0.slice/crio-95c3ba818d7cbb9c7d0b98c3e2f660f05d8b6d0e245a7a100ee13d8692b98e84 WatchSource:0}: Error finding container 95c3ba818d7cbb9c7d0b98c3e2f660f05d8b6d0e245a7a100ee13d8692b98e84: Status 404 returned error can't find the container with id 95c3ba818d7cbb9c7d0b98c3e2f660f05d8b6d0e245a7a100ee13d8692b98e84 Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.186901 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69ff95858d-bj8jj"] Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.368410 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.404208 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data-custom\") pod \"973e0871-cf11-4259-9af4-05208084bda2\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.404280 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-scripts\") pod \"973e0871-cf11-4259-9af4-05208084bda2\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.404319 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/973e0871-cf11-4259-9af4-05208084bda2-etc-machine-id\") pod \"973e0871-cf11-4259-9af4-05208084bda2\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.404416 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-combined-ca-bundle\") pod \"973e0871-cf11-4259-9af4-05208084bda2\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.404542 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973e0871-cf11-4259-9af4-05208084bda2-logs\") pod \"973e0871-cf11-4259-9af4-05208084bda2\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.404571 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data\") pod \"973e0871-cf11-4259-9af4-05208084bda2\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.404598 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz29s\" (UniqueName: \"kubernetes.io/projected/973e0871-cf11-4259-9af4-05208084bda2-kube-api-access-xz29s\") pod \"973e0871-cf11-4259-9af4-05208084bda2\" (UID: \"973e0871-cf11-4259-9af4-05208084bda2\") " Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.405839 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/973e0871-cf11-4259-9af4-05208084bda2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "973e0871-cf11-4259-9af4-05208084bda2" (UID: "973e0871-cf11-4259-9af4-05208084bda2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.406445 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/973e0871-cf11-4259-9af4-05208084bda2-logs" (OuterVolumeSpecName: "logs") pod "973e0871-cf11-4259-9af4-05208084bda2" (UID: "973e0871-cf11-4259-9af4-05208084bda2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.416086 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "973e0871-cf11-4259-9af4-05208084bda2" (UID: "973e0871-cf11-4259-9af4-05208084bda2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.416133 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973e0871-cf11-4259-9af4-05208084bda2-kube-api-access-xz29s" (OuterVolumeSpecName: "kube-api-access-xz29s") pod "973e0871-cf11-4259-9af4-05208084bda2" (UID: "973e0871-cf11-4259-9af4-05208084bda2"). InnerVolumeSpecName "kube-api-access-xz29s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.416433 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-scripts" (OuterVolumeSpecName: "scripts") pod "973e0871-cf11-4259-9af4-05208084bda2" (UID: "973e0871-cf11-4259-9af4-05208084bda2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.436722 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "973e0871-cf11-4259-9af4-05208084bda2" (UID: "973e0871-cf11-4259-9af4-05208084bda2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.484161 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data" (OuterVolumeSpecName: "config-data") pod "973e0871-cf11-4259-9af4-05208084bda2" (UID: "973e0871-cf11-4259-9af4-05208084bda2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.489345 5033 generic.go:334] "Generic (PLEG): container finished" podID="22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" containerID="d10243c2ce8cfd3b74cc4ecb2619ab97671f9b148f3e1a2df416a2662cd1934e" exitCode=0 Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.489831 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" event={"ID":"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0","Type":"ContainerDied","Data":"d10243c2ce8cfd3b74cc4ecb2619ab97671f9b148f3e1a2df416a2662cd1934e"} Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.490226 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" event={"ID":"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0","Type":"ContainerStarted","Data":"95c3ba818d7cbb9c7d0b98c3e2f660f05d8b6d0e245a7a100ee13d8692b98e84"} Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.500110 5033 generic.go:334] "Generic (PLEG): container finished" podID="973e0871-cf11-4259-9af4-05208084bda2" containerID="f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084" exitCode=0 Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.500139 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.500156 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"973e0871-cf11-4259-9af4-05208084bda2","Type":"ContainerDied","Data":"f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084"} Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.500179 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"973e0871-cf11-4259-9af4-05208084bda2","Type":"ContainerDied","Data":"e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348"} Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.500194 5033 scope.go:117] "RemoveContainer" containerID="f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.500144 5033 generic.go:334] "Generic (PLEG): container finished" podID="973e0871-cf11-4259-9af4-05208084bda2" containerID="e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348" exitCode=143 Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.500736 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"973e0871-cf11-4259-9af4-05208084bda2","Type":"ContainerDied","Data":"229beeaf8f0454e68c76732cc1f104360dd015d35069d0c2147a12a4b6ec931a"} Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.506643 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/973e0871-cf11-4259-9af4-05208084bda2-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.506670 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.506681 5033 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-xz29s\" (UniqueName: \"kubernetes.io/projected/973e0871-cf11-4259-9af4-05208084bda2-kube-api-access-xz29s\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.506690 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.506698 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.506708 5033 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/973e0871-cf11-4259-9af4-05208084bda2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.506716 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973e0871-cf11-4259-9af4-05208084bda2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.523401 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69ff95858d-bj8jj" event={"ID":"e4360f1d-93a6-47c3-84ba-96c8d202f420","Type":"ContainerStarted","Data":"80f2f5f965ce1fd1ed8d986dc596ed2ef004d9abb11a22323605038c8eda0bf4"} Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.591426 5033 scope.go:117] "RemoveContainer" containerID="e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.591583 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.598568 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-api-0"] Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.615133 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:03:57 crc kubenswrapper[5033]: E0226 16:03:57.615533 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973e0871-cf11-4259-9af4-05208084bda2" containerName="cinder-api" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.615555 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="973e0871-cf11-4259-9af4-05208084bda2" containerName="cinder-api" Feb 26 16:03:57 crc kubenswrapper[5033]: E0226 16:03:57.615570 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973e0871-cf11-4259-9af4-05208084bda2" containerName="cinder-api-log" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.615576 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="973e0871-cf11-4259-9af4-05208084bda2" containerName="cinder-api-log" Feb 26 16:03:57 crc kubenswrapper[5033]: E0226 16:03:57.615588 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" containerName="dnsmasq-dns" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.615594 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" containerName="dnsmasq-dns" Feb 26 16:03:57 crc kubenswrapper[5033]: E0226 16:03:57.615606 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" containerName="init" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.615612 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" containerName="init" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.615810 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="973e0871-cf11-4259-9af4-05208084bda2" containerName="cinder-api-log" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.615831 5033 
memory_manager.go:354] "RemoveStaleState removing state" podUID="973e0871-cf11-4259-9af4-05208084bda2" containerName="cinder-api" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.615849 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" containerName="dnsmasq-dns" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.616730 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.621282 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.621662 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.622592 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.645082 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973e0871-cf11-4259-9af4-05208084bda2" path="/var/lib/kubelet/pods/973e0871-cf11-4259-9af4-05208084bda2/volumes" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.645745 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8" path="/var/lib/kubelet/pods/fc273a06-6ad8-4ea3-b38b-2ebf93f3d2f8/volumes" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.649125 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.659424 5033 scope.go:117] "RemoveContainer" containerID="f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084" Feb 26 16:03:57 crc kubenswrapper[5033]: E0226 16:03:57.660183 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084\": container with ID starting with f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084 not found: ID does not exist" containerID="f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.660222 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084"} err="failed to get container status \"f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084\": rpc error: code = NotFound desc = could not find container \"f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084\": container with ID starting with f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084 not found: ID does not exist" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.660250 5033 scope.go:117] "RemoveContainer" containerID="e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348" Feb 26 16:03:57 crc kubenswrapper[5033]: E0226 16:03:57.660700 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348\": container with ID starting with e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348 not found: ID does not exist" containerID="e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.660724 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348"} err="failed to get container status \"e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348\": rpc error: code = NotFound desc = could not find container \"e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348\": 
container with ID starting with e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348 not found: ID does not exist" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.660740 5033 scope.go:117] "RemoveContainer" containerID="f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.661765 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084"} err="failed to get container status \"f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084\": rpc error: code = NotFound desc = could not find container \"f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084\": container with ID starting with f1af75412a9140e0ba58afe4b531dbdf17bf751c908d7d3c55c218aaa86d5084 not found: ID does not exist" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.661806 5033 scope.go:117] "RemoveContainer" containerID="e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.663490 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348"} err="failed to get container status \"e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348\": rpc error: code = NotFound desc = could not find container \"e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348\": container with ID starting with e2682766c41eb33851174b3a2fbc06e2979f6defca24e2a54297e12738152348 not found: ID does not exist" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.714223 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data\") pod \"cinder-api-0\" (UID: 
\"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.714589 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-logs\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.714611 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.714640 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.714661 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.714718 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 
16:03:57.714775 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data-custom\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.714824 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5zjr\" (UniqueName: \"kubernetes.io/projected/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-kube-api-access-s5zjr\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.714873 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-scripts\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.717726 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79658967bb-s277d" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:36698->10.217.0.161:9311: read: connection reset by peer" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.718062 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79658967bb-s277d" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:36712->10.217.0.161:9311: read: connection reset by peer" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.818818 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.818872 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-logs\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.818889 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.818910 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.818931 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.818973 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") 
" pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.819013 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data-custom\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.819048 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zjr\" (UniqueName: \"kubernetes.io/projected/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-kube-api-access-s5zjr\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.819086 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-scripts\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.819397 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.820864 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-logs\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.823764 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.823878 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-scripts\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.828101 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data-custom\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.828850 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.830867 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.839219 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.850540 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5zjr\" (UniqueName: \"kubernetes.io/projected/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-kube-api-access-s5zjr\") pod \"cinder-api-0\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") " pod="openstack/cinder-api-0" Feb 26 16:03:57 crc kubenswrapper[5033]: I0226 16:03:57.951753 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.268804 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d9577fb9-j9mmc"] Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.270665 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.272558 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.273952 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.307186 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d9577fb9-j9mmc"] Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.315625 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.330671 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-ovndb-tls-certs\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.330728 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-combined-ca-bundle\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.330801 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-internal-tls-certs\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.330837 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2vrn\" (UniqueName: \"kubernetes.io/projected/db7073ea-074d-41d7-b114-9d6b8e02c3a0-kube-api-access-m2vrn\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.330861 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-public-tls-certs\") pod \"neutron-6d9577fb9-j9mmc\" 
(UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.330885 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-httpd-config\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.330908 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-config\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.432621 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8tj6\" (UniqueName: \"kubernetes.io/projected/c2821d0f-99b6-4333-a90b-d82302ca28e4-kube-api-access-q8tj6\") pod \"c2821d0f-99b6-4333-a90b-d82302ca28e4\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.432695 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data-custom\") pod \"c2821d0f-99b6-4333-a90b-d82302ca28e4\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.432766 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data\") pod \"c2821d0f-99b6-4333-a90b-d82302ca28e4\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.432792 5033 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2821d0f-99b6-4333-a90b-d82302ca28e4-logs\") pod \"c2821d0f-99b6-4333-a90b-d82302ca28e4\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.432823 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-combined-ca-bundle\") pod \"c2821d0f-99b6-4333-a90b-d82302ca28e4\" (UID: \"c2821d0f-99b6-4333-a90b-d82302ca28e4\") " Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.433069 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-internal-tls-certs\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.433112 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2vrn\" (UniqueName: \"kubernetes.io/projected/db7073ea-074d-41d7-b114-9d6b8e02c3a0-kube-api-access-m2vrn\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.433133 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-public-tls-certs\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.433151 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-httpd-config\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.433168 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-config\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.433219 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-ovndb-tls-certs\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.433249 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-combined-ca-bundle\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.438590 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-combined-ca-bundle\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.443582 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2821d0f-99b6-4333-a90b-d82302ca28e4-logs" (OuterVolumeSpecName: "logs") pod "c2821d0f-99b6-4333-a90b-d82302ca28e4" (UID: 
"c2821d0f-99b6-4333-a90b-d82302ca28e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.448150 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-internal-tls-certs\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.451835 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-ovndb-tls-certs\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.454005 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c2821d0f-99b6-4333-a90b-d82302ca28e4" (UID: "c2821d0f-99b6-4333-a90b-d82302ca28e4"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.454674 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-httpd-config\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.456739 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-public-tls-certs\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.468181 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2vrn\" (UniqueName: \"kubernetes.io/projected/db7073ea-074d-41d7-b114-9d6b8e02c3a0-kube-api-access-m2vrn\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.468594 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2821d0f-99b6-4333-a90b-d82302ca28e4-kube-api-access-q8tj6" (OuterVolumeSpecName: "kube-api-access-q8tj6") pod "c2821d0f-99b6-4333-a90b-d82302ca28e4" (UID: "c2821d0f-99b6-4333-a90b-d82302ca28e4"). InnerVolumeSpecName "kube-api-access-q8tj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.468762 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-config\") pod \"neutron-6d9577fb9-j9mmc\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.495615 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2821d0f-99b6-4333-a90b-d82302ca28e4" (UID: "c2821d0f-99b6-4333-a90b-d82302ca28e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.539447 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2821d0f-99b6-4333-a90b-d82302ca28e4-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.539482 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.539493 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8tj6\" (UniqueName: \"kubernetes.io/projected/c2821d0f-99b6-4333-a90b-d82302ca28e4-kube-api-access-q8tj6\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.539509 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.544323 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69ff95858d-bj8jj" event={"ID":"e4360f1d-93a6-47c3-84ba-96c8d202f420","Type":"ContainerStarted","Data":"a1d35abe934158b73fb8b32736a06c60904c589aa9b0dcc2008ad42cec09970b"} Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.544404 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69ff95858d-bj8jj" event={"ID":"e4360f1d-93a6-47c3-84ba-96c8d202f420","Type":"ContainerStarted","Data":"b9845a8a0279ae511f3247bf2d4e4c405ffc83d62da247751f9d4e639942c8d8"} Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.545950 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.549718 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data" (OuterVolumeSpecName: "config-data") pod "c2821d0f-99b6-4333-a90b-d82302ca28e4" (UID: "c2821d0f-99b6-4333-a90b-d82302ca28e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.551940 5033 generic.go:334] "Generic (PLEG): container finished" podID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerID="a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4" exitCode=0 Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.552140 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79658967bb-s277d" event={"ID":"c2821d0f-99b6-4333-a90b-d82302ca28e4","Type":"ContainerDied","Data":"a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4"} Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.552843 5033 scope.go:117] "RemoveContainer" containerID="a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.553735 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79658967bb-s277d" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.552398 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79658967bb-s277d" event={"ID":"c2821d0f-99b6-4333-a90b-d82302ca28e4","Type":"ContainerDied","Data":"f9341db30e3ff3d81bf340b8e9df276c0e11b157b66e3d9893775f68cfb317c8"} Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.557900 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" event={"ID":"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0","Type":"ContainerStarted","Data":"bac4d373cb8ce5d5ed040fd30c9e2f5715f564e4c50b091436c89ba2333feaac"} Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.558345 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.575057 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-69ff95858d-bj8jj" 
podStartSLOduration=3.575039203 podStartE2EDuration="3.575039203s" podCreationTimestamp="2026-02-26 16:03:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:58.565282861 +0000 UTC m=+1365.777881531" watchObservedRunningTime="2026-02-26 16:03:58.575039203 +0000 UTC m=+1365.787637853" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.575643 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.605682 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" podStartSLOduration=3.605660108 podStartE2EDuration="3.605660108s" podCreationTimestamp="2026-02-26 16:03:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:58.594598322 +0000 UTC m=+1365.807196972" watchObservedRunningTime="2026-02-26 16:03:58.605660108 +0000 UTC m=+1365.818258758" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.608518 5033 scope.go:117] "RemoveContainer" containerID="f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.613235 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.640864 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2821d0f-99b6-4333-a90b-d82302ca28e4-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.654608 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79658967bb-s277d"] Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.665453 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-79658967bb-s277d"] Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.681067 5033 scope.go:117] "RemoveContainer" containerID="a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4" Feb 26 16:03:58 crc kubenswrapper[5033]: E0226 16:03:58.681917 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4\": container with ID starting with a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4 not found: ID does not exist" containerID="a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.681960 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4"} err="failed to get container status \"a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4\": rpc error: code = NotFound desc = could not find container \"a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4\": container with ID starting with a123f174d4815ef650bd50647c34f7bc1abfa27b14e505b9948e3741cc6357a4 not found: ID does not exist" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.681984 5033 scope.go:117] "RemoveContainer" 
containerID="f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351" Feb 26 16:03:58 crc kubenswrapper[5033]: E0226 16:03:58.682486 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351\": container with ID starting with f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351 not found: ID does not exist" containerID="f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351" Feb 26 16:03:58 crc kubenswrapper[5033]: I0226 16:03:58.682515 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351"} err="failed to get container status \"f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351\": rpc error: code = NotFound desc = could not find container \"f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351\": container with ID starting with f880ecc598b1cc5ea78a4bf6231fb96925b71886483b552afe6512142d8f5351 not found: ID does not exist" Feb 26 16:03:59 crc kubenswrapper[5033]: I0226 16:03:59.291894 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d9577fb9-j9mmc"] Feb 26 16:03:59 crc kubenswrapper[5033]: I0226 16:03:59.579842 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ab4047a-07ab-4c15-88eb-7dbefe5bea78","Type":"ContainerStarted","Data":"aa1ee9867435cb326cccdc9dafbd435a111a0e341d91a8b67afb8b92f8d1ca85"} Feb 26 16:03:59 crc kubenswrapper[5033]: I0226 16:03:59.580085 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ab4047a-07ab-4c15-88eb-7dbefe5bea78","Type":"ContainerStarted","Data":"0be02a6fc7a422b8cd78e3b3ae0f7f1d02a6ae81055223b5327c59826ff6849f"} Feb 26 16:03:59 crc kubenswrapper[5033]: I0226 16:03:59.587530 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-6d9577fb9-j9mmc" event={"ID":"db7073ea-074d-41d7-b114-9d6b8e02c3a0","Type":"ContainerStarted","Data":"50dbdb11bbdf4960d6f832423d69852e515d76f298f32272a7e3daba2e425277"} Feb 26 16:03:59 crc kubenswrapper[5033]: I0226 16:03:59.655494 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" path="/var/lib/kubelet/pods/c2821d0f-99b6-4333-a90b-d82302ca28e4/volumes" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.134501 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535364-6vq2m"] Feb 26 16:04:00 crc kubenswrapper[5033]: E0226 16:04:00.135471 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api-log" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.135541 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api-log" Feb 26 16:04:00 crc kubenswrapper[5033]: E0226 16:04:00.135640 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.135693 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.135917 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.135988 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2821d0f-99b6-4333-a90b-d82302ca28e4" containerName="barbican-api-log" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.137816 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535364-6vq2m" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.139843 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.140443 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.145624 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.146678 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535364-6vq2m"] Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.177549 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnb57\" (UniqueName: \"kubernetes.io/projected/46f5bf1f-dc74-40b4-beb7-4fbe927804c0-kube-api-access-qnb57\") pod \"auto-csr-approver-29535364-6vq2m\" (UID: \"46f5bf1f-dc74-40b4-beb7-4fbe927804c0\") " pod="openshift-infra/auto-csr-approver-29535364-6vq2m" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.279490 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnb57\" (UniqueName: \"kubernetes.io/projected/46f5bf1f-dc74-40b4-beb7-4fbe927804c0-kube-api-access-qnb57\") pod \"auto-csr-approver-29535364-6vq2m\" (UID: \"46f5bf1f-dc74-40b4-beb7-4fbe927804c0\") " pod="openshift-infra/auto-csr-approver-29535364-6vq2m" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.303702 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnb57\" (UniqueName: \"kubernetes.io/projected/46f5bf1f-dc74-40b4-beb7-4fbe927804c0-kube-api-access-qnb57\") pod \"auto-csr-approver-29535364-6vq2m\" (UID: \"46f5bf1f-dc74-40b4-beb7-4fbe927804c0\") " 
pod="openshift-infra/auto-csr-approver-29535364-6vq2m" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.469777 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535364-6vq2m" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.636607 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ab4047a-07ab-4c15-88eb-7dbefe5bea78","Type":"ContainerStarted","Data":"063b18146099b5100ca56d54d72414c86d85a242467f6339a74297d196991539"} Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.646062 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d9577fb9-j9mmc" event={"ID":"db7073ea-074d-41d7-b114-9d6b8e02c3a0","Type":"ContainerStarted","Data":"fb11d0b59395826f637bc346a58329527565a9575674426c6ee93c2cfdc8dac0"} Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.646112 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d9577fb9-j9mmc" event={"ID":"db7073ea-074d-41d7-b114-9d6b8e02c3a0","Type":"ContainerStarted","Data":"6f4850a9a8f812df409a9a6fb3e745a63952c033253853ae16812e130cbe9138"} Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.646141 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:04:00 crc kubenswrapper[5033]: I0226 16:04:00.679704 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d9577fb9-j9mmc" podStartSLOduration=2.67968596 podStartE2EDuration="2.67968596s" podCreationTimestamp="2026-02-26 16:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:00.672248673 +0000 UTC m=+1367.884847333" watchObservedRunningTime="2026-02-26 16:04:00.67968596 +0000 UTC m=+1367.892284610" Feb 26 16:04:01 crc kubenswrapper[5033]: I0226 16:04:01.009571 5033 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535364-6vq2m"] Feb 26 16:04:01 crc kubenswrapper[5033]: I0226 16:04:01.192511 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 26 16:04:01 crc kubenswrapper[5033]: I0226 16:04:01.234234 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:04:01 crc kubenswrapper[5033]: I0226 16:04:01.657247 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" containerName="cinder-scheduler" containerID="cri-o://ae05a73127e8a5569d83520289b9d950307213560a9017a627b1bb097b267616" gracePeriod=30 Feb 26 16:04:01 crc kubenswrapper[5033]: I0226 16:04:01.657587 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535364-6vq2m" event={"ID":"46f5bf1f-dc74-40b4-beb7-4fbe927804c0","Type":"ContainerStarted","Data":"74e79a31eebbd1cad5101d45799d1fa4c3ee3b500274264c3f39a40b3b42882f"} Feb 26 16:04:01 crc kubenswrapper[5033]: I0226 16:04:01.658749 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 26 16:04:01 crc kubenswrapper[5033]: I0226 16:04:01.659150 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" containerName="probe" containerID="cri-o://0928a95859008e017abdb4f3911ce683acef72bfa6b55cb0c3fb61218d18e121" gracePeriod=30 Feb 26 16:04:01 crc kubenswrapper[5033]: I0226 16:04:01.692462 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.692435299 podStartE2EDuration="4.692435299s" podCreationTimestamp="2026-02-26 16:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-26 16:04:01.681885905 +0000 UTC m=+1368.894484555" watchObservedRunningTime="2026-02-26 16:04:01.692435299 +0000 UTC m=+1368.905033949" Feb 26 16:04:02 crc kubenswrapper[5033]: I0226 16:04:02.491641 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:02 crc kubenswrapper[5033]: I0226 16:04:02.492281 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="19e1a391-4e54-4cee-9c88-34c9e99d71fa" containerName="glance-log" containerID="cri-o://5b44a06a2b3944d4f7000b47c862b2ab7a0f04f864ceac7276e4a15e1ad9a689" gracePeriod=30 Feb 26 16:04:02 crc kubenswrapper[5033]: I0226 16:04:02.492811 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="19e1a391-4e54-4cee-9c88-34c9e99d71fa" containerName="glance-httpd" containerID="cri-o://751099a6fd3491807a70d3af4718e1c4d1a7f27761350ddb2b8727da704f7e06" gracePeriod=30 Feb 26 16:04:02 crc kubenswrapper[5033]: I0226 16:04:02.625656 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:04:02 crc kubenswrapper[5033]: I0226 16:04:02.632976 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:04:02 crc kubenswrapper[5033]: I0226 16:04:02.689728 5033 generic.go:334] "Generic (PLEG): container finished" podID="19e1a391-4e54-4cee-9c88-34c9e99d71fa" containerID="5b44a06a2b3944d4f7000b47c862b2ab7a0f04f864ceac7276e4a15e1ad9a689" exitCode=143 Feb 26 16:04:02 crc kubenswrapper[5033]: I0226 16:04:02.689818 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"19e1a391-4e54-4cee-9c88-34c9e99d71fa","Type":"ContainerDied","Data":"5b44a06a2b3944d4f7000b47c862b2ab7a0f04f864ceac7276e4a15e1ad9a689"} Feb 26 16:04:02 
crc kubenswrapper[5033]: I0226 16:04:02.694343 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535364-6vq2m" event={"ID":"46f5bf1f-dc74-40b4-beb7-4fbe927804c0","Type":"ContainerStarted","Data":"4afe6aff4b824c652ac3295bec5f930e6e6d034934db5c1edaa5c7cba85591a9"}
Feb 26 16:04:02 crc kubenswrapper[5033]: I0226 16:04:02.712996 5033 generic.go:334] "Generic (PLEG): container finished" podID="3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" containerID="0928a95859008e017abdb4f3911ce683acef72bfa6b55cb0c3fb61218d18e121" exitCode=0
Feb 26 16:04:02 crc kubenswrapper[5033]: I0226 16:04:02.713875 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0","Type":"ContainerDied","Data":"0928a95859008e017abdb4f3911ce683acef72bfa6b55cb0c3fb61218d18e121"}
Feb 26 16:04:02 crc kubenswrapper[5033]: I0226 16:04:02.721002 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535364-6vq2m" podStartSLOduration=1.473926366 podStartE2EDuration="2.720980734s" podCreationTimestamp="2026-02-26 16:04:00 +0000 UTC" firstStartedPulling="2026-02-26 16:04:01.010034456 +0000 UTC m=+1368.222633106" lastFinishedPulling="2026-02-26 16:04:02.257088824 +0000 UTC m=+1369.469687474" observedRunningTime="2026-02-26 16:04:02.705298991 +0000 UTC m=+1369.917897651" watchObservedRunningTime="2026-02-26 16:04:02.720980734 +0000 UTC m=+1369.933579384"
Feb 26 16:04:03 crc kubenswrapper[5033]: I0226 16:04:03.667755 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 16:04:03 crc kubenswrapper[5033]: I0226 16:04:03.668686 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" containerName="glance-log" containerID="cri-o://22ddbe7584d58799a625aebf692f002a308be7926f76b849641146aacf642ea8" gracePeriod=30
Feb 26 16:04:03 crc kubenswrapper[5033]: I0226 16:04:03.668871 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" containerName="glance-httpd" containerID="cri-o://1e75b7b42268d9506f3f067be14ed00342281227703735eed095f02cda5dca8e" gracePeriod=30
Feb 26 16:04:03 crc kubenswrapper[5033]: I0226 16:04:03.731241 5033 generic.go:334] "Generic (PLEG): container finished" podID="46f5bf1f-dc74-40b4-beb7-4fbe927804c0" containerID="4afe6aff4b824c652ac3295bec5f930e6e6d034934db5c1edaa5c7cba85591a9" exitCode=0
Feb 26 16:04:03 crc kubenswrapper[5033]: I0226 16:04:03.731333 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535364-6vq2m" event={"ID":"46f5bf1f-dc74-40b4-beb7-4fbe927804c0","Type":"ContainerDied","Data":"4afe6aff4b824c652ac3295bec5f930e6e6d034934db5c1edaa5c7cba85591a9"}
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.251229 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-c4c558dd7-x5jl5"]
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.253793 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.256324 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.256600 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.267575 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.272584 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c4c558dd7-x5jl5"]
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.284218 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-log-httpd\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.284326 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-internal-tls-certs\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.284390 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-etc-swift\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.284483 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-config-data\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.284569 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-combined-ca-bundle\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.284598 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsml4\" (UniqueName: \"kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-kube-api-access-lsml4\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.284641 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-run-httpd\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.284725 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-public-tls-certs\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.350575 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.350873 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="ceilometer-central-agent" containerID="cri-o://18b936b2b45c65229294f50f96bb7bfc8e3d7858b5abc87cf9855a5c7bb44e8a" gracePeriod=30
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.351028 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="proxy-httpd" containerID="cri-o://1b43d07fa03fbf179ff0065c49a4d5153ac07e4f5e097e96635a86e920a046a4" gracePeriod=30
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.351092 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="sg-core" containerID="cri-o://d785b0a3a4d8ca4d33004dd1cb2d5aa0b2cdde86c1d586f47f28f2eefc320789" gracePeriod=30
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.351142 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="ceilometer-notification-agent" containerID="cri-o://65615cebbcd9363429c94ffd2cd583a0fc62f0931d4c41d25b34b33293acb9e1" gracePeriod=30
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.386380 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-log-httpd\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.386485 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-internal-tls-certs\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.386521 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-etc-swift\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.386594 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-config-data\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.386678 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-combined-ca-bundle\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.386732 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsml4\" (UniqueName: \"kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-kube-api-access-lsml4\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.386789 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-run-httpd\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.386862 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-log-httpd\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.386883 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-public-tls-certs\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.388604 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-run-httpd\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.395586 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-public-tls-certs\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.397560 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-etc-swift\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.398392 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-combined-ca-bundle\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.399222 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-config-data\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.406980 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-internal-tls-certs\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.408020 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsml4\" (UniqueName: \"kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-kube-api-access-lsml4\") pod \"swift-proxy-c4c558dd7-x5jl5\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") " pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.452664 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.157:3000/\": read tcp 10.217.0.2:47238->10.217.0.157:3000: read: connection reset by peer"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.453058 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.157:3000/\": dial tcp 10.217.0.157:3000: connect: connection refused"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.575764 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.742757 5033 generic.go:334] "Generic (PLEG): container finished" podID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerID="1b43d07fa03fbf179ff0065c49a4d5153ac07e4f5e097e96635a86e920a046a4" exitCode=0
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.743055 5033 generic.go:334] "Generic (PLEG): container finished" podID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerID="d785b0a3a4d8ca4d33004dd1cb2d5aa0b2cdde86c1d586f47f28f2eefc320789" exitCode=2
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.742922 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d36cde35-f717-4cea-a07c-fbec49bcef15","Type":"ContainerDied","Data":"1b43d07fa03fbf179ff0065c49a4d5153ac07e4f5e097e96635a86e920a046a4"}
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.743121 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d36cde35-f717-4cea-a07c-fbec49bcef15","Type":"ContainerDied","Data":"d785b0a3a4d8ca4d33004dd1cb2d5aa0b2cdde86c1d586f47f28f2eefc320789"}
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.745782 5033 generic.go:334] "Generic (PLEG): container finished" podID="bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" containerID="22ddbe7584d58799a625aebf692f002a308be7926f76b849641146aacf642ea8" exitCode=143
Feb 26 16:04:04 crc kubenswrapper[5033]: I0226 16:04:04.745897 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1","Type":"ContainerDied","Data":"22ddbe7584d58799a625aebf692f002a308be7926f76b849641146aacf642ea8"}
Feb 26 16:04:05 crc kubenswrapper[5033]: I0226 16:04:05.757585 5033 generic.go:334] "Generic (PLEG): container finished" podID="19e1a391-4e54-4cee-9c88-34c9e99d71fa" containerID="751099a6fd3491807a70d3af4718e1c4d1a7f27761350ddb2b8727da704f7e06" exitCode=0
Feb 26 16:04:05 crc kubenswrapper[5033]: I0226 16:04:05.757675 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"19e1a391-4e54-4cee-9c88-34c9e99d71fa","Type":"ContainerDied","Data":"751099a6fd3491807a70d3af4718e1c4d1a7f27761350ddb2b8727da704f7e06"}
Feb 26 16:04:05 crc kubenswrapper[5033]: I0226 16:04:05.760651 5033 generic.go:334] "Generic (PLEG): container finished" podID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerID="65615cebbcd9363429c94ffd2cd583a0fc62f0931d4c41d25b34b33293acb9e1" exitCode=0
Feb 26 16:04:05 crc kubenswrapper[5033]: I0226 16:04:05.760684 5033 generic.go:334] "Generic (PLEG): container finished" podID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerID="18b936b2b45c65229294f50f96bb7bfc8e3d7858b5abc87cf9855a5c7bb44e8a" exitCode=0
Feb 26 16:04:05 crc kubenswrapper[5033]: I0226 16:04:05.760708 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d36cde35-f717-4cea-a07c-fbec49bcef15","Type":"ContainerDied","Data":"65615cebbcd9363429c94ffd2cd583a0fc62f0931d4c41d25b34b33293acb9e1"}
Feb 26 16:04:05 crc kubenswrapper[5033]: I0226 16:04:05.760736 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d36cde35-f717-4cea-a07c-fbec49bcef15","Type":"ContainerDied","Data":"18b936b2b45c65229294f50f96bb7bfc8e3d7858b5abc87cf9855a5c7bb44e8a"}
Feb 26 16:04:05 crc kubenswrapper[5033]: I0226 16:04:05.976950 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c"
Feb 26 16:04:06 crc kubenswrapper[5033]: I0226 16:04:06.047269 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-p5bpd"]
Feb 26 16:04:06 crc kubenswrapper[5033]: I0226 16:04:06.047534 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" podUID="b7246c5a-8508-4f21-ac7a-76d218044d4c" containerName="dnsmasq-dns" containerID="cri-o://e1e88b8703395f870140ff3011ae32e83bb5f9661f7c45eb76a45cdcdd83c106" gracePeriod=10
Feb 26 16:04:06 crc kubenswrapper[5033]: I0226 16:04:06.774489 5033 generic.go:334] "Generic (PLEG): container finished" podID="b7246c5a-8508-4f21-ac7a-76d218044d4c" containerID="e1e88b8703395f870140ff3011ae32e83bb5f9661f7c45eb76a45cdcdd83c106" exitCode=0
Feb 26 16:04:06 crc kubenswrapper[5033]: I0226 16:04:06.774543 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" event={"ID":"b7246c5a-8508-4f21-ac7a-76d218044d4c","Type":"ContainerDied","Data":"e1e88b8703395f870140ff3011ae32e83bb5f9661f7c45eb76a45cdcdd83c106"}
Feb 26 16:04:06 crc kubenswrapper[5033]: I0226 16:04:06.778089 5033 generic.go:334] "Generic (PLEG): container finished" podID="3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" containerID="ae05a73127e8a5569d83520289b9d950307213560a9017a627b1bb097b267616" exitCode=0
Feb 26 16:04:06 crc kubenswrapper[5033]: I0226 16:04:06.778142 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0","Type":"ContainerDied","Data":"ae05a73127e8a5569d83520289b9d950307213560a9017a627b1bb097b267616"}
Feb 26 16:04:07 crc kubenswrapper[5033]: I0226 16:04:07.790127 5033 generic.go:334] "Generic (PLEG): container finished" podID="bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" containerID="1e75b7b42268d9506f3f067be14ed00342281227703735eed095f02cda5dca8e" exitCode=0
Feb 26 16:04:07 crc kubenswrapper[5033]: I0226 16:04:07.790204 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1","Type":"ContainerDied","Data":"1e75b7b42268d9506f3f067be14ed00342281227703735eed095f02cda5dca8e"}
Feb 26 16:04:08 crc kubenswrapper[5033]: I0226 16:04:08.713674 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" podUID="b7246c5a-8508-4f21-ac7a-76d218044d4c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.160:5353: connect: connection refused"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.063345 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.662374 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zt7xr"]
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.663882 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zt7xr"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.678005 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zt7xr"]
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.771279 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-k9qcs"]
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.776460 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k9qcs"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.786685 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-43b7-account-create-update-gbdhs"]
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.787854 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-43b7-account-create-update-gbdhs"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.804606 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.811684 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k9qcs"]
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.824148 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn8mr\" (UniqueName: \"kubernetes.io/projected/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-kube-api-access-bn8mr\") pod \"nova-api-db-create-zt7xr\" (UID: \"73d77fb6-14aa-4c7d-9ad1-093e95473c0f\") " pod="openstack/nova-api-db-create-zt7xr"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.824478 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-operator-scripts\") pod \"nova-api-db-create-zt7xr\" (UID: \"73d77fb6-14aa-4c7d-9ad1-093e95473c0f\") " pod="openstack/nova-api-db-create-zt7xr"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.869507 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-43b7-account-create-update-gbdhs"]
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.902470 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-6sfh6"]
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.903921 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6sfh6"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.910958 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6sfh6"]
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.926789 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvthx\" (UniqueName: \"kubernetes.io/projected/e6bddd46-8347-483a-b49d-470d41ee528c-kube-api-access-wvthx\") pod \"nova-cell0-db-create-k9qcs\" (UID: \"e6bddd46-8347-483a-b49d-470d41ee528c\") " pod="openstack/nova-cell0-db-create-k9qcs"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.926861 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4v5z\" (UniqueName: \"kubernetes.io/projected/0dac4b71-5c96-482d-b6d2-656990c1d873-kube-api-access-j4v5z\") pod \"nova-api-43b7-account-create-update-gbdhs\" (UID: \"0dac4b71-5c96-482d-b6d2-656990c1d873\") " pod="openstack/nova-api-43b7-account-create-update-gbdhs"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.926920 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-operator-scripts\") pod \"nova-api-db-create-zt7xr\" (UID: \"73d77fb6-14aa-4c7d-9ad1-093e95473c0f\") " pod="openstack/nova-api-db-create-zt7xr"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.927010 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dac4b71-5c96-482d-b6d2-656990c1d873-operator-scripts\") pod \"nova-api-43b7-account-create-update-gbdhs\" (UID: \"0dac4b71-5c96-482d-b6d2-656990c1d873\") " pod="openstack/nova-api-43b7-account-create-update-gbdhs"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.927045 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn8mr\" (UniqueName: \"kubernetes.io/projected/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-kube-api-access-bn8mr\") pod \"nova-api-db-create-zt7xr\" (UID: \"73d77fb6-14aa-4c7d-9ad1-093e95473c0f\") " pod="openstack/nova-api-db-create-zt7xr"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.927082 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6bddd46-8347-483a-b49d-470d41ee528c-operator-scripts\") pod \"nova-cell0-db-create-k9qcs\" (UID: \"e6bddd46-8347-483a-b49d-470d41ee528c\") " pod="openstack/nova-cell0-db-create-k9qcs"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.927538 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-operator-scripts\") pod \"nova-api-db-create-zt7xr\" (UID: \"73d77fb6-14aa-4c7d-9ad1-093e95473c0f\") " pod="openstack/nova-api-db-create-zt7xr"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.954585 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn8mr\" (UniqueName: \"kubernetes.io/projected/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-kube-api-access-bn8mr\") pod \"nova-api-db-create-zt7xr\" (UID: \"73d77fb6-14aa-4c7d-9ad1-093e95473c0f\") " pod="openstack/nova-api-db-create-zt7xr"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.989322 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-56c7-account-create-update-dqbl5"]
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.990566 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-56c7-account-create-update-dqbl5"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.994643 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Feb 26 16:04:10 crc kubenswrapper[5033]: I0226 16:04:10.995113 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-56c7-account-create-update-dqbl5"]
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.002737 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zt7xr"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.028290 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvthx\" (UniqueName: \"kubernetes.io/projected/e6bddd46-8347-483a-b49d-470d41ee528c-kube-api-access-wvthx\") pod \"nova-cell0-db-create-k9qcs\" (UID: \"e6bddd46-8347-483a-b49d-470d41ee528c\") " pod="openstack/nova-cell0-db-create-k9qcs"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.028343 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsrq9\" (UniqueName: \"kubernetes.io/projected/9c4673c4-7bff-4708-8ace-4931c3091789-kube-api-access-tsrq9\") pod \"nova-cell1-db-create-6sfh6\" (UID: \"9c4673c4-7bff-4708-8ace-4931c3091789\") " pod="openstack/nova-cell1-db-create-6sfh6"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.028368 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4v5z\" (UniqueName: \"kubernetes.io/projected/0dac4b71-5c96-482d-b6d2-656990c1d873-kube-api-access-j4v5z\") pod \"nova-api-43b7-account-create-update-gbdhs\" (UID: \"0dac4b71-5c96-482d-b6d2-656990c1d873\") " pod="openstack/nova-api-43b7-account-create-update-gbdhs"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.028386 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c4673c4-7bff-4708-8ace-4931c3091789-operator-scripts\") pod \"nova-cell1-db-create-6sfh6\" (UID: \"9c4673c4-7bff-4708-8ace-4931c3091789\") " pod="openstack/nova-cell1-db-create-6sfh6"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.028449 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dac4b71-5c96-482d-b6d2-656990c1d873-operator-scripts\") pod \"nova-api-43b7-account-create-update-gbdhs\" (UID: \"0dac4b71-5c96-482d-b6d2-656990c1d873\") " pod="openstack/nova-api-43b7-account-create-update-gbdhs"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.028480 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6bddd46-8347-483a-b49d-470d41ee528c-operator-scripts\") pod \"nova-cell0-db-create-k9qcs\" (UID: \"e6bddd46-8347-483a-b49d-470d41ee528c\") " pod="openstack/nova-cell0-db-create-k9qcs"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.029204 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6bddd46-8347-483a-b49d-470d41ee528c-operator-scripts\") pod \"nova-cell0-db-create-k9qcs\" (UID: \"e6bddd46-8347-483a-b49d-470d41ee528c\") " pod="openstack/nova-cell0-db-create-k9qcs"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.031622 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dac4b71-5c96-482d-b6d2-656990c1d873-operator-scripts\") pod \"nova-api-43b7-account-create-update-gbdhs\" (UID: \"0dac4b71-5c96-482d-b6d2-656990c1d873\") " pod="openstack/nova-api-43b7-account-create-update-gbdhs"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.049558 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvthx\" (UniqueName: \"kubernetes.io/projected/e6bddd46-8347-483a-b49d-470d41ee528c-kube-api-access-wvthx\") pod \"nova-cell0-db-create-k9qcs\" (UID: \"e6bddd46-8347-483a-b49d-470d41ee528c\") " pod="openstack/nova-cell0-db-create-k9qcs"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.052798 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4v5z\" (UniqueName: \"kubernetes.io/projected/0dac4b71-5c96-482d-b6d2-656990c1d873-kube-api-access-j4v5z\") pod \"nova-api-43b7-account-create-update-gbdhs\" (UID: \"0dac4b71-5c96-482d-b6d2-656990c1d873\") " pod="openstack/nova-api-43b7-account-create-update-gbdhs"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.113304 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k9qcs"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.128780 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-43b7-account-create-update-gbdhs"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.130626 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hr6b\" (UniqueName: \"kubernetes.io/projected/b1a1bbea-b471-4f30-abe1-de75f9641b24-kube-api-access-5hr6b\") pod \"nova-cell0-56c7-account-create-update-dqbl5\" (UID: \"b1a1bbea-b471-4f30-abe1-de75f9641b24\") " pod="openstack/nova-cell0-56c7-account-create-update-dqbl5"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.130934 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsrq9\" (UniqueName: \"kubernetes.io/projected/9c4673c4-7bff-4708-8ace-4931c3091789-kube-api-access-tsrq9\") pod \"nova-cell1-db-create-6sfh6\" (UID: \"9c4673c4-7bff-4708-8ace-4931c3091789\") " pod="openstack/nova-cell1-db-create-6sfh6"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.130997 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a1bbea-b471-4f30-abe1-de75f9641b24-operator-scripts\") pod \"nova-cell0-56c7-account-create-update-dqbl5\" (UID: \"b1a1bbea-b471-4f30-abe1-de75f9641b24\") " pod="openstack/nova-cell0-56c7-account-create-update-dqbl5"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.131066 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c4673c4-7bff-4708-8ace-4931c3091789-operator-scripts\") pod \"nova-cell1-db-create-6sfh6\" (UID: \"9c4673c4-7bff-4708-8ace-4931c3091789\") " pod="openstack/nova-cell1-db-create-6sfh6"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.132309 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c4673c4-7bff-4708-8ace-4931c3091789-operator-scripts\") pod \"nova-cell1-db-create-6sfh6\" (UID: \"9c4673c4-7bff-4708-8ace-4931c3091789\") " pod="openstack/nova-cell1-db-create-6sfh6"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.157632 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsrq9\" (UniqueName: \"kubernetes.io/projected/9c4673c4-7bff-4708-8ace-4931c3091789-kube-api-access-tsrq9\") pod \"nova-cell1-db-create-6sfh6\" (UID: \"9c4673c4-7bff-4708-8ace-4931c3091789\") " pod="openstack/nova-cell1-db-create-6sfh6"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.177869 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-041e-account-create-update-6zjnq"]
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.179462 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-041e-account-create-update-6zjnq"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.182629 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.190637 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-041e-account-create-update-6zjnq"]
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.224954 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6sfh6"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.233627 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hr6b\" (UniqueName: \"kubernetes.io/projected/b1a1bbea-b471-4f30-abe1-de75f9641b24-kube-api-access-5hr6b\") pod \"nova-cell0-56c7-account-create-update-dqbl5\" (UID: \"b1a1bbea-b471-4f30-abe1-de75f9641b24\") " pod="openstack/nova-cell0-56c7-account-create-update-dqbl5"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.233741 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a1bbea-b471-4f30-abe1-de75f9641b24-operator-scripts\") pod \"nova-cell0-56c7-account-create-update-dqbl5\" (UID: \"b1a1bbea-b471-4f30-abe1-de75f9641b24\") " pod="openstack/nova-cell0-56c7-account-create-update-dqbl5"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.234680 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a1bbea-b471-4f30-abe1-de75f9641b24-operator-scripts\") pod \"nova-cell0-56c7-account-create-update-dqbl5\" (UID: \"b1a1bbea-b471-4f30-abe1-de75f9641b24\") " pod="openstack/nova-cell0-56c7-account-create-update-dqbl5"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.260627 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hr6b\" (UniqueName: \"kubernetes.io/projected/b1a1bbea-b471-4f30-abe1-de75f9641b24-kube-api-access-5hr6b\") pod \"nova-cell0-56c7-account-create-update-dqbl5\" (UID: \"b1a1bbea-b471-4f30-abe1-de75f9641b24\") " pod="openstack/nova-cell0-56c7-account-create-update-dqbl5"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.283852 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535364-6vq2m"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.335428 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbdvv\" (UniqueName: \"kubernetes.io/projected/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-kube-api-access-xbdvv\") pod \"nova-cell1-041e-account-create-update-6zjnq\" (UID: \"33d9c0d5-6eff-4216-a63e-cdc21eaacf42\") " pod="openstack/nova-cell1-041e-account-create-update-6zjnq"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.335753 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-operator-scripts\") pod \"nova-cell1-041e-account-create-update-6zjnq\" (UID: \"33d9c0d5-6eff-4216-a63e-cdc21eaacf42\") " pod="openstack/nova-cell1-041e-account-create-update-6zjnq"
Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.342663 5033 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-56c7-account-create-update-dqbl5" Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.437005 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnb57\" (UniqueName: \"kubernetes.io/projected/46f5bf1f-dc74-40b4-beb7-4fbe927804c0-kube-api-access-qnb57\") pod \"46f5bf1f-dc74-40b4-beb7-4fbe927804c0\" (UID: \"46f5bf1f-dc74-40b4-beb7-4fbe927804c0\") " Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.437544 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-operator-scripts\") pod \"nova-cell1-041e-account-create-update-6zjnq\" (UID: \"33d9c0d5-6eff-4216-a63e-cdc21eaacf42\") " pod="openstack/nova-cell1-041e-account-create-update-6zjnq" Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.437753 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbdvv\" (UniqueName: \"kubernetes.io/projected/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-kube-api-access-xbdvv\") pod \"nova-cell1-041e-account-create-update-6zjnq\" (UID: \"33d9c0d5-6eff-4216-a63e-cdc21eaacf42\") " pod="openstack/nova-cell1-041e-account-create-update-6zjnq" Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.438347 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-operator-scripts\") pod \"nova-cell1-041e-account-create-update-6zjnq\" (UID: \"33d9c0d5-6eff-4216-a63e-cdc21eaacf42\") " pod="openstack/nova-cell1-041e-account-create-update-6zjnq" Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.441015 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46f5bf1f-dc74-40b4-beb7-4fbe927804c0-kube-api-access-qnb57" (OuterVolumeSpecName: "kube-api-access-qnb57") pod 
"46f5bf1f-dc74-40b4-beb7-4fbe927804c0" (UID: "46f5bf1f-dc74-40b4-beb7-4fbe927804c0"). InnerVolumeSpecName "kube-api-access-qnb57". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.455831 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbdvv\" (UniqueName: \"kubernetes.io/projected/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-kube-api-access-xbdvv\") pod \"nova-cell1-041e-account-create-update-6zjnq\" (UID: \"33d9c0d5-6eff-4216-a63e-cdc21eaacf42\") " pod="openstack/nova-cell1-041e-account-create-update-6zjnq" Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.539284 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnb57\" (UniqueName: \"kubernetes.io/projected/46f5bf1f-dc74-40b4-beb7-4fbe927804c0-kube-api-access-qnb57\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.586631 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-041e-account-create-update-6zjnq" Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.839598 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535364-6vq2m" event={"ID":"46f5bf1f-dc74-40b4-beb7-4fbe927804c0","Type":"ContainerDied","Data":"74e79a31eebbd1cad5101d45799d1fa4c3ee3b500274264c3f39a40b3b42882f"} Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.839889 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74e79a31eebbd1cad5101d45799d1fa4c3ee3b500274264c3f39a40b3b42882f" Feb 26 16:04:11 crc kubenswrapper[5033]: I0226 16:04:11.839663 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535364-6vq2m" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.394742 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535358-zt5pt"] Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.402012 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535358-zt5pt"] Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.528081 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.581985 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.582073 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-combined-ca-bundle\") pod \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.582107 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-httpd-run\") pod \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.582136 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2ph8\" (UniqueName: \"kubernetes.io/projected/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-kube-api-access-r2ph8\") pod \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\" (UID: 
\"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.582174 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-config-data\") pod \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.582209 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-internal-tls-certs\") pod \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.582282 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-logs\") pod \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.582321 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-scripts\") pod \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\" (UID: \"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.585727 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-logs" (OuterVolumeSpecName: "logs") pod "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" (UID: "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.585953 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" (UID: "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.602021 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-scripts" (OuterVolumeSpecName: "scripts") pod "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" (UID: "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.602107 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" (UID: "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.602191 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-kube-api-access-r2ph8" (OuterVolumeSpecName: "kube-api-access-r2ph8") pod "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" (UID: "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1"). InnerVolumeSpecName "kube-api-access-r2ph8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.640535 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" (UID: "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.685597 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.685631 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.685642 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.685672 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2ph8\" (UniqueName: \"kubernetes.io/projected/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-kube-api-access-r2ph8\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.685682 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.685689 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.693797 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" (UID: "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.719017 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.738976 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-config-data" (OuterVolumeSpecName: "config-data") pod "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" (UID: "bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.745249 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.800843 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-run-httpd\") pod \"d36cde35-f717-4cea-a07c-fbec49bcef15\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.800894 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2jff\" (UniqueName: \"kubernetes.io/projected/d36cde35-f717-4cea-a07c-fbec49bcef15-kube-api-access-c2jff\") pod \"d36cde35-f717-4cea-a07c-fbec49bcef15\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.801018 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-scripts\") pod \"d36cde35-f717-4cea-a07c-fbec49bcef15\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.801546 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d36cde35-f717-4cea-a07c-fbec49bcef15" (UID: "d36cde35-f717-4cea-a07c-fbec49bcef15"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.802791 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-config-data\") pod \"d36cde35-f717-4cea-a07c-fbec49bcef15\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.802858 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-sg-core-conf-yaml\") pod \"d36cde35-f717-4cea-a07c-fbec49bcef15\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.802898 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-combined-ca-bundle\") pod \"d36cde35-f717-4cea-a07c-fbec49bcef15\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.802933 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-log-httpd\") pod \"d36cde35-f717-4cea-a07c-fbec49bcef15\" (UID: \"d36cde35-f717-4cea-a07c-fbec49bcef15\") " Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.804197 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.804221 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc 
kubenswrapper[5033]: I0226 16:04:12.804236 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.804248 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.804942 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-scripts" (OuterVolumeSpecName: "scripts") pod "d36cde35-f717-4cea-a07c-fbec49bcef15" (UID: "d36cde35-f717-4cea-a07c-fbec49bcef15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.805202 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d36cde35-f717-4cea-a07c-fbec49bcef15" (UID: "d36cde35-f717-4cea-a07c-fbec49bcef15"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.806706 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36cde35-f717-4cea-a07c-fbec49bcef15-kube-api-access-c2jff" (OuterVolumeSpecName: "kube-api-access-c2jff") pod "d36cde35-f717-4cea-a07c-fbec49bcef15" (UID: "d36cde35-f717-4cea-a07c-fbec49bcef15"). InnerVolumeSpecName "kube-api-access-c2jff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.841583 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d36cde35-f717-4cea-a07c-fbec49bcef15" (UID: "d36cde35-f717-4cea-a07c-fbec49bcef15"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.858951 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c0f0a198-dc99-4384-8ab0-b978bf9275b5","Type":"ContainerStarted","Data":"f4a8a899d4825519a257a55cdd9c58e86c24aafa460378cd50a7461c75a3be78"} Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.871296 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d36cde35-f717-4cea-a07c-fbec49bcef15","Type":"ContainerDied","Data":"02f32c7357b261818718ce799ef777668e2443c47c12341f93a4ba024d83241a"} Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.871362 5033 scope.go:117] "RemoveContainer" containerID="1b43d07fa03fbf179ff0065c49a4d5153ac07e4f5e097e96635a86e920a046a4" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.871550 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.908946 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1","Type":"ContainerDied","Data":"07f3716167aecd2312843e52a1f3985dfc3430b96be6c6e5df230daf57d12b46"} Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.909100 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.909459 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.909495 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d36cde35-f717-4cea-a07c-fbec49bcef15-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.909510 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2jff\" (UniqueName: \"kubernetes.io/projected/d36cde35-f717-4cea-a07c-fbec49bcef15-kube-api-access-c2jff\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.910108 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.913879 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.120453056 podStartE2EDuration="17.91363003s" podCreationTimestamp="2026-02-26 16:03:55 +0000 UTC" firstStartedPulling="2026-02-26 16:03:56.37429346 +0000 UTC m=+1363.586892110" lastFinishedPulling="2026-02-26 16:04:12.167470434 +0000 UTC m=+1379.380069084" observedRunningTime="2026-02-26 16:04:12.875321954 +0000 UTC m=+1380.087920624" watchObservedRunningTime="2026-02-26 16:04:12.91363003 +0000 UTC m=+1380.126228690" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.943648 5033 scope.go:117] "RemoveContainer" containerID="d785b0a3a4d8ca4d33004dd1cb2d5aa0b2cdde86c1d586f47f28f2eefc320789" Feb 26 16:04:12 crc kubenswrapper[5033]: I0226 16:04:12.958763 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-config-data" (OuterVolumeSpecName: "config-data") pod "d36cde35-f717-4cea-a07c-fbec49bcef15" (UID: "d36cde35-f717-4cea-a07c-fbec49bcef15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:12.997885 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.012854 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.016757 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.027370 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d36cde35-f717-4cea-a07c-fbec49bcef15" (UID: "d36cde35-f717-4cea-a07c-fbec49bcef15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.038535 5033 scope.go:117] "RemoveContainer" containerID="65615cebbcd9363429c94ffd2cd583a0fc62f0931d4c41d25b34b33293acb9e1" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.049105 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.049941 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="ceilometer-notification-agent" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.049986 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="ceilometer-notification-agent" Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.050000 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" containerName="glance-log" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050006 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" containerName="glance-log" Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.050017 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f5bf1f-dc74-40b4-beb7-4fbe927804c0" containerName="oc" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050023 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f5bf1f-dc74-40b4-beb7-4fbe927804c0" containerName="oc" Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.050034 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="sg-core" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050040 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="sg-core" Feb 26 16:04:13 crc kubenswrapper[5033]: 
E0226 16:04:13.050052 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="proxy-httpd" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050057 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="proxy-httpd" Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.050070 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" containerName="glance-httpd" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050076 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" containerName="glance-httpd" Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.050086 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="ceilometer-central-agent" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050091 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="ceilometer-central-agent" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050247 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="proxy-httpd" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050273 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" containerName="glance-log" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050289 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="ceilometer-central-agent" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050301 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="sg-core" Feb 26 16:04:13 crc kubenswrapper[5033]: 
I0226 16:04:13.050313 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f5bf1f-dc74-40b4-beb7-4fbe927804c0" containerName="oc" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050327 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" containerName="ceilometer-notification-agent" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.050337 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" containerName="glance-httpd" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.051250 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.053540 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.053584 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.078690 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.122404 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36cde35-f717-4cea-a07c-fbec49bcef15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.193158 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.199587 5033 scope.go:117] "RemoveContainer" containerID="18b936b2b45c65229294f50f96bb7bfc8e3d7858b5abc87cf9855a5c7bb44e8a" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.228974 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-swift-storage-0\") pod \"b7246c5a-8508-4f21-ac7a-76d218044d4c\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.229064 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-svc\") pod \"b7246c5a-8508-4f21-ac7a-76d218044d4c\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.229903 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-config\") pod \"b7246c5a-8508-4f21-ac7a-76d218044d4c\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.230066 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgzpd\" (UniqueName: \"kubernetes.io/projected/b7246c5a-8508-4f21-ac7a-76d218044d4c-kube-api-access-kgzpd\") pod \"b7246c5a-8508-4f21-ac7a-76d218044d4c\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.230100 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-sb\") pod \"b7246c5a-8508-4f21-ac7a-76d218044d4c\" (UID: 
\"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.230632 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-nb\") pod \"b7246c5a-8508-4f21-ac7a-76d218044d4c\" (UID: \"b7246c5a-8508-4f21-ac7a-76d218044d4c\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.231217 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.233313 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-logs\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.233365 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.233428 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.233459 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.233487 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.233502 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgfhn\" (UniqueName: \"kubernetes.io/projected/bf0b01b9-0085-4552-8a7d-1e7359dad79b-kube-api-access-vgfhn\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.233571 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.233618 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.249685 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b7246c5a-8508-4f21-ac7a-76d218044d4c-kube-api-access-kgzpd" (OuterVolumeSpecName: "kube-api-access-kgzpd") pod "b7246c5a-8508-4f21-ac7a-76d218044d4c" (UID: "b7246c5a-8508-4f21-ac7a-76d218044d4c"). InnerVolumeSpecName "kube-api-access-kgzpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.266866 5033 scope.go:117] "RemoveContainer" containerID="1e75b7b42268d9506f3f067be14ed00342281227703735eed095f02cda5dca8e" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.299108 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.309797 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336060 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-logs\") pod \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336210 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-scripts\") pod \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336294 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336329 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-combined-ca-bundle\") pod \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336369 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-httpd-run\") pod \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336438 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsjl9\" (UniqueName: \"kubernetes.io/projected/19e1a391-4e54-4cee-9c88-34c9e99d71fa-kube-api-access-bsjl9\") pod \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336476 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-etc-machine-id\") pod \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336507 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-public-tls-certs\") pod \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336535 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-combined-ca-bundle\") pod \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " Feb 26 16:04:13 crc kubenswrapper[5033]: 
I0226 16:04:13.336557 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-scripts\") pod \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336605 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data-custom\") pod \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336636 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-config-data\") pod \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\" (UID: \"19e1a391-4e54-4cee-9c88-34c9e99d71fa\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336708 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmvwg\" (UniqueName: \"kubernetes.io/projected/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-kube-api-access-hmvwg\") pod \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.336750 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data\") pod \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\" (UID: \"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0\") " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.337002 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.337045 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.337079 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.337102 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgfhn\" (UniqueName: \"kubernetes.io/projected/bf0b01b9-0085-4552-8a7d-1e7359dad79b-kube-api-access-vgfhn\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.337169 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.337216 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " 
pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.337324 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-logs\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.337361 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.337440 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgzpd\" (UniqueName: \"kubernetes.io/projected/b7246c5a-8508-4f21-ac7a-76d218044d4c-kube-api-access-kgzpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.339580 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "19e1a391-4e54-4cee-9c88-34c9e99d71fa" (UID: "19e1a391-4e54-4cee-9c88-34c9e99d71fa"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.340693 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-logs" (OuterVolumeSpecName: "logs") pod "19e1a391-4e54-4cee-9c88-34c9e99d71fa" (UID: "19e1a391-4e54-4cee-9c88-34c9e99d71fa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.342976 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7246c5a-8508-4f21-ac7a-76d218044d4c" (UID: "b7246c5a-8508-4f21-ac7a-76d218044d4c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.344166 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" (UID: "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.344496 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.343720 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.348169 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.351869 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-scripts" (OuterVolumeSpecName: "scripts") pod "19e1a391-4e54-4cee-9c88-34c9e99d71fa" (UID: "19e1a391-4e54-4cee-9c88-34c9e99d71fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.353594 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-logs\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.354672 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" (UID: "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.359186 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "19e1a391-4e54-4cee-9c88-34c9e99d71fa" (UID: "19e1a391-4e54-4cee-9c88-34c9e99d71fa"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.365530 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.368930 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-scripts" (OuterVolumeSpecName: "scripts") pod "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" (UID: "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.369076 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e1a391-4e54-4cee-9c88-34c9e99d71fa-kube-api-access-bsjl9" (OuterVolumeSpecName: "kube-api-access-bsjl9") pod "19e1a391-4e54-4cee-9c88-34c9e99d71fa" (UID: "19e1a391-4e54-4cee-9c88-34c9e99d71fa"). InnerVolumeSpecName "kube-api-access-bsjl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.374812 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgfhn\" (UniqueName: \"kubernetes.io/projected/bf0b01b9-0085-4552-8a7d-1e7359dad79b-kube-api-access-vgfhn\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.380204 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-kube-api-access-hmvwg" (OuterVolumeSpecName: "kube-api-access-hmvwg") pod "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" (UID: "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0"). InnerVolumeSpecName "kube-api-access-hmvwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.401817 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.403300 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.403736 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e1a391-4e54-4cee-9c88-34c9e99d71fa" containerName="glance-log" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.403759 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e1a391-4e54-4cee-9c88-34c9e99d71fa" containerName="glance-log" Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.403778 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b7246c5a-8508-4f21-ac7a-76d218044d4c" containerName="init" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.403788 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7246c5a-8508-4f21-ac7a-76d218044d4c" containerName="init" Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.403803 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" containerName="cinder-scheduler" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.403810 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" containerName="cinder-scheduler" Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.403829 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" containerName="probe" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.403837 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" containerName="probe" Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.403864 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e1a391-4e54-4cee-9c88-34c9e99d71fa" containerName="glance-httpd" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.403873 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e1a391-4e54-4cee-9c88-34c9e99d71fa" containerName="glance-httpd" Feb 26 16:04:13 crc kubenswrapper[5033]: E0226 16:04:13.403888 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7246c5a-8508-4f21-ac7a-76d218044d4c" containerName="dnsmasq-dns" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.403896 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7246c5a-8508-4f21-ac7a-76d218044d4c" containerName="dnsmasq-dns" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.404099 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" containerName="probe" 
Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.404115 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7246c5a-8508-4f21-ac7a-76d218044d4c" containerName="dnsmasq-dns" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.404129 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e1a391-4e54-4cee-9c88-34c9e99d71fa" containerName="glance-httpd" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.404140 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" containerName="cinder-scheduler" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.404154 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e1a391-4e54-4cee-9c88-34c9e99d71fa" containerName="glance-log" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.406163 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.408215 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.408731 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.419124 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.423947 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.439693 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-scripts\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440059 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-config-data\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440119 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-log-httpd\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440136 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440160 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440199 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-run-httpd\") pod \"ceilometer-0\" (UID: 
\"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440246 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448n8\" (UniqueName: \"kubernetes.io/projected/73414a21-a5e9-4ee9-a428-e302c86bf718-kube-api-access-448n8\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440456 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440480 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmvwg\" (UniqueName: \"kubernetes.io/projected/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-kube-api-access-hmvwg\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440497 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440505 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440513 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440606 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440618 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19e1a391-4e54-4cee-9c88-34c9e99d71fa-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440627 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsjl9\" (UniqueName: \"kubernetes.io/projected/19e1a391-4e54-4cee-9c88-34c9e99d71fa-kube-api-access-bsjl9\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440637 5033 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.440645 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.475241 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.477464 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.478334 5033 scope.go:117] "RemoveContainer" 
containerID="22ddbe7584d58799a625aebf692f002a308be7926f76b849641146aacf642ea8" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.480432 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.502681 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-43b7-account-create-update-gbdhs"] Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.511727 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zt7xr"] Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.527284 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-041e-account-create-update-6zjnq"] Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.531986 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b7246c5a-8508-4f21-ac7a-76d218044d4c" (UID: "b7246c5a-8508-4f21-ac7a-76d218044d4c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.532196 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-config" (OuterVolumeSpecName: "config") pod "b7246c5a-8508-4f21-ac7a-76d218044d4c" (UID: "b7246c5a-8508-4f21-ac7a-76d218044d4c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.542136 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.542944 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-run-httpd\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.542999 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-448n8\" (UniqueName: \"kubernetes.io/projected/73414a21-a5e9-4ee9-a428-e302c86bf718-kube-api-access-448n8\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.543030 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-scripts\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.543074 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-config-data\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.543119 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-log-httpd\") pod \"ceilometer-0\" (UID: 
\"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.543135 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.543157 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.543202 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.543212 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.543221 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.548050 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-run-httpd\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.552366 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-log-httpd\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.563120 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.563922 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.564300 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-448n8\" (UniqueName: \"kubernetes.io/projected/73414a21-a5e9-4ee9-a428-e302c86bf718-kube-api-access-448n8\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.564969 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-scripts\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: W0226 16:04:13.586554 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33d9c0d5_6eff_4216_a63e_cdc21eaacf42.slice/crio-29336dc6de96f21425b6bde6e2f96d3d15ec91651752f9e9124b4f96aa3b219d WatchSource:0}: Error finding container 
29336dc6de96f21425b6bde6e2f96d3d15ec91651752f9e9124b4f96aa3b219d: Status 404 returned error can't find the container with id 29336dc6de96f21425b6bde6e2f96d3d15ec91651752f9e9124b4f96aa3b219d Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.588383 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19e1a391-4e54-4cee-9c88-34c9e99d71fa" (UID: "19e1a391-4e54-4cee-9c88-34c9e99d71fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.589102 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-config-data\") pod \"ceilometer-0\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.615104 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" (UID: "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.620656 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b7246c5a-8508-4f21-ac7a-76d218044d4c" (UID: "b7246c5a-8508-4f21-ac7a-76d218044d4c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.624518 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c4c558dd7-x5jl5"] Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.650460 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.653175 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.653213 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.664988 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-config-data" (OuterVolumeSpecName: "config-data") pod "19e1a391-4e54-4cee-9c88-34c9e99d71fa" (UID: "19e1a391-4e54-4cee-9c88-34c9e99d71fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.667314 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1" path="/var/lib/kubelet/pods/bd382a65-6478-4ecd-ab76-bc1f1fcf2fc1/volumes" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.670964 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d16b9882-8189-44de-9058-d64a70399067" path="/var/lib/kubelet/pods/d16b9882-8189-44de-9058-d64a70399067/volumes" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.683699 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36cde35-f717-4cea-a07c-fbec49bcef15" path="/var/lib/kubelet/pods/d36cde35-f717-4cea-a07c-fbec49bcef15/volumes" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.702799 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b7246c5a-8508-4f21-ac7a-76d218044d4c" (UID: "b7246c5a-8508-4f21-ac7a-76d218044d4c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.703142 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "19e1a391-4e54-4cee-9c88-34c9e99d71fa" (UID: "19e1a391-4e54-4cee-9c88-34c9e99d71fa"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.721567 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.758463 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7246c5a-8508-4f21-ac7a-76d218044d4c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.762632 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.762935 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e1a391-4e54-4cee-9c88-34c9e99d71fa-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.764341 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data" (OuterVolumeSpecName: "config-data") pod "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" (UID: "3f27d3bc-7305-451b-8dd6-a0b2765cc3b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.791546 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.815806 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6sfh6"] Feb 26 16:04:13 crc kubenswrapper[5033]: W0226 16:04:13.847593 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c4673c4_7bff_4708_8ace_4931c3091789.slice/crio-6af362895f9d07633a68edb21b9bc612c00c033c342c5865c8ae05e37c6c5847 WatchSource:0}: Error finding container 6af362895f9d07633a68edb21b9bc612c00c033c342c5865c8ae05e37c6c5847: Status 404 returned error can't find the container with id 6af362895f9d07633a68edb21b9bc612c00c033c342c5865c8ae05e37c6c5847 Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.867158 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.885758 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-56c7-account-create-update-dqbl5"] Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.902995 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k9qcs"] Feb 26 16:04:13 crc kubenswrapper[5033]: W0226 16:04:13.909796 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6bddd46_8347_483a_b49d_470d41ee528c.slice/crio-c9e4f6b3a01dcc7aa6369391762ad8c58e284954af1cfec2cb17fc80ae7ae86b WatchSource:0}: Error finding container c9e4f6b3a01dcc7aa6369391762ad8c58e284954af1cfec2cb17fc80ae7ae86b: Status 404 returned error can't find the container with id c9e4f6b3a01dcc7aa6369391762ad8c58e284954af1cfec2cb17fc80ae7ae86b Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.936204 5033 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.939324 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-041e-account-create-update-6zjnq" event={"ID":"33d9c0d5-6eff-4216-a63e-cdc21eaacf42","Type":"ContainerStarted","Data":"29336dc6de96f21425b6bde6e2f96d3d15ec91651752f9e9124b4f96aa3b219d"} Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.942968 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6sfh6" event={"ID":"9c4673c4-7bff-4708-8ace-4931c3091789","Type":"ContainerStarted","Data":"6af362895f9d07633a68edb21b9bc612c00c033c342c5865c8ae05e37c6c5847"} Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.982445 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" event={"ID":"b7246c5a-8508-4f21-ac7a-76d218044d4c","Type":"ContainerDied","Data":"a5f59e5232250a8dac761019a6c5365ff522ffcae235e4f64d0c84d2376e9b72"} Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.984842 5033 scope.go:117] "RemoveContainer" containerID="e1e88b8703395f870140ff3011ae32e83bb5f9661f7c45eb76a45cdcdd83c106" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.986810 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-p5bpd" Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.993492 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f27d3bc-7305-451b-8dd6-a0b2765cc3b0","Type":"ContainerDied","Data":"ddd88155fc5bcf454cc0588f82a7b837678f5ba4e22ca17821b2b1f703c7f0c8"} Feb 26 16:04:13 crc kubenswrapper[5033]: I0226 16:04:13.993588 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.001173 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zt7xr" event={"ID":"73d77fb6-14aa-4c7d-9ad1-093e95473c0f","Type":"ContainerStarted","Data":"7a96599c666be19f05a81fcbada27de8f9287bd7cc3d9a4f0ab7fa6dfac624aa"} Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.008473 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-43b7-account-create-update-gbdhs" event={"ID":"0dac4b71-5c96-482d-b6d2-656990c1d873","Type":"ContainerStarted","Data":"50a746810b8fae79d068701d71b64e846909ad4152f61270e9bc2c205fd55b5a"} Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.011076 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c4c558dd7-x5jl5" event={"ID":"eb635e2d-e92d-4f18-be4b-972e388e8c25","Type":"ContainerStarted","Data":"b5d444dfc4ae3c7793f31a10c5f76fd337e67d0e948feee939f1da607736576e"} Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.013957 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.024645 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"19e1a391-4e54-4cee-9c88-34c9e99d71fa","Type":"ContainerDied","Data":"2d61b32ec1248749b78b8ad80dcf4818c724a02d34739f7f44fcfb315ea4733c"} Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.111760 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n4jjz"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.114475 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.130296 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n4jjz"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.172751 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-utilities\") pod \"redhat-marketplace-n4jjz\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.172801 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbxsn\" (UniqueName: \"kubernetes.io/projected/7d74d149-c9dd-4307-9d7b-63f9e56570f6-kube-api-access-rbxsn\") pod \"redhat-marketplace-n4jjz\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.172880 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-catalog-content\") pod \"redhat-marketplace-n4jjz\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.274204 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-utilities\") pod \"redhat-marketplace-n4jjz\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.274499 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rbxsn\" (UniqueName: \"kubernetes.io/projected/7d74d149-c9dd-4307-9d7b-63f9e56570f6-kube-api-access-rbxsn\") pod \"redhat-marketplace-n4jjz\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.274561 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-catalog-content\") pod \"redhat-marketplace-n4jjz\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.275098 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-catalog-content\") pod \"redhat-marketplace-n4jjz\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.275158 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-utilities\") pod \"redhat-marketplace-n4jjz\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.276591 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-p5bpd"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.288390 5033 scope.go:117] "RemoveContainer" containerID="f904a5ad2956da4d61a1720665d277c271058e99bcae935155400081d797e70a" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.288469 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-p5bpd"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 
16:04:14.297946 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.312335 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.324986 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.339899 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbxsn\" (UniqueName: \"kubernetes.io/projected/7d74d149-c9dd-4307-9d7b-63f9e56570f6-kube-api-access-rbxsn\") pod \"redhat-marketplace-n4jjz\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.351094 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.352551 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.359966 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.362509 5033 scope.go:117] "RemoveContainer" containerID="0928a95859008e017abdb4f3911ce683acef72bfa6b55cb0c3fb61218d18e121" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.362893 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.373617 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.381855 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c26g5\" (UniqueName: \"kubernetes.io/projected/801b1613-3ed1-423d-85cf-6b87f6711034-kube-api-access-c26g5\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.381927 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.381950 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801b1613-3ed1-423d-85cf-6b87f6711034-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.381964 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-scripts\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.381981 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.382065 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.395405 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.418614 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.423546 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.457786 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.469808 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.484741 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.485296 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c26g5\" (UniqueName: \"kubernetes.io/projected/801b1613-3ed1-423d-85cf-6b87f6711034-kube-api-access-c26g5\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.485436 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.485470 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801b1613-3ed1-423d-85cf-6b87f6711034-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.485514 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801b1613-3ed1-423d-85cf-6b87f6711034-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.485546 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-scripts\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.485590 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.504633 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-scripts\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.507215 5033 scope.go:117] "RemoveContainer" containerID="ae05a73127e8a5569d83520289b9d950307213560a9017a627b1bb097b267616" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.511599 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.514189 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.516279 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.539108 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c26g5\" (UniqueName: \"kubernetes.io/projected/801b1613-3ed1-423d-85cf-6b87f6711034-kube-api-access-c26g5\") pod \"cinder-scheduler-0\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.548200 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.577502 5033 scope.go:117] "RemoveContainer" containerID="751099a6fd3491807a70d3af4718e1c4d1a7f27761350ddb2b8727da704f7e06" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.590233 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-config-data\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.590345 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-scripts\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.590421 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-logs\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.590463 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.590499 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.591525 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh6v9\" (UniqueName: \"kubernetes.io/projected/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-kube-api-access-sh6v9\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: 
I0226 16:04:14.591579 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.591621 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.630605 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.642134 5033 scope.go:117] "RemoveContainer" containerID="5b44a06a2b3944d4f7000b47c862b2ab7a0f04f864ceac7276e4a15e1ad9a689" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.708977 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-logs\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.709051 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.709094 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.709196 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh6v9\" (UniqueName: \"kubernetes.io/projected/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-kube-api-access-sh6v9\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.709242 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.709350 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.709389 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-config-data\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.709562 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-scripts\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.709595 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-logs\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.709956 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.711291 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.714960 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-scripts\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.717913 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.725390 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.730011 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-config-data\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.750006 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh6v9\" (UniqueName: \"kubernetes.io/projected/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-kube-api-access-sh6v9\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.756381 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.766200 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:14 crc kubenswrapper[5033]: W0226 16:04:14.778377 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73414a21_a5e9_4ee9_a428_e302c86bf718.slice/crio-8a282aed93c57372e00723d4d17457662f230e2c9266050abee6f77e74856ab1 WatchSource:0}: Error finding container 8a282aed93c57372e00723d4d17457662f230e2c9266050abee6f77e74856ab1: Status 404 returned error can't find the container with id 8a282aed93c57372e00723d4d17457662f230e2c9266050abee6f77e74856ab1 Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.786557 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:04:14 crc kubenswrapper[5033]: I0226 16:04:14.836706 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.040646 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-041e-account-create-update-6zjnq" event={"ID":"33d9c0d5-6eff-4216-a63e-cdc21eaacf42","Type":"ContainerStarted","Data":"601ec1341d1e2b8c27d9c9b09bc6aa50cda535ffeee2dea5a0f7b8a4a08ee53d"} Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.068953 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c4c558dd7-x5jl5" event={"ID":"eb635e2d-e92d-4f18-be4b-972e388e8c25","Type":"ContainerStarted","Data":"6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765"} Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.069683 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-041e-account-create-update-6zjnq" podStartSLOduration=4.069669768 podStartE2EDuration="4.069669768s" podCreationTimestamp="2026-02-26 16:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:15.058467299 +0000 UTC m=+1382.271065949" watchObservedRunningTime="2026-02-26 16:04:15.069669768 +0000 UTC 
m=+1382.282268428" Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.083665 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-56c7-account-create-update-dqbl5" event={"ID":"b1a1bbea-b471-4f30-abe1-de75f9641b24","Type":"ContainerStarted","Data":"5b1ffc7d5c02c07b9555b87eb62633cee02ea5a2f677dd7257675723ea705b37"} Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.083721 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-56c7-account-create-update-dqbl5" event={"ID":"b1a1bbea-b471-4f30-abe1-de75f9641b24","Type":"ContainerStarted","Data":"ac9ffeb75c1260e5e8a6ae80483cb4d84ed2b83e72a47ebaa3dc1ab25a290a87"} Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.097798 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-43b7-account-create-update-gbdhs" event={"ID":"0dac4b71-5c96-482d-b6d2-656990c1d873","Type":"ContainerStarted","Data":"a4dfe992eb2b9293a5a29973827b5efe4e44f2fdd0cbef2c70caa804c30efe24"} Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.103755 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bf0b01b9-0085-4552-8a7d-1e7359dad79b","Type":"ContainerStarted","Data":"56ec88a7fa9446b4543b463ffdb1a7c796f944337d736fcb3994797aeedb19eb"} Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.115126 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6sfh6" event={"ID":"9c4673c4-7bff-4708-8ace-4931c3091789","Type":"ContainerStarted","Data":"1e8f8f1fb9302e352c7430cffae0a9881509b338bd958c62de8c5b00c66ea405"} Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.115621 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-56c7-account-create-update-dqbl5" podStartSLOduration=5.115608985 podStartE2EDuration="5.115608985s" podCreationTimestamp="2026-02-26 16:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:15.109246516 +0000 UTC m=+1382.321845186" watchObservedRunningTime="2026-02-26 16:04:15.115608985 +0000 UTC m=+1382.328207625" Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.128461 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73414a21-a5e9-4ee9-a428-e302c86bf718","Type":"ContainerStarted","Data":"8a282aed93c57372e00723d4d17457662f230e2c9266050abee6f77e74856ab1"} Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.149250 5033 generic.go:334] "Generic (PLEG): container finished" podID="73d77fb6-14aa-4c7d-9ad1-093e95473c0f" containerID="7290302f1aa5e4c3e4db1afa9200af7fbef86671c0765f651efc74f6caefccdc" exitCode=0 Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.149435 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zt7xr" event={"ID":"73d77fb6-14aa-4c7d-9ad1-093e95473c0f","Type":"ContainerDied","Data":"7290302f1aa5e4c3e4db1afa9200af7fbef86671c0765f651efc74f6caefccdc"} Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.154468 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-43b7-account-create-update-gbdhs" podStartSLOduration=5.154443635 podStartE2EDuration="5.154443635s" podCreationTimestamp="2026-02-26 16:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:15.123735258 +0000 UTC m=+1382.336333898" watchObservedRunningTime="2026-02-26 16:04:15.154443635 +0000 UTC m=+1382.367042285" Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.167705 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-6sfh6" podStartSLOduration=5.167676575 podStartE2EDuration="5.167676575s" podCreationTimestamp="2026-02-26 16:04:10 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:15.139724427 +0000 UTC m=+1382.352323097" watchObservedRunningTime="2026-02-26 16:04:15.167676575 +0000 UTC m=+1382.380275225" Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.179164 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k9qcs" event={"ID":"e6bddd46-8347-483a-b49d-470d41ee528c","Type":"ContainerStarted","Data":"3be7d1a82e288ba046a1c6b4bbc24dae70e39ba857a65876854c21c435f5b1de"} Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.179203 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k9qcs" event={"ID":"e6bddd46-8347-483a-b49d-470d41ee528c","Type":"ContainerStarted","Data":"c9e4f6b3a01dcc7aa6369391762ad8c58e284954af1cfec2cb17fc80ae7ae86b"} Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.262616 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-k9qcs" podStartSLOduration=5.262600114 podStartE2EDuration="5.262600114s" podCreationTimestamp="2026-02-26 16:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:15.206034703 +0000 UTC m=+1382.418633363" watchObservedRunningTime="2026-02-26 16:04:15.262600114 +0000 UTC m=+1382.475198764" Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.271065 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n4jjz"] Feb 26 16:04:15 crc kubenswrapper[5033]: W0226 16:04:15.293277 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d74d149_c9dd_4307_9d7b_63f9e56570f6.slice/crio-eb4a21c61bffcc98f1a81feacc6505f5508d2b43e9cc07776f87b4a4a32fbeca WatchSource:0}: Error finding container 
eb4a21c61bffcc98f1a81feacc6505f5508d2b43e9cc07776f87b4a4a32fbeca: Status 404 returned error can't find the container with id eb4a21c61bffcc98f1a81feacc6505f5508d2b43e9cc07776f87b4a4a32fbeca Feb 26 16:04:15 crc kubenswrapper[5033]: W0226 16:04:15.489404 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801b1613_3ed1_423d_85cf_6b87f6711034.slice/crio-3ff873b87303a9670e15bafd9f8f11b63f65948e80aa866f81029ce7d6cdc4af WatchSource:0}: Error finding container 3ff873b87303a9670e15bafd9f8f11b63f65948e80aa866f81029ce7d6cdc4af: Status 404 returned error can't find the container with id 3ff873b87303a9670e15bafd9f8f11b63f65948e80aa866f81029ce7d6cdc4af Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.492193 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.645709 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e1a391-4e54-4cee-9c88-34c9e99d71fa" path="/var/lib/kubelet/pods/19e1a391-4e54-4cee-9c88-34c9e99d71fa/volumes" Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.646990 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f27d3bc-7305-451b-8dd6-a0b2765cc3b0" path="/var/lib/kubelet/pods/3f27d3bc-7305-451b-8dd6-a0b2765cc3b0/volumes" Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.648654 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7246c5a-8508-4f21-ac7a-76d218044d4c" path="/var/lib/kubelet/pods/b7246c5a-8508-4f21-ac7a-76d218044d4c/volumes" Feb 26 16:04:15 crc kubenswrapper[5033]: I0226 16:04:15.716858 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.193115 5033 generic.go:334] "Generic (PLEG): container finished" podID="33d9c0d5-6eff-4216-a63e-cdc21eaacf42" 
containerID="601ec1341d1e2b8c27d9c9b09bc6aa50cda535ffeee2dea5a0f7b8a4a08ee53d" exitCode=0 Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.193298 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-041e-account-create-update-6zjnq" event={"ID":"33d9c0d5-6eff-4216-a63e-cdc21eaacf42","Type":"ContainerDied","Data":"601ec1341d1e2b8c27d9c9b09bc6aa50cda535ffeee2dea5a0f7b8a4a08ee53d"} Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.197140 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bf0b01b9-0085-4552-8a7d-1e7359dad79b","Type":"ContainerStarted","Data":"a46ddf9c39fa3f9620a8c8a0c8752e122ad6d05de939c29de0a166524be100e8"} Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.201576 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c4c558dd7-x5jl5" event={"ID":"eb635e2d-e92d-4f18-be4b-972e388e8c25","Type":"ContainerStarted","Data":"9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa"} Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.202449 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c4c558dd7-x5jl5" Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.202509 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c4c558dd7-x5jl5" Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.205773 5033 generic.go:334] "Generic (PLEG): container finished" podID="9c4673c4-7bff-4708-8ace-4931c3091789" containerID="1e8f8f1fb9302e352c7430cffae0a9881509b338bd958c62de8c5b00c66ea405" exitCode=0 Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.205940 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6sfh6" event={"ID":"9c4673c4-7bff-4708-8ace-4931c3091789","Type":"ContainerDied","Data":"1e8f8f1fb9302e352c7430cffae0a9881509b338bd958c62de8c5b00c66ea405"} Feb 26 16:04:16 crc 
kubenswrapper[5033]: I0226 16:04:16.213538 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"801b1613-3ed1-423d-85cf-6b87f6711034","Type":"ContainerStarted","Data":"3ff873b87303a9670e15bafd9f8f11b63f65948e80aa866f81029ce7d6cdc4af"} Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.219675 5033 generic.go:334] "Generic (PLEG): container finished" podID="e6bddd46-8347-483a-b49d-470d41ee528c" containerID="3be7d1a82e288ba046a1c6b4bbc24dae70e39ba857a65876854c21c435f5b1de" exitCode=0 Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.219739 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k9qcs" event={"ID":"e6bddd46-8347-483a-b49d-470d41ee528c","Type":"ContainerDied","Data":"3be7d1a82e288ba046a1c6b4bbc24dae70e39ba857a65876854c21c435f5b1de"} Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.223072 5033 generic.go:334] "Generic (PLEG): container finished" podID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerID="6e63ba0cb95fec956472de50adf74b720ee2c5395b25b7773ffd73e1338361a9" exitCode=0 Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.223150 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n4jjz" event={"ID":"7d74d149-c9dd-4307-9d7b-63f9e56570f6","Type":"ContainerDied","Data":"6e63ba0cb95fec956472de50adf74b720ee2c5395b25b7773ffd73e1338361a9"} Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.223182 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n4jjz" event={"ID":"7d74d149-c9dd-4307-9d7b-63f9e56570f6","Type":"ContainerStarted","Data":"eb4a21c61bffcc98f1a81feacc6505f5508d2b43e9cc07776f87b4a4a32fbeca"} Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.224920 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3","Type":"ContainerStarted","Data":"100335d3bfe986ed47539164838c321a85292a0e9696af813e0ac893ab0a9b0b"} Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.245147 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-c4c558dd7-x5jl5" podStartSLOduration=12.24511909 podStartE2EDuration="12.24511909s" podCreationTimestamp="2026-02-26 16:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:16.23311661 +0000 UTC m=+1383.445715260" watchObservedRunningTime="2026-02-26 16:04:16.24511909 +0000 UTC m=+1383.457717740" Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.743619 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zt7xr" Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.858440 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-operator-scripts\") pod \"73d77fb6-14aa-4c7d-9ad1-093e95473c0f\" (UID: \"73d77fb6-14aa-4c7d-9ad1-093e95473c0f\") " Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.858524 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn8mr\" (UniqueName: \"kubernetes.io/projected/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-kube-api-access-bn8mr\") pod \"73d77fb6-14aa-4c7d-9ad1-093e95473c0f\" (UID: \"73d77fb6-14aa-4c7d-9ad1-093e95473c0f\") " Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.860221 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73d77fb6-14aa-4c7d-9ad1-093e95473c0f" (UID: "73d77fb6-14aa-4c7d-9ad1-093e95473c0f"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.869328 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-kube-api-access-bn8mr" (OuterVolumeSpecName: "kube-api-access-bn8mr") pod "73d77fb6-14aa-4c7d-9ad1-093e95473c0f" (UID: "73d77fb6-14aa-4c7d-9ad1-093e95473c0f"). InnerVolumeSpecName "kube-api-access-bn8mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.960408 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn8mr\" (UniqueName: \"kubernetes.io/projected/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-kube-api-access-bn8mr\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:16 crc kubenswrapper[5033]: I0226 16:04:16.960441 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73d77fb6-14aa-4c7d-9ad1-093e95473c0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.280856 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73414a21-a5e9-4ee9-a428-e302c86bf718","Type":"ContainerStarted","Data":"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8"} Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.283184 5033 generic.go:334] "Generic (PLEG): container finished" podID="b1a1bbea-b471-4f30-abe1-de75f9641b24" containerID="5b1ffc7d5c02c07b9555b87eb62633cee02ea5a2f677dd7257675723ea705b37" exitCode=0 Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.283237 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-56c7-account-create-update-dqbl5" event={"ID":"b1a1bbea-b471-4f30-abe1-de75f9641b24","Type":"ContainerDied","Data":"5b1ffc7d5c02c07b9555b87eb62633cee02ea5a2f677dd7257675723ea705b37"} Feb 
26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.300392 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"801b1613-3ed1-423d-85cf-6b87f6711034","Type":"ContainerStarted","Data":"594bc92b21cdaf12c49af3fc339f3186f938691726d83e69d1538a692373df92"} Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.329844 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zt7xr" event={"ID":"73d77fb6-14aa-4c7d-9ad1-093e95473c0f","Type":"ContainerDied","Data":"7a96599c666be19f05a81fcbada27de8f9287bd7cc3d9a4f0ab7fa6dfac624aa"} Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.329898 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a96599c666be19f05a81fcbada27de8f9287bd7cc3d9a4f0ab7fa6dfac624aa" Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.329975 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zt7xr" Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.349618 5033 generic.go:334] "Generic (PLEG): container finished" podID="0dac4b71-5c96-482d-b6d2-656990c1d873" containerID="a4dfe992eb2b9293a5a29973827b5efe4e44f2fdd0cbef2c70caa804c30efe24" exitCode=0 Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.349723 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-43b7-account-create-update-gbdhs" event={"ID":"0dac4b71-5c96-482d-b6d2-656990c1d873","Type":"ContainerDied","Data":"a4dfe992eb2b9293a5a29973827b5efe4e44f2fdd0cbef2c70caa804c30efe24"} Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.359895 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3","Type":"ContainerStarted","Data":"3a534aa8128f826e879da52f7e79951a6fad47306bfecf8692f8608dfd09639b"} Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.369634 5033 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bf0b01b9-0085-4552-8a7d-1e7359dad79b","Type":"ContainerStarted","Data":"e00abe8f8586a8c6516438836f22e8cd867774bd04666cb5631a9962be629702"} Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.778724 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k9qcs" Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.790218 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.790040593 podStartE2EDuration="5.790040593s" podCreationTimestamp="2026-02-26 16:04:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:17.396667734 +0000 UTC m=+1384.609266384" watchObservedRunningTime="2026-02-26 16:04:17.790040593 +0000 UTC m=+1385.002639243" Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.893753 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6bddd46-8347-483a-b49d-470d41ee528c-operator-scripts\") pod \"e6bddd46-8347-483a-b49d-470d41ee528c\" (UID: \"e6bddd46-8347-483a-b49d-470d41ee528c\") " Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.893810 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvthx\" (UniqueName: \"kubernetes.io/projected/e6bddd46-8347-483a-b49d-470d41ee528c-kube-api-access-wvthx\") pod \"e6bddd46-8347-483a-b49d-470d41ee528c\" (UID: \"e6bddd46-8347-483a-b49d-470d41ee528c\") " Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.896925 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bddd46-8347-483a-b49d-470d41ee528c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"e6bddd46-8347-483a-b49d-470d41ee528c" (UID: "e6bddd46-8347-483a-b49d-470d41ee528c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.900346 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6bddd46-8347-483a-b49d-470d41ee528c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.906858 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6bddd46-8347-483a-b49d-470d41ee528c-kube-api-access-wvthx" (OuterVolumeSpecName: "kube-api-access-wvthx") pod "e6bddd46-8347-483a-b49d-470d41ee528c" (UID: "e6bddd46-8347-483a-b49d-470d41ee528c"). InnerVolumeSpecName "kube-api-access-wvthx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.983225 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6sfh6" Feb 26 16:04:17 crc kubenswrapper[5033]: I0226 16:04:17.989514 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-041e-account-create-update-6zjnq" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.001736 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvthx\" (UniqueName: \"kubernetes.io/projected/e6bddd46-8347-483a-b49d-470d41ee528c-kube-api-access-wvthx\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.104478 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c4673c4-7bff-4708-8ace-4931c3091789-operator-scripts\") pod \"9c4673c4-7bff-4708-8ace-4931c3091789\" (UID: \"9c4673c4-7bff-4708-8ace-4931c3091789\") " Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.104661 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbdvv\" (UniqueName: \"kubernetes.io/projected/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-kube-api-access-xbdvv\") pod \"33d9c0d5-6eff-4216-a63e-cdc21eaacf42\" (UID: \"33d9c0d5-6eff-4216-a63e-cdc21eaacf42\") " Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.104992 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4673c4-7bff-4708-8ace-4931c3091789-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c4673c4-7bff-4708-8ace-4931c3091789" (UID: "9c4673c4-7bff-4708-8ace-4931c3091789"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.105027 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsrq9\" (UniqueName: \"kubernetes.io/projected/9c4673c4-7bff-4708-8ace-4931c3091789-kube-api-access-tsrq9\") pod \"9c4673c4-7bff-4708-8ace-4931c3091789\" (UID: \"9c4673c4-7bff-4708-8ace-4931c3091789\") " Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.105100 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-operator-scripts\") pod \"33d9c0d5-6eff-4216-a63e-cdc21eaacf42\" (UID: \"33d9c0d5-6eff-4216-a63e-cdc21eaacf42\") " Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.105478 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c4673c4-7bff-4708-8ace-4931c3091789-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.105847 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33d9c0d5-6eff-4216-a63e-cdc21eaacf42" (UID: "33d9c0d5-6eff-4216-a63e-cdc21eaacf42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.113895 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c4673c4-7bff-4708-8ace-4931c3091789-kube-api-access-tsrq9" (OuterVolumeSpecName: "kube-api-access-tsrq9") pod "9c4673c4-7bff-4708-8ace-4931c3091789" (UID: "9c4673c4-7bff-4708-8ace-4931c3091789"). InnerVolumeSpecName "kube-api-access-tsrq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.119859 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-kube-api-access-xbdvv" (OuterVolumeSpecName: "kube-api-access-xbdvv") pod "33d9c0d5-6eff-4216-a63e-cdc21eaacf42" (UID: "33d9c0d5-6eff-4216-a63e-cdc21eaacf42"). InnerVolumeSpecName "kube-api-access-xbdvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.207037 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.207089 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbdvv\" (UniqueName: \"kubernetes.io/projected/33d9c0d5-6eff-4216-a63e-cdc21eaacf42-kube-api-access-xbdvv\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.207104 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsrq9\" (UniqueName: \"kubernetes.io/projected/9c4673c4-7bff-4708-8ace-4931c3091789-kube-api-access-tsrq9\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.375336 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-6sfh6" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.375346 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6sfh6" event={"ID":"9c4673c4-7bff-4708-8ace-4931c3091789","Type":"ContainerDied","Data":"6af362895f9d07633a68edb21b9bc612c00c033c342c5865c8ae05e37c6c5847"} Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.375382 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6af362895f9d07633a68edb21b9bc612c00c033c342c5865c8ae05e37c6c5847" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.376594 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k9qcs" event={"ID":"e6bddd46-8347-483a-b49d-470d41ee528c","Type":"ContainerDied","Data":"c9e4f6b3a01dcc7aa6369391762ad8c58e284954af1cfec2cb17fc80ae7ae86b"} Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.376617 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9e4f6b3a01dcc7aa6369391762ad8c58e284954af1cfec2cb17fc80ae7ae86b" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.376668 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k9qcs" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.380172 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-041e-account-create-update-6zjnq" event={"ID":"33d9c0d5-6eff-4216-a63e-cdc21eaacf42","Type":"ContainerDied","Data":"29336dc6de96f21425b6bde6e2f96d3d15ec91651752f9e9124b4f96aa3b219d"} Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.380211 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29336dc6de96f21425b6bde6e2f96d3d15ec91651752f9e9124b4f96aa3b219d" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.380396 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-041e-account-create-update-6zjnq" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.820624 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-43b7-account-create-update-gbdhs" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.825824 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-56c7-account-create-update-dqbl5" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.931087 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a1bbea-b471-4f30-abe1-de75f9641b24-operator-scripts\") pod \"b1a1bbea-b471-4f30-abe1-de75f9641b24\" (UID: \"b1a1bbea-b471-4f30-abe1-de75f9641b24\") " Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.931216 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4v5z\" (UniqueName: \"kubernetes.io/projected/0dac4b71-5c96-482d-b6d2-656990c1d873-kube-api-access-j4v5z\") pod \"0dac4b71-5c96-482d-b6d2-656990c1d873\" (UID: \"0dac4b71-5c96-482d-b6d2-656990c1d873\") " Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.931318 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dac4b71-5c96-482d-b6d2-656990c1d873-operator-scripts\") pod \"0dac4b71-5c96-482d-b6d2-656990c1d873\" (UID: \"0dac4b71-5c96-482d-b6d2-656990c1d873\") " Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.931372 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hr6b\" (UniqueName: \"kubernetes.io/projected/b1a1bbea-b471-4f30-abe1-de75f9641b24-kube-api-access-5hr6b\") pod \"b1a1bbea-b471-4f30-abe1-de75f9641b24\" (UID: \"b1a1bbea-b471-4f30-abe1-de75f9641b24\") " Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 
16:04:18.931628 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a1bbea-b471-4f30-abe1-de75f9641b24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1a1bbea-b471-4f30-abe1-de75f9641b24" (UID: "b1a1bbea-b471-4f30-abe1-de75f9641b24"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.932142 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a1bbea-b471-4f30-abe1-de75f9641b24-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.933288 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dac4b71-5c96-482d-b6d2-656990c1d873-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0dac4b71-5c96-482d-b6d2-656990c1d873" (UID: "0dac4b71-5c96-482d-b6d2-656990c1d873"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.935315 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a1bbea-b471-4f30-abe1-de75f9641b24-kube-api-access-5hr6b" (OuterVolumeSpecName: "kube-api-access-5hr6b") pod "b1a1bbea-b471-4f30-abe1-de75f9641b24" (UID: "b1a1bbea-b471-4f30-abe1-de75f9641b24"). InnerVolumeSpecName "kube-api-access-5hr6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[5033]: I0226 16:04:18.935757 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dac4b71-5c96-482d-b6d2-656990c1d873-kube-api-access-j4v5z" (OuterVolumeSpecName: "kube-api-access-j4v5z") pod "0dac4b71-5c96-482d-b6d2-656990c1d873" (UID: "0dac4b71-5c96-482d-b6d2-656990c1d873"). InnerVolumeSpecName "kube-api-access-j4v5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.033847 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4v5z\" (UniqueName: \"kubernetes.io/projected/0dac4b71-5c96-482d-b6d2-656990c1d873-kube-api-access-j4v5z\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.033881 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dac4b71-5c96-482d-b6d2-656990c1d873-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.033894 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hr6b\" (UniqueName: \"kubernetes.io/projected/b1a1bbea-b471-4f30-abe1-de75f9641b24-kube-api-access-5hr6b\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.394548 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-43b7-account-create-update-gbdhs" event={"ID":"0dac4b71-5c96-482d-b6d2-656990c1d873","Type":"ContainerDied","Data":"50a746810b8fae79d068701d71b64e846909ad4152f61270e9bc2c205fd55b5a"} Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.394930 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50a746810b8fae79d068701d71b64e846909ad4152f61270e9bc2c205fd55b5a" Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.395026 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-43b7-account-create-update-gbdhs" Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.422693 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3","Type":"ContainerStarted","Data":"2f2bce733fa63476566ab5bf20d7a8fb1718f0f35fccb3db14100b8bf61575d0"} Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.430707 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-56c7-account-create-update-dqbl5" event={"ID":"b1a1bbea-b471-4f30-abe1-de75f9641b24","Type":"ContainerDied","Data":"ac9ffeb75c1260e5e8a6ae80483cb4d84ed2b83e72a47ebaa3dc1ab25a290a87"} Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.430758 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac9ffeb75c1260e5e8a6ae80483cb4d84ed2b83e72a47ebaa3dc1ab25a290a87" Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.430846 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-56c7-account-create-update-dqbl5" Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.436519 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"801b1613-3ed1-423d-85cf-6b87f6711034","Type":"ContainerStarted","Data":"81b00d0b1c13dc1cda83736d47b6fac79d1887078858c3ef5eaedcb3b16e0883"} Feb 26 16:04:19 crc kubenswrapper[5033]: I0226 16:04:19.588719 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c4c558dd7-x5jl5" Feb 26 16:04:20 crc kubenswrapper[5033]: I0226 16:04:20.189968 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:04:20 crc kubenswrapper[5033]: I0226 16:04:20.190058 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.602445 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lkrpq"] Feb 26 16:04:21 crc kubenswrapper[5033]: E0226 16:04:21.603028 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6bddd46-8347-483a-b49d-470d41ee528c" containerName="mariadb-database-create" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603039 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6bddd46-8347-483a-b49d-470d41ee528c" containerName="mariadb-database-create" Feb 26 16:04:21 crc kubenswrapper[5033]: E0226 16:04:21.603051 5033 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="73d77fb6-14aa-4c7d-9ad1-093e95473c0f" containerName="mariadb-database-create" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603058 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d77fb6-14aa-4c7d-9ad1-093e95473c0f" containerName="mariadb-database-create" Feb 26 16:04:21 crc kubenswrapper[5033]: E0226 16:04:21.603074 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4673c4-7bff-4708-8ace-4931c3091789" containerName="mariadb-database-create" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603080 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4673c4-7bff-4708-8ace-4931c3091789" containerName="mariadb-database-create" Feb 26 16:04:21 crc kubenswrapper[5033]: E0226 16:04:21.603104 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a1bbea-b471-4f30-abe1-de75f9641b24" containerName="mariadb-account-create-update" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603110 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a1bbea-b471-4f30-abe1-de75f9641b24" containerName="mariadb-account-create-update" Feb 26 16:04:21 crc kubenswrapper[5033]: E0226 16:04:21.603120 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dac4b71-5c96-482d-b6d2-656990c1d873" containerName="mariadb-account-create-update" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603126 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dac4b71-5c96-482d-b6d2-656990c1d873" containerName="mariadb-account-create-update" Feb 26 16:04:21 crc kubenswrapper[5033]: E0226 16:04:21.603139 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d9c0d5-6eff-4216-a63e-cdc21eaacf42" containerName="mariadb-account-create-update" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603144 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d9c0d5-6eff-4216-a63e-cdc21eaacf42" 
containerName="mariadb-account-create-update" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603326 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6bddd46-8347-483a-b49d-470d41ee528c" containerName="mariadb-database-create" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603340 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dac4b71-5c96-482d-b6d2-656990c1d873" containerName="mariadb-account-create-update" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603347 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d77fb6-14aa-4c7d-9ad1-093e95473c0f" containerName="mariadb-database-create" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603354 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d9c0d5-6eff-4216-a63e-cdc21eaacf42" containerName="mariadb-account-create-update" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603366 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a1bbea-b471-4f30-abe1-de75f9641b24" containerName="mariadb-account-create-update" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603381 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4673c4-7bff-4708-8ace-4931c3091789" containerName="mariadb-database-create" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.603904 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.607676 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.607918 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.608037 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-n8cn8" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.617132 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lkrpq"] Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.685469 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6txww\" (UniqueName: \"kubernetes.io/projected/329480df-036d-4b95-ad02-08a923116026-kube-api-access-6txww\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.685525 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.685640 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-config-data\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " 
pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.686037 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-scripts\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.787869 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-scripts\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.787960 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6txww\" (UniqueName: \"kubernetes.io/projected/329480df-036d-4b95-ad02-08a923116026-kube-api-access-6txww\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.788016 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.788081 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-config-data\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " 
pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.794460 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.794881 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-scripts\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.806244 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-config-data\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.813900 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6txww\" (UniqueName: \"kubernetes.io/projected/329480df-036d-4b95-ad02-08a923116026-kube-api-access-6txww\") pod \"nova-cell0-conductor-db-sync-lkrpq\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:21 crc kubenswrapper[5033]: I0226 16:04:21.925213 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:22 crc kubenswrapper[5033]: I0226 16:04:22.501397 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.501372636 podStartE2EDuration="8.501372636s" podCreationTimestamp="2026-02-26 16:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:22.480503405 +0000 UTC m=+1389.693102065" watchObservedRunningTime="2026-02-26 16:04:22.501372636 +0000 UTC m=+1389.713971286" Feb 26 16:04:22 crc kubenswrapper[5033]: I0226 16:04:22.530381 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=8.53035581 podStartE2EDuration="8.53035581s" podCreationTimestamp="2026-02-26 16:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:22.52516179 +0000 UTC m=+1389.737760440" watchObservedRunningTime="2026-02-26 16:04:22.53035581 +0000 UTC m=+1389.742954460" Feb 26 16:04:23 crc kubenswrapper[5033]: I0226 16:04:23.126570 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lkrpq"] Feb 26 16:04:23 crc kubenswrapper[5033]: W0226 16:04:23.129363 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329480df_036d_4b95_ad02_08a923116026.slice/crio-22b348405f45b8dd89ca284be1c8deb32933f1f41828eb214c33efe14fa85993 WatchSource:0}: Error finding container 22b348405f45b8dd89ca284be1c8deb32933f1f41828eb214c33efe14fa85993: Status 404 returned error can't find the container with id 22b348405f45b8dd89ca284be1c8deb32933f1f41828eb214c33efe14fa85993 Feb 26 16:04:23 crc kubenswrapper[5033]: I0226 16:04:23.473188 5033 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73414a21-a5e9-4ee9-a428-e302c86bf718","Type":"ContainerStarted","Data":"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d"} Feb 26 16:04:23 crc kubenswrapper[5033]: I0226 16:04:23.484527 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[5033]: I0226 16:04:23.484563 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[5033]: I0226 16:04:23.485162 5033 generic.go:334] "Generic (PLEG): container finished" podID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerID="1208aa6ad11100ef302384ab5acf122f6dc947fa705607cc60f9c62d2af45268" exitCode=0 Feb 26 16:04:23 crc kubenswrapper[5033]: I0226 16:04:23.485288 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n4jjz" event={"ID":"7d74d149-c9dd-4307-9d7b-63f9e56570f6","Type":"ContainerDied","Data":"1208aa6ad11100ef302384ab5acf122f6dc947fa705607cc60f9c62d2af45268"} Feb 26 16:04:23 crc kubenswrapper[5033]: I0226 16:04:23.493264 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lkrpq" event={"ID":"329480df-036d-4b95-ad02-08a923116026","Type":"ContainerStarted","Data":"22b348405f45b8dd89ca284be1c8deb32933f1f41828eb214c33efe14fa85993"} Feb 26 16:04:23 crc kubenswrapper[5033]: I0226 16:04:23.543374 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[5033]: I0226 16:04:23.543828 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.521276 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"73414a21-a5e9-4ee9-a428-e302c86bf718","Type":"ContainerStarted","Data":"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216"} Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.525324 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n4jjz" event={"ID":"7d74d149-c9dd-4307-9d7b-63f9e56570f6","Type":"ContainerStarted","Data":"56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686"} Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.525742 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.527523 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.554845 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n4jjz" podStartSLOduration=2.826106682 podStartE2EDuration="10.554823714s" podCreationTimestamp="2026-02-26 16:04:14 +0000 UTC" firstStartedPulling="2026-02-26 16:04:16.225360287 +0000 UTC m=+1383.437958937" lastFinishedPulling="2026-02-26 16:04:23.954077319 +0000 UTC m=+1391.166675969" observedRunningTime="2026-02-26 16:04:24.546426575 +0000 UTC m=+1391.759025225" watchObservedRunningTime="2026-02-26 16:04:24.554823714 +0000 UTC m=+1391.767422364" Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.589070 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c4c558dd7-x5jl5" Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.634053 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.634095 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.788337 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.803227 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.839982 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.840162 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.896595 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 16:04:24 crc kubenswrapper[5033]: I0226 16:04:24.912118 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[5033]: I0226 16:04:25.102432 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 26 16:04:25 crc kubenswrapper[5033]: I0226 16:04:25.538701 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[5033]: I0226 16:04:25.539046 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[5033]: I0226 16:04:25.742053 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-n4jjz" podUID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerName="registry-server" probeResult="failure" output=< Feb 26 16:04:25 crc kubenswrapper[5033]: timeout: failed to connect service 
":50051" within 1s Feb 26 16:04:25 crc kubenswrapper[5033]: > Feb 26 16:04:26 crc kubenswrapper[5033]: I0226 16:04:25.999934 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:04:26 crc kubenswrapper[5033]: I0226 16:04:26.437936 5033 scope.go:117] "RemoveContainer" containerID="f9b9bdb8473f14761b65adb072ad4c4b45ed533735f4281f6c4543402f18d5e5" Feb 26 16:04:26 crc kubenswrapper[5033]: I0226 16:04:26.555546 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73414a21-a5e9-4ee9-a428-e302c86bf718","Type":"ContainerStarted","Data":"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f"} Feb 26 16:04:26 crc kubenswrapper[5033]: I0226 16:04:26.555612 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 16:04:26 crc kubenswrapper[5033]: I0226 16:04:26.555659 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 16:04:26 crc kubenswrapper[5033]: I0226 16:04:26.556147 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="ceilometer-central-agent" containerID="cri-o://ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8" gracePeriod=30 Feb 26 16:04:26 crc kubenswrapper[5033]: I0226 16:04:26.556228 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="sg-core" containerID="cri-o://a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216" gracePeriod=30 Feb 26 16:04:26 crc kubenswrapper[5033]: I0226 16:04:26.556283 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="ceilometer-notification-agent" 
containerID="cri-o://e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d" gracePeriod=30 Feb 26 16:04:26 crc kubenswrapper[5033]: I0226 16:04:26.556271 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="proxy-httpd" containerID="cri-o://55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f" gracePeriod=30 Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.302970 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.303576 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.347117 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.909120535 podStartE2EDuration="14.347093314s" podCreationTimestamp="2026-02-26 16:04:13 +0000 UTC" firstStartedPulling="2026-02-26 16:04:14.871106812 +0000 UTC m=+1382.083705462" lastFinishedPulling="2026-02-26 16:04:25.309079591 +0000 UTC m=+1392.521678241" observedRunningTime="2026-02-26 16:04:26.596362845 +0000 UTC m=+1393.808961495" watchObservedRunningTime="2026-02-26 16:04:27.347093314 +0000 UTC m=+1394.559691964" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.433481 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.543735 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-log-httpd\") pod \"73414a21-a5e9-4ee9-a428-e302c86bf718\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.544033 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448n8\" (UniqueName: \"kubernetes.io/projected/73414a21-a5e9-4ee9-a428-e302c86bf718-kube-api-access-448n8\") pod \"73414a21-a5e9-4ee9-a428-e302c86bf718\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.544142 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-scripts\") pod \"73414a21-a5e9-4ee9-a428-e302c86bf718\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.544172 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-sg-core-conf-yaml\") pod \"73414a21-a5e9-4ee9-a428-e302c86bf718\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.544215 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-run-httpd\") pod \"73414a21-a5e9-4ee9-a428-e302c86bf718\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.544383 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-config-data\") pod \"73414a21-a5e9-4ee9-a428-e302c86bf718\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.544417 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-combined-ca-bundle\") pod \"73414a21-a5e9-4ee9-a428-e302c86bf718\" (UID: \"73414a21-a5e9-4ee9-a428-e302c86bf718\") " Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.545977 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "73414a21-a5e9-4ee9-a428-e302c86bf718" (UID: "73414a21-a5e9-4ee9-a428-e302c86bf718"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.546068 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "73414a21-a5e9-4ee9-a428-e302c86bf718" (UID: "73414a21-a5e9-4ee9-a428-e302c86bf718"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.557410 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-scripts" (OuterVolumeSpecName: "scripts") pod "73414a21-a5e9-4ee9-a428-e302c86bf718" (UID: "73414a21-a5e9-4ee9-a428-e302c86bf718"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.557560 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73414a21-a5e9-4ee9-a428-e302c86bf718-kube-api-access-448n8" (OuterVolumeSpecName: "kube-api-access-448n8") pod "73414a21-a5e9-4ee9-a428-e302c86bf718" (UID: "73414a21-a5e9-4ee9-a428-e302c86bf718"). InnerVolumeSpecName "kube-api-access-448n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.591896 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "73414a21-a5e9-4ee9-a428-e302c86bf718" (UID: "73414a21-a5e9-4ee9-a428-e302c86bf718"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.598824 5033 generic.go:334] "Generic (PLEG): container finished" podID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerID="55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f" exitCode=0 Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.598895 5033 generic.go:334] "Generic (PLEG): container finished" podID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerID="a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216" exitCode=2 Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.598909 5033 generic.go:334] "Generic (PLEG): container finished" podID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerID="e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d" exitCode=0 Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.598919 5033 generic.go:334] "Generic (PLEG): container finished" podID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerID="ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8" exitCode=0 Feb 26 
16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.599644 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.600315 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73414a21-a5e9-4ee9-a428-e302c86bf718","Type":"ContainerDied","Data":"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f"} Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.600391 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73414a21-a5e9-4ee9-a428-e302c86bf718","Type":"ContainerDied","Data":"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216"} Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.600409 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73414a21-a5e9-4ee9-a428-e302c86bf718","Type":"ContainerDied","Data":"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d"} Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.600419 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73414a21-a5e9-4ee9-a428-e302c86bf718","Type":"ContainerDied","Data":"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8"} Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.600428 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73414a21-a5e9-4ee9-a428-e302c86bf718","Type":"ContainerDied","Data":"8a282aed93c57372e00723d4d17457662f230e2c9266050abee6f77e74856ab1"} Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.600456 5033 scope.go:117] "RemoveContainer" containerID="55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.600599 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 16:04:27 crc 
kubenswrapper[5033]: I0226 16:04:27.600612 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.650057 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.650735 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.650767 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.650779 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73414a21-a5e9-4ee9-a428-e302c86bf718-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.650789 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-448n8\" (UniqueName: \"kubernetes.io/projected/73414a21-a5e9-4ee9-a428-e302c86bf718-kube-api-access-448n8\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.665689 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73414a21-a5e9-4ee9-a428-e302c86bf718" (UID: "73414a21-a5e9-4ee9-a428-e302c86bf718"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.704746 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-config-data" (OuterVolumeSpecName: "config-data") pod "73414a21-a5e9-4ee9-a428-e302c86bf718" (UID: "73414a21-a5e9-4ee9-a428-e302c86bf718"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.716102 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.752339 5033 scope.go:117] "RemoveContainer" containerID="a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.754407 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.754497 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73414a21-a5e9-4ee9-a428-e302c86bf718-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.773919 5033 scope.go:117] "RemoveContainer" containerID="e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.797877 5033 scope.go:117] "RemoveContainer" containerID="ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.831383 5033 scope.go:117] "RemoveContainer" containerID="55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f" Feb 26 16:04:27 crc kubenswrapper[5033]: E0226 16:04:27.831957 5033 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f\": container with ID starting with 55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f not found: ID does not exist" containerID="55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.832000 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f"} err="failed to get container status \"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f\": rpc error: code = NotFound desc = could not find container \"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f\": container with ID starting with 55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.832030 5033 scope.go:117] "RemoveContainer" containerID="a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216" Feb 26 16:04:27 crc kubenswrapper[5033]: E0226 16:04:27.832404 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216\": container with ID starting with a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216 not found: ID does not exist" containerID="a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.832444 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216"} err="failed to get container status \"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216\": rpc error: code = NotFound desc = could 
not find container \"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216\": container with ID starting with a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216 not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.832464 5033 scope.go:117] "RemoveContainer" containerID="e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d" Feb 26 16:04:27 crc kubenswrapper[5033]: E0226 16:04:27.832760 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d\": container with ID starting with e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d not found: ID does not exist" containerID="e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.832791 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d"} err="failed to get container status \"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d\": rpc error: code = NotFound desc = could not find container \"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d\": container with ID starting with e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.832811 5033 scope.go:117] "RemoveContainer" containerID="ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8" Feb 26 16:04:27 crc kubenswrapper[5033]: E0226 16:04:27.833369 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8\": container with ID starting with ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8 not found: 
ID does not exist" containerID="ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.833399 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8"} err="failed to get container status \"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8\": rpc error: code = NotFound desc = could not find container \"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8\": container with ID starting with ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8 not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.833418 5033 scope.go:117] "RemoveContainer" containerID="55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.833774 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f"} err="failed to get container status \"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f\": rpc error: code = NotFound desc = could not find container \"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f\": container with ID starting with 55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.833796 5033 scope.go:117] "RemoveContainer" containerID="a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.834082 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216"} err="failed to get container status \"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216\": rpc error: code = 
NotFound desc = could not find container \"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216\": container with ID starting with a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216 not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.834105 5033 scope.go:117] "RemoveContainer" containerID="e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.834354 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d"} err="failed to get container status \"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d\": rpc error: code = NotFound desc = could not find container \"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d\": container with ID starting with e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.834379 5033 scope.go:117] "RemoveContainer" containerID="ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.834544 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8"} err="failed to get container status \"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8\": rpc error: code = NotFound desc = could not find container \"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8\": container with ID starting with ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8 not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.834564 5033 scope.go:117] "RemoveContainer" containerID="55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f" Feb 26 16:04:27 crc 
kubenswrapper[5033]: I0226 16:04:27.834756 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f"} err="failed to get container status \"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f\": rpc error: code = NotFound desc = could not find container \"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f\": container with ID starting with 55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.834778 5033 scope.go:117] "RemoveContainer" containerID="a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.835026 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216"} err="failed to get container status \"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216\": rpc error: code = NotFound desc = could not find container \"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216\": container with ID starting with a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216 not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.835048 5033 scope.go:117] "RemoveContainer" containerID="e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.836387 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d"} err="failed to get container status \"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d\": rpc error: code = NotFound desc = could not find container \"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d\": container 
with ID starting with e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.836422 5033 scope.go:117] "RemoveContainer" containerID="ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.836662 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8"} err="failed to get container status \"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8\": rpc error: code = NotFound desc = could not find container \"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8\": container with ID starting with ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8 not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.836686 5033 scope.go:117] "RemoveContainer" containerID="55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.836878 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f"} err="failed to get container status \"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f\": rpc error: code = NotFound desc = could not find container \"55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f\": container with ID starting with 55642d784d0f3c220037d0011255978ec578aa44c7654b8b299bf20b7817807f not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.836900 5033 scope.go:117] "RemoveContainer" containerID="a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.837339 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216"} err="failed to get container status \"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216\": rpc error: code = NotFound desc = could not find container \"a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216\": container with ID starting with a0ac7fe80f7a01d58dfc41feb7eed96af689547333331d399600473179ce2216 not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.837407 5033 scope.go:117] "RemoveContainer" containerID="e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.837753 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d"} err="failed to get container status \"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d\": rpc error: code = NotFound desc = could not find container \"e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d\": container with ID starting with e20babdf256b2dadea261601d29291993d3bbca9c49e06e9aa782fd8e63f880d not found: ID does not exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.837780 5033 scope.go:117] "RemoveContainer" containerID="ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.838028 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8"} err="failed to get container status \"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8\": rpc error: code = NotFound desc = could not find container \"ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8\": container with ID starting with ec3e822560c8af56d425b27945c8e06900c58f74fdc3cd0daa64efeca6e067a8 not found: ID does not 
exist" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.943236 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.956872 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.973634 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:27 crc kubenswrapper[5033]: E0226 16:04:27.974085 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="ceilometer-central-agent" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.974097 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="ceilometer-central-agent" Feb 26 16:04:27 crc kubenswrapper[5033]: E0226 16:04:27.974114 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="proxy-httpd" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.974120 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="proxy-httpd" Feb 26 16:04:27 crc kubenswrapper[5033]: E0226 16:04:27.974133 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="ceilometer-notification-agent" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.974140 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="ceilometer-notification-agent" Feb 26 16:04:27 crc kubenswrapper[5033]: E0226 16:04:27.974160 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="sg-core" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.974169 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="sg-core" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.974450 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="proxy-httpd" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.974470 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="ceilometer-central-agent" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.974513 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="ceilometer-notification-agent" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.974529 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" containerName="sg-core" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.976628 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.979617 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:04:27 crc kubenswrapper[5033]: I0226 16:04:27.979899 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.012621 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.056975 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.059220 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-scripts\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.059324 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.059353 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-log-httpd\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.059463 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-run-httpd\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.059512 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.059535 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-config-data\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.059633 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4tsg\" (UniqueName: \"kubernetes.io/projected/f3d45cd1-5637-4000-b869-79b268fe47d0-kube-api-access-k4tsg\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.161353 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4tsg\" (UniqueName: \"kubernetes.io/projected/f3d45cd1-5637-4000-b869-79b268fe47d0-kube-api-access-k4tsg\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.161399 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-scripts\") pod \"ceilometer-0\" (UID: 
\"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.161446 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.161464 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-log-httpd\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.161551 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-run-httpd\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.161590 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.161607 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-config-data\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.165371 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-log-httpd\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.165442 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-run-httpd\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.169097 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.169524 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-config-data\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.172846 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-scripts\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.173727 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.189149 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k4tsg\" (UniqueName: \"kubernetes.io/projected/f3d45cd1-5637-4000-b869-79b268fe47d0-kube-api-access-k4tsg\") pod \"ceilometer-0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.307366 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.637645 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.700445 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69ff95858d-bj8jj"] Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.700667 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69ff95858d-bj8jj" podUID="e4360f1d-93a6-47c3-84ba-96c8d202f420" containerName="neutron-api" containerID="cri-o://a1d35abe934158b73fb8b32736a06c60904c589aa9b0dcc2008ad42cec09970b" gracePeriod=30 Feb 26 16:04:28 crc kubenswrapper[5033]: I0226 16:04:28.701098 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69ff95858d-bj8jj" podUID="e4360f1d-93a6-47c3-84ba-96c8d202f420" containerName="neutron-httpd" containerID="cri-o://b9845a8a0279ae511f3247bf2d4e4c405ffc83d62da247751f9d4e639942c8d8" gracePeriod=30 Feb 26 16:04:29 crc kubenswrapper[5033]: I0226 16:04:29.623047 5033 generic.go:334] "Generic (PLEG): container finished" podID="e4360f1d-93a6-47c3-84ba-96c8d202f420" containerID="b9845a8a0279ae511f3247bf2d4e4c405ffc83d62da247751f9d4e639942c8d8" exitCode=0 Feb 26 16:04:29 crc kubenswrapper[5033]: I0226 16:04:29.624313 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69ff95858d-bj8jj" 
event={"ID":"e4360f1d-93a6-47c3-84ba-96c8d202f420","Type":"ContainerDied","Data":"b9845a8a0279ae511f3247bf2d4e4c405ffc83d62da247751f9d4e639942c8d8"} Feb 26 16:04:29 crc kubenswrapper[5033]: I0226 16:04:29.649948 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73414a21-a5e9-4ee9-a428-e302c86bf718" path="/var/lib/kubelet/pods/73414a21-a5e9-4ee9-a428-e302c86bf718/volumes" Feb 26 16:04:30 crc kubenswrapper[5033]: I0226 16:04:30.647449 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:32 crc kubenswrapper[5033]: I0226 16:04:32.664188 5033 generic.go:334] "Generic (PLEG): container finished" podID="e4360f1d-93a6-47c3-84ba-96c8d202f420" containerID="a1d35abe934158b73fb8b32736a06c60904c589aa9b0dcc2008ad42cec09970b" exitCode=0 Feb 26 16:04:32 crc kubenswrapper[5033]: I0226 16:04:32.664289 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69ff95858d-bj8jj" event={"ID":"e4360f1d-93a6-47c3-84ba-96c8d202f420","Type":"ContainerDied","Data":"a1d35abe934158b73fb8b32736a06c60904c589aa9b0dcc2008ad42cec09970b"} Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.248060 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.384176 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-ovndb-tls-certs\") pod \"e4360f1d-93a6-47c3-84ba-96c8d202f420\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.384232 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-combined-ca-bundle\") pod \"e4360f1d-93a6-47c3-84ba-96c8d202f420\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.384302 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-httpd-config\") pod \"e4360f1d-93a6-47c3-84ba-96c8d202f420\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.384367 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v7gz\" (UniqueName: \"kubernetes.io/projected/e4360f1d-93a6-47c3-84ba-96c8d202f420-kube-api-access-2v7gz\") pod \"e4360f1d-93a6-47c3-84ba-96c8d202f420\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.384395 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-config\") pod \"e4360f1d-93a6-47c3-84ba-96c8d202f420\" (UID: \"e4360f1d-93a6-47c3-84ba-96c8d202f420\") " Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.391144 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e4360f1d-93a6-47c3-84ba-96c8d202f420" (UID: "e4360f1d-93a6-47c3-84ba-96c8d202f420"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.391170 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4360f1d-93a6-47c3-84ba-96c8d202f420-kube-api-access-2v7gz" (OuterVolumeSpecName: "kube-api-access-2v7gz") pod "e4360f1d-93a6-47c3-84ba-96c8d202f420" (UID: "e4360f1d-93a6-47c3-84ba-96c8d202f420"). InnerVolumeSpecName "kube-api-access-2v7gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.426898 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.465394 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-config" (OuterVolumeSpecName: "config") pod "e4360f1d-93a6-47c3-84ba-96c8d202f420" (UID: "e4360f1d-93a6-47c3-84ba-96c8d202f420"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.483734 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4360f1d-93a6-47c3-84ba-96c8d202f420" (UID: "e4360f1d-93a6-47c3-84ba-96c8d202f420"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.486159 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.486191 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.486202 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v7gz\" (UniqueName: \"kubernetes.io/projected/e4360f1d-93a6-47c3-84ba-96c8d202f420-kube-api-access-2v7gz\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.486214 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.497243 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e4360f1d-93a6-47c3-84ba-96c8d202f420" (UID: "e4360f1d-93a6-47c3-84ba-96c8d202f420"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.587983 5033 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4360f1d-93a6-47c3-84ba-96c8d202f420-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.681973 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69ff95858d-bj8jj" event={"ID":"e4360f1d-93a6-47c3-84ba-96c8d202f420","Type":"ContainerDied","Data":"80f2f5f965ce1fd1ed8d986dc596ed2ef004d9abb11a22323605038c8eda0bf4"} Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.682005 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69ff95858d-bj8jj" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.682028 5033 scope.go:117] "RemoveContainer" containerID="b9845a8a0279ae511f3247bf2d4e4c405ffc83d62da247751f9d4e639942c8d8" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.683944 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3d45cd1-5637-4000-b869-79b268fe47d0","Type":"ContainerStarted","Data":"5f31d64af04e4cc81eb2450157c92547974558088ae15720e619eeb20fcbcbb9"} Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.686961 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lkrpq" event={"ID":"329480df-036d-4b95-ad02-08a923116026","Type":"ContainerStarted","Data":"1a1d164d9741ffa6240e670361f34560b7319eec8e30291e5bd6ef38daf1fe8d"} Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.714181 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-lkrpq" podStartSLOduration=2.895662091 podStartE2EDuration="13.714154728s" podCreationTimestamp="2026-02-26 16:04:21 +0000 UTC" firstStartedPulling="2026-02-26 16:04:23.140628994 +0000 UTC 
m=+1390.353227644" lastFinishedPulling="2026-02-26 16:04:33.959121631 +0000 UTC m=+1401.171720281" observedRunningTime="2026-02-26 16:04:34.704133207 +0000 UTC m=+1401.916731877" watchObservedRunningTime="2026-02-26 16:04:34.714154728 +0000 UTC m=+1401.926753398" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.714801 5033 scope.go:117] "RemoveContainer" containerID="a1d35abe934158b73fb8b32736a06c60904c589aa9b0dcc2008ad42cec09970b" Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.737910 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69ff95858d-bj8jj"] Feb 26 16:04:34 crc kubenswrapper[5033]: I0226 16:04:34.746979 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-69ff95858d-bj8jj"] Feb 26 16:04:34 crc kubenswrapper[5033]: E0226 16:04:34.872623 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4360f1d_93a6_47c3_84ba_96c8d202f420.slice/crio-80f2f5f965ce1fd1ed8d986dc596ed2ef004d9abb11a22323605038c8eda0bf4\": RecentStats: unable to find data in memory cache]" Feb 26 16:04:35 crc kubenswrapper[5033]: I0226 16:04:35.643310 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4360f1d-93a6-47c3-84ba-96c8d202f420" path="/var/lib/kubelet/pods/e4360f1d-93a6-47c3-84ba-96c8d202f420/volumes" Feb 26 16:04:35 crc kubenswrapper[5033]: I0226 16:04:35.684037 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-n4jjz" podUID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerName="registry-server" probeResult="failure" output=< Feb 26 16:04:35 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Feb 26 16:04:35 crc kubenswrapper[5033]: > Feb 26 16:04:35 crc kubenswrapper[5033]: I0226 16:04:35.700605 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f3d45cd1-5637-4000-b869-79b268fe47d0","Type":"ContainerStarted","Data":"58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4"} Feb 26 16:04:36 crc kubenswrapper[5033]: I0226 16:04:36.711703 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3d45cd1-5637-4000-b869-79b268fe47d0","Type":"ContainerStarted","Data":"fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961"} Feb 26 16:04:37 crc kubenswrapper[5033]: I0226 16:04:37.729412 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3d45cd1-5637-4000-b869-79b268fe47d0","Type":"ContainerStarted","Data":"381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e"} Feb 26 16:04:38 crc kubenswrapper[5033]: I0226 16:04:38.739467 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3d45cd1-5637-4000-b869-79b268fe47d0","Type":"ContainerStarted","Data":"beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3"} Feb 26 16:04:38 crc kubenswrapper[5033]: I0226 16:04:38.739842 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 16:04:38 crc kubenswrapper[5033]: I0226 16:04:38.739652 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="proxy-httpd" containerID="cri-o://beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3" gracePeriod=30 Feb 26 16:04:38 crc kubenswrapper[5033]: I0226 16:04:38.739649 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="ceilometer-central-agent" containerID="cri-o://58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4" gracePeriod=30 Feb 26 16:04:38 crc kubenswrapper[5033]: I0226 16:04:38.739685 5033 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="sg-core" containerID="cri-o://381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e" gracePeriod=30 Feb 26 16:04:38 crc kubenswrapper[5033]: I0226 16:04:38.739702 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="ceilometer-notification-agent" containerID="cri-o://fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961" gracePeriod=30 Feb 26 16:04:38 crc kubenswrapper[5033]: I0226 16:04:38.765109 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.885973786 podStartE2EDuration="11.765091276s" podCreationTimestamp="2026-02-26 16:04:27 +0000 UTC" firstStartedPulling="2026-02-26 16:04:34.419588275 +0000 UTC m=+1401.632186945" lastFinishedPulling="2026-02-26 16:04:38.298705785 +0000 UTC m=+1405.511304435" observedRunningTime="2026-02-26 16:04:38.761191699 +0000 UTC m=+1405.973790349" watchObservedRunningTime="2026-02-26 16:04:38.765091276 +0000 UTC m=+1405.977689926" Feb 26 16:04:39 crc kubenswrapper[5033]: I0226 16:04:39.756707 5033 generic.go:334] "Generic (PLEG): container finished" podID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerID="beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3" exitCode=0 Feb 26 16:04:39 crc kubenswrapper[5033]: I0226 16:04:39.756765 5033 generic.go:334] "Generic (PLEG): container finished" podID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerID="381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e" exitCode=2 Feb 26 16:04:39 crc kubenswrapper[5033]: I0226 16:04:39.756784 5033 generic.go:334] "Generic (PLEG): container finished" podID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerID="fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961" 
exitCode=0 Feb 26 16:04:39 crc kubenswrapper[5033]: I0226 16:04:39.756822 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3d45cd1-5637-4000-b869-79b268fe47d0","Type":"ContainerDied","Data":"beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3"} Feb 26 16:04:39 crc kubenswrapper[5033]: I0226 16:04:39.756869 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3d45cd1-5637-4000-b869-79b268fe47d0","Type":"ContainerDied","Data":"381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e"} Feb 26 16:04:39 crc kubenswrapper[5033]: I0226 16:04:39.756895 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3d45cd1-5637-4000-b869-79b268fe47d0","Type":"ContainerDied","Data":"fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961"} Feb 26 16:04:44 crc kubenswrapper[5033]: I0226 16:04:44.688045 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:44 crc kubenswrapper[5033]: I0226 16:04:44.749525 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:45 crc kubenswrapper[5033]: I0226 16:04:45.302134 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n4jjz"] Feb 26 16:04:45 crc kubenswrapper[5033]: I0226 16:04:45.842111 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n4jjz" podUID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerName="registry-server" containerID="cri-o://56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686" gracePeriod=2 Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.334895 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.404687 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-catalog-content\") pod \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.404747 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-utilities\") pod \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.404812 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbxsn\" (UniqueName: \"kubernetes.io/projected/7d74d149-c9dd-4307-9d7b-63f9e56570f6-kube-api-access-rbxsn\") pod \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\" (UID: \"7d74d149-c9dd-4307-9d7b-63f9e56570f6\") " Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.405708 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-utilities" (OuterVolumeSpecName: "utilities") pod "7d74d149-c9dd-4307-9d7b-63f9e56570f6" (UID: "7d74d149-c9dd-4307-9d7b-63f9e56570f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.410633 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d74d149-c9dd-4307-9d7b-63f9e56570f6-kube-api-access-rbxsn" (OuterVolumeSpecName: "kube-api-access-rbxsn") pod "7d74d149-c9dd-4307-9d7b-63f9e56570f6" (UID: "7d74d149-c9dd-4307-9d7b-63f9e56570f6"). InnerVolumeSpecName "kube-api-access-rbxsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.434568 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d74d149-c9dd-4307-9d7b-63f9e56570f6" (UID: "7d74d149-c9dd-4307-9d7b-63f9e56570f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.506757 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbxsn\" (UniqueName: \"kubernetes.io/projected/7d74d149-c9dd-4307-9d7b-63f9e56570f6-kube-api-access-rbxsn\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.507006 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.507124 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d74d149-c9dd-4307-9d7b-63f9e56570f6-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.855470 5033 generic.go:334] "Generic (PLEG): container finished" podID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerID="56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686" exitCode=0 Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.855549 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n4jjz" event={"ID":"7d74d149-c9dd-4307-9d7b-63f9e56570f6","Type":"ContainerDied","Data":"56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686"} Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.855597 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-n4jjz" event={"ID":"7d74d149-c9dd-4307-9d7b-63f9e56570f6","Type":"ContainerDied","Data":"eb4a21c61bffcc98f1a81feacc6505f5508d2b43e9cc07776f87b4a4a32fbeca"} Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.855624 5033 scope.go:117] "RemoveContainer" containerID="56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.855706 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n4jjz" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.895740 5033 scope.go:117] "RemoveContainer" containerID="1208aa6ad11100ef302384ab5acf122f6dc947fa705607cc60f9c62d2af45268" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.897139 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n4jjz"] Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.917748 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n4jjz"] Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.928158 5033 scope.go:117] "RemoveContainer" containerID="6e63ba0cb95fec956472de50adf74b720ee2c5395b25b7773ffd73e1338361a9" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.966768 5033 scope.go:117] "RemoveContainer" containerID="56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686" Feb 26 16:04:46 crc kubenswrapper[5033]: E0226 16:04:46.968561 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686\": container with ID starting with 56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686 not found: ID does not exist" containerID="56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.968643 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686"} err="failed to get container status \"56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686\": rpc error: code = NotFound desc = could not find container \"56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686\": container with ID starting with 56dd16c29eb45b4ee5e209c977f49b7c943a55689f2ca0c622e9091ece00a686 not found: ID does not exist" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.968697 5033 scope.go:117] "RemoveContainer" containerID="1208aa6ad11100ef302384ab5acf122f6dc947fa705607cc60f9c62d2af45268" Feb 26 16:04:46 crc kubenswrapper[5033]: E0226 16:04:46.969075 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1208aa6ad11100ef302384ab5acf122f6dc947fa705607cc60f9c62d2af45268\": container with ID starting with 1208aa6ad11100ef302384ab5acf122f6dc947fa705607cc60f9c62d2af45268 not found: ID does not exist" containerID="1208aa6ad11100ef302384ab5acf122f6dc947fa705607cc60f9c62d2af45268" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.969105 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1208aa6ad11100ef302384ab5acf122f6dc947fa705607cc60f9c62d2af45268"} err="failed to get container status \"1208aa6ad11100ef302384ab5acf122f6dc947fa705607cc60f9c62d2af45268\": rpc error: code = NotFound desc = could not find container \"1208aa6ad11100ef302384ab5acf122f6dc947fa705607cc60f9c62d2af45268\": container with ID starting with 1208aa6ad11100ef302384ab5acf122f6dc947fa705607cc60f9c62d2af45268 not found: ID does not exist" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.969122 5033 scope.go:117] "RemoveContainer" containerID="6e63ba0cb95fec956472de50adf74b720ee2c5395b25b7773ffd73e1338361a9" Feb 26 16:04:46 crc kubenswrapper[5033]: E0226 
16:04:46.969502 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e63ba0cb95fec956472de50adf74b720ee2c5395b25b7773ffd73e1338361a9\": container with ID starting with 6e63ba0cb95fec956472de50adf74b720ee2c5395b25b7773ffd73e1338361a9 not found: ID does not exist" containerID="6e63ba0cb95fec956472de50adf74b720ee2c5395b25b7773ffd73e1338361a9" Feb 26 16:04:46 crc kubenswrapper[5033]: I0226 16:04:46.969543 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e63ba0cb95fec956472de50adf74b720ee2c5395b25b7773ffd73e1338361a9"} err="failed to get container status \"6e63ba0cb95fec956472de50adf74b720ee2c5395b25b7773ffd73e1338361a9\": rpc error: code = NotFound desc = could not find container \"6e63ba0cb95fec956472de50adf74b720ee2c5395b25b7773ffd73e1338361a9\": container with ID starting with 6e63ba0cb95fec956472de50adf74b720ee2c5395b25b7773ffd73e1338361a9 not found: ID does not exist" Feb 26 16:04:47 crc kubenswrapper[5033]: I0226 16:04:47.639377 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" path="/var/lib/kubelet/pods/7d74d149-c9dd-4307-9d7b-63f9e56570f6/volumes" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.364874 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.537446 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-log-httpd\") pod \"f3d45cd1-5637-4000-b869-79b268fe47d0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.537515 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-scripts\") pod \"f3d45cd1-5637-4000-b869-79b268fe47d0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.537577 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4tsg\" (UniqueName: \"kubernetes.io/projected/f3d45cd1-5637-4000-b869-79b268fe47d0-kube-api-access-k4tsg\") pod \"f3d45cd1-5637-4000-b869-79b268fe47d0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.537680 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-config-data\") pod \"f3d45cd1-5637-4000-b869-79b268fe47d0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.537708 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-sg-core-conf-yaml\") pod \"f3d45cd1-5637-4000-b869-79b268fe47d0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.537768 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-combined-ca-bundle\") pod \"f3d45cd1-5637-4000-b869-79b268fe47d0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.537800 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-run-httpd\") pod \"f3d45cd1-5637-4000-b869-79b268fe47d0\" (UID: \"f3d45cd1-5637-4000-b869-79b268fe47d0\") " Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.538401 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f3d45cd1-5637-4000-b869-79b268fe47d0" (UID: "f3d45cd1-5637-4000-b869-79b268fe47d0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.538655 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f3d45cd1-5637-4000-b869-79b268fe47d0" (UID: "f3d45cd1-5637-4000-b869-79b268fe47d0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.544950 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-scripts" (OuterVolumeSpecName: "scripts") pod "f3d45cd1-5637-4000-b869-79b268fe47d0" (UID: "f3d45cd1-5637-4000-b869-79b268fe47d0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.545050 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d45cd1-5637-4000-b869-79b268fe47d0-kube-api-access-k4tsg" (OuterVolumeSpecName: "kube-api-access-k4tsg") pod "f3d45cd1-5637-4000-b869-79b268fe47d0" (UID: "f3d45cd1-5637-4000-b869-79b268fe47d0"). InnerVolumeSpecName "kube-api-access-k4tsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.562660 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f3d45cd1-5637-4000-b869-79b268fe47d0" (UID: "f3d45cd1-5637-4000-b869-79b268fe47d0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.621751 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3d45cd1-5637-4000-b869-79b268fe47d0" (UID: "f3d45cd1-5637-4000-b869-79b268fe47d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.637156 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-config-data" (OuterVolumeSpecName: "config-data") pod "f3d45cd1-5637-4000-b869-79b268fe47d0" (UID: "f3d45cd1-5637-4000-b869-79b268fe47d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.640400 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.640522 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.640601 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.640701 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.640826 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3d45cd1-5637-4000-b869-79b268fe47d0-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.640896 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d45cd1-5637-4000-b869-79b268fe47d0-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.640963 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4tsg\" (UniqueName: \"kubernetes.io/projected/f3d45cd1-5637-4000-b869-79b268fe47d0-kube-api-access-k4tsg\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.879772 5033 generic.go:334] "Generic 
(PLEG): container finished" podID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerID="58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4" exitCode=0 Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.879834 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3d45cd1-5637-4000-b869-79b268fe47d0","Type":"ContainerDied","Data":"58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4"} Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.879867 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3d45cd1-5637-4000-b869-79b268fe47d0","Type":"ContainerDied","Data":"5f31d64af04e4cc81eb2450157c92547974558088ae15720e619eeb20fcbcbb9"} Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.879865 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.879955 5033 scope.go:117] "RemoveContainer" containerID="beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.909244 5033 scope.go:117] "RemoveContainer" containerID="381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.921691 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.938322 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.956976 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:48 crc kubenswrapper[5033]: E0226 16:04:48.957374 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="ceilometer-notification-agent" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 
16:04:48.957392 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="ceilometer-notification-agent" Feb 26 16:04:48 crc kubenswrapper[5033]: E0226 16:04:48.957408 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4360f1d-93a6-47c3-84ba-96c8d202f420" containerName="neutron-httpd" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957414 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4360f1d-93a6-47c3-84ba-96c8d202f420" containerName="neutron-httpd" Feb 26 16:04:48 crc kubenswrapper[5033]: E0226 16:04:48.957427 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="sg-core" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957433 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="sg-core" Feb 26 16:04:48 crc kubenswrapper[5033]: E0226 16:04:48.957448 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="proxy-httpd" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957453 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="proxy-httpd" Feb 26 16:04:48 crc kubenswrapper[5033]: E0226 16:04:48.957463 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerName="extract-content" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957469 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerName="extract-content" Feb 26 16:04:48 crc kubenswrapper[5033]: E0226 16:04:48.957484 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4360f1d-93a6-47c3-84ba-96c8d202f420" containerName="neutron-api" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957490 5033 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e4360f1d-93a6-47c3-84ba-96c8d202f420" containerName="neutron-api" Feb 26 16:04:48 crc kubenswrapper[5033]: E0226 16:04:48.957500 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerName="registry-server" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957506 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerName="registry-server" Feb 26 16:04:48 crc kubenswrapper[5033]: E0226 16:04:48.957512 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="ceilometer-central-agent" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957518 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="ceilometer-central-agent" Feb 26 16:04:48 crc kubenswrapper[5033]: E0226 16:04:48.957534 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerName="extract-utilities" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957540 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerName="extract-utilities" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957714 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d74d149-c9dd-4307-9d7b-63f9e56570f6" containerName="registry-server" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957729 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="ceilometer-notification-agent" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957743 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="ceilometer-central-agent" Feb 26 16:04:48 crc 
kubenswrapper[5033]: I0226 16:04:48.957751 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4360f1d-93a6-47c3-84ba-96c8d202f420" containerName="neutron-httpd" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957760 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="sg-core" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957767 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4360f1d-93a6-47c3-84ba-96c8d202f420" containerName="neutron-api" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.957776 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" containerName="proxy-httpd" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.959291 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.962207 5033 scope.go:117] "RemoveContainer" containerID="fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.963270 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.964072 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:04:48 crc kubenswrapper[5033]: I0226 16:04:48.968369 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.003026 5033 scope.go:117] "RemoveContainer" containerID="58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.024983 5033 scope.go:117] "RemoveContainer" containerID="beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3" Feb 26 16:04:49 crc kubenswrapper[5033]: 
E0226 16:04:49.025542 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3\": container with ID starting with beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3 not found: ID does not exist" containerID="beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.025570 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3"} err="failed to get container status \"beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3\": rpc error: code = NotFound desc = could not find container \"beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3\": container with ID starting with beacf3f139e799d2afbc6755f1be94dd660d37196de9093281329570429611b3 not found: ID does not exist" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.025590 5033 scope.go:117] "RemoveContainer" containerID="381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e" Feb 26 16:04:49 crc kubenswrapper[5033]: E0226 16:04:49.025961 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e\": container with ID starting with 381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e not found: ID does not exist" containerID="381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.025996 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e"} err="failed to get container status \"381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e\": 
rpc error: code = NotFound desc = could not find container \"381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e\": container with ID starting with 381ed579c6700d0763e705002280d251ed270a399c77be69cc579c1a597a8d0e not found: ID does not exist" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.026009 5033 scope.go:117] "RemoveContainer" containerID="fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961" Feb 26 16:04:49 crc kubenswrapper[5033]: E0226 16:04:49.026388 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961\": container with ID starting with fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961 not found: ID does not exist" containerID="fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.026423 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961"} err="failed to get container status \"fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961\": rpc error: code = NotFound desc = could not find container \"fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961\": container with ID starting with fbf7dc9c702f99a2aabcfd3bab54851d825c3e06c2474381ed0122888a45d961 not found: ID does not exist" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.026442 5033 scope.go:117] "RemoveContainer" containerID="58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4" Feb 26 16:04:49 crc kubenswrapper[5033]: E0226 16:04:49.026738 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4\": container with ID starting with 
58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4 not found: ID does not exist" containerID="58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.026763 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4"} err="failed to get container status \"58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4\": rpc error: code = NotFound desc = could not find container \"58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4\": container with ID starting with 58020c03114424e85637233e7715d54ce526ad317a106fb78d8b0dcad285e4c4 not found: ID does not exist" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.049188 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-config-data\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.049225 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkvf4\" (UniqueName: \"kubernetes.io/projected/80cc4e45-9683-48f8-8790-6b54b42d8b7f-kube-api-access-bkvf4\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.049304 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.049379 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-log-httpd\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.049508 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-run-httpd\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.049582 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-scripts\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.049665 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.150967 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.151361 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-log-httpd\") pod 
\"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.151396 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-run-httpd\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.151420 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-scripts\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.151447 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.151469 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-config-data\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.151489 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkvf4\" (UniqueName: \"kubernetes.io/projected/80cc4e45-9683-48f8-8790-6b54b42d8b7f-kube-api-access-bkvf4\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.152278 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-log-httpd\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.152345 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-run-httpd\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.154754 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.155884 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-scripts\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.155925 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.157624 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-config-data\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.169669 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkvf4\" (UniqueName: \"kubernetes.io/projected/80cc4e45-9683-48f8-8790-6b54b42d8b7f-kube-api-access-bkvf4\") pod \"ceilometer-0\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.286837 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.638949 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3d45cd1-5637-4000-b869-79b268fe47d0" path="/var/lib/kubelet/pods/f3d45cd1-5637-4000-b869-79b268fe47d0/volumes" Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.801137 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:49 crc kubenswrapper[5033]: I0226 16:04:49.890846 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80cc4e45-9683-48f8-8790-6b54b42d8b7f","Type":"ContainerStarted","Data":"155c1a12324b17b5b7a5776f9b3c5ac70902bbe9fa58884acc14fccb6c8f8345"} Feb 26 16:04:50 crc kubenswrapper[5033]: I0226 16:04:50.189531 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:04:50 crc kubenswrapper[5033]: I0226 16:04:50.189585 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:04:50 crc kubenswrapper[5033]: I0226 16:04:50.189624 5033 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 16:04:50 crc kubenswrapper[5033]: I0226 16:04:50.190322 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ce282c1f9d55fd8756822a733df8c8b2d5c1a2cdad655e8d68dfed7a9540071"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:04:50 crc kubenswrapper[5033]: I0226 16:04:50.190373 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://7ce282c1f9d55fd8756822a733df8c8b2d5c1a2cdad655e8d68dfed7a9540071" gracePeriod=600 Feb 26 16:04:50 crc kubenswrapper[5033]: I0226 16:04:50.903645 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="7ce282c1f9d55fd8756822a733df8c8b2d5c1a2cdad655e8d68dfed7a9540071" exitCode=0 Feb 26 16:04:50 crc kubenswrapper[5033]: I0226 16:04:50.904159 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"7ce282c1f9d55fd8756822a733df8c8b2d5c1a2cdad655e8d68dfed7a9540071"} Feb 26 16:04:50 crc kubenswrapper[5033]: I0226 16:04:50.904187 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465"} Feb 26 16:04:50 crc kubenswrapper[5033]: I0226 16:04:50.904204 5033 scope.go:117] "RemoveContainer" 
containerID="1db84da97ee76a847298329e56a91cd397d799819a3948a986ed6808b2508c55" Feb 26 16:04:50 crc kubenswrapper[5033]: I0226 16:04:50.907653 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80cc4e45-9683-48f8-8790-6b54b42d8b7f","Type":"ContainerStarted","Data":"dee9fbd05301ab5f47afe0a7e479dc4bc0f902a25bd2626ead447affce416022"} Feb 26 16:04:53 crc kubenswrapper[5033]: I0226 16:04:53.946286 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80cc4e45-9683-48f8-8790-6b54b42d8b7f","Type":"ContainerStarted","Data":"24305472fa0cdb496d359a39a84c66512285d81602f08916e066bc8324c14c1b"} Feb 26 16:04:54 crc kubenswrapper[5033]: I0226 16:04:54.960143 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80cc4e45-9683-48f8-8790-6b54b42d8b7f","Type":"ContainerStarted","Data":"8cc13573b7d9023710ae997b225e5ff23a3f72bde026d3f93ec9d6b11513d354"} Feb 26 16:04:55 crc kubenswrapper[5033]: I0226 16:04:55.974410 5033 generic.go:334] "Generic (PLEG): container finished" podID="329480df-036d-4b95-ad02-08a923116026" containerID="1a1d164d9741ffa6240e670361f34560b7319eec8e30291e5bd6ef38daf1fe8d" exitCode=0 Feb 26 16:04:55 crc kubenswrapper[5033]: I0226 16:04:55.975612 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lkrpq" event={"ID":"329480df-036d-4b95-ad02-08a923116026","Type":"ContainerDied","Data":"1a1d164d9741ffa6240e670361f34560b7319eec8e30291e5bd6ef38daf1fe8d"} Feb 26 16:04:56 crc kubenswrapper[5033]: I0226 16:04:56.988772 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80cc4e45-9683-48f8-8790-6b54b42d8b7f","Type":"ContainerStarted","Data":"cbe9726cf4a217b2b29994ea6f8db6c5201d59ca21a2c14424b5e006d60cd949"} Feb 26 16:04:56 crc kubenswrapper[5033]: I0226 16:04:56.989308 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.379812 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.397897 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.318888131 podStartE2EDuration="9.397876312s" podCreationTimestamp="2026-02-26 16:04:48 +0000 UTC" firstStartedPulling="2026-02-26 16:04:49.807165605 +0000 UTC m=+1417.019764255" lastFinishedPulling="2026-02-26 16:04:55.886153786 +0000 UTC m=+1423.098752436" observedRunningTime="2026-02-26 16:04:57.016764128 +0000 UTC m=+1424.229362768" watchObservedRunningTime="2026-02-26 16:04:57.397876312 +0000 UTC m=+1424.610474972" Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.486025 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6txww\" (UniqueName: \"kubernetes.io/projected/329480df-036d-4b95-ad02-08a923116026-kube-api-access-6txww\") pod \"329480df-036d-4b95-ad02-08a923116026\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.486137 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-config-data\") pod \"329480df-036d-4b95-ad02-08a923116026\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.486288 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-scripts\") pod \"329480df-036d-4b95-ad02-08a923116026\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.486411 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-combined-ca-bundle\") pod \"329480df-036d-4b95-ad02-08a923116026\" (UID: \"329480df-036d-4b95-ad02-08a923116026\") " Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.491837 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-scripts" (OuterVolumeSpecName: "scripts") pod "329480df-036d-4b95-ad02-08a923116026" (UID: "329480df-036d-4b95-ad02-08a923116026"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.493389 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329480df-036d-4b95-ad02-08a923116026-kube-api-access-6txww" (OuterVolumeSpecName: "kube-api-access-6txww") pod "329480df-036d-4b95-ad02-08a923116026" (UID: "329480df-036d-4b95-ad02-08a923116026"). InnerVolumeSpecName "kube-api-access-6txww". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.519493 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "329480df-036d-4b95-ad02-08a923116026" (UID: "329480df-036d-4b95-ad02-08a923116026"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.520179 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-config-data" (OuterVolumeSpecName: "config-data") pod "329480df-036d-4b95-ad02-08a923116026" (UID: "329480df-036d-4b95-ad02-08a923116026"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.588566 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6txww\" (UniqueName: \"kubernetes.io/projected/329480df-036d-4b95-ad02-08a923116026-kube-api-access-6txww\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.588605 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.588617 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:57 crc kubenswrapper[5033]: I0226 16:04:57.588628 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329480df-036d-4b95-ad02-08a923116026-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.001982 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lkrpq" event={"ID":"329480df-036d-4b95-ad02-08a923116026","Type":"ContainerDied","Data":"22b348405f45b8dd89ca284be1c8deb32933f1f41828eb214c33efe14fa85993"} Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.002070 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lkrpq" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.002083 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22b348405f45b8dd89ca284be1c8deb32933f1f41828eb214c33efe14fa85993" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.121083 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 16:04:58 crc kubenswrapper[5033]: E0226 16:04:58.121859 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329480df-036d-4b95-ad02-08a923116026" containerName="nova-cell0-conductor-db-sync" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.121880 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="329480df-036d-4b95-ad02-08a923116026" containerName="nova-cell0-conductor-db-sync" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.122109 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="329480df-036d-4b95-ad02-08a923116026" containerName="nova-cell0-conductor-db-sync" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.123093 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.124854 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-n8cn8" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.126521 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.146118 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.199530 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.199600 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prxml\" (UniqueName: \"kubernetes.io/projected/0dcb38b4-0144-434f-adc1-63a2b63565ee-kube-api-access-prxml\") pod \"nova-cell0-conductor-0\" (UID: \"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.199939 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.301989 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.302140 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.302165 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prxml\" (UniqueName: \"kubernetes.io/projected/0dcb38b4-0144-434f-adc1-63a2b63565ee-kube-api-access-prxml\") pod \"nova-cell0-conductor-0\" (UID: \"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.314606 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.318785 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.318938 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prxml\" (UniqueName: \"kubernetes.io/projected/0dcb38b4-0144-434f-adc1-63a2b63565ee-kube-api-access-prxml\") pod \"nova-cell0-conductor-0\" (UID: 
\"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.446362 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 16:04:58 crc kubenswrapper[5033]: I0226 16:04:58.965722 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 16:04:59 crc kubenswrapper[5033]: I0226 16:04:59.017452 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0dcb38b4-0144-434f-adc1-63a2b63565ee","Type":"ContainerStarted","Data":"7c17ccb8e9454aacd4f5e6d035fe1db72971529c243f838647ea2f392d7ce24c"} Feb 26 16:05:00 crc kubenswrapper[5033]: I0226 16:05:00.038542 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0dcb38b4-0144-434f-adc1-63a2b63565ee","Type":"ContainerStarted","Data":"b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea"} Feb 26 16:05:00 crc kubenswrapper[5033]: I0226 16:05:00.039660 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 26 16:05:00 crc kubenswrapper[5033]: I0226 16:05:00.058577 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.058559367 podStartE2EDuration="2.058559367s" podCreationTimestamp="2026-02-26 16:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:00.05632179 +0000 UTC m=+1427.268920440" watchObservedRunningTime="2026-02-26 16:05:00.058559367 +0000 UTC m=+1427.271158017" Feb 26 16:05:08 crc kubenswrapper[5033]: I0226 16:05:08.504897 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 
16:05:09.056662 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-h5gx5"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.057772 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.061210 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.067952 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.118115 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5gx5"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.118176 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-config-data\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.118251 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-scripts\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.118607 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " 
pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.118766 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n4zn\" (UniqueName: \"kubernetes.io/projected/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-kube-api-access-6n4zn\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.144321 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mwzpm"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.154712 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mwzpm"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.154856 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.220981 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-catalog-content\") pod \"redhat-operators-mwzpm\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") " pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.221775 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n4zn\" (UniqueName: \"kubernetes.io/projected/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-kube-api-access-6n4zn\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.221850 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-config-data\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.221883 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-scripts\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.221971 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggjdt\" (UniqueName: \"kubernetes.io/projected/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-kube-api-access-ggjdt\") pod \"redhat-operators-mwzpm\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") " pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.221999 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.222020 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-utilities\") pod \"redhat-operators-mwzpm\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") " pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.235349 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.241753 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-scripts\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.245138 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-config-data\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.258087 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n4zn\" (UniqueName: \"kubernetes.io/projected/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-kube-api-access-6n4zn\") pod \"nova-cell0-cell-mapping-h5gx5\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.324179 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggjdt\" (UniqueName: \"kubernetes.io/projected/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-kube-api-access-ggjdt\") pod \"redhat-operators-mwzpm\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") " pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.324277 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-utilities\") pod \"redhat-operators-mwzpm\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") " pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.324336 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-catalog-content\") pod \"redhat-operators-mwzpm\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") " pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.324943 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-utilities\") pod \"redhat-operators-mwzpm\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") " pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.325023 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-catalog-content\") pod \"redhat-operators-mwzpm\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") " pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.356975 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggjdt\" (UniqueName: \"kubernetes.io/projected/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-kube-api-access-ggjdt\") pod \"redhat-operators-mwzpm\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") " pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.367143 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.394157 5033 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.408058 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.414064 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.454422 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.466486 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.472766 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.496982 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.505389 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.512624 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.514523 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.522236 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.543126 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdsh5\" (UniqueName: \"kubernetes.io/projected/806654e6-e126-4cf9-a2c4-b0deb91bf75d-kube-api-access-tdsh5\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.544475 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/806654e6-e126-4cf9-a2c4-b0deb91bf75d-logs\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.544622 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-config-data\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.544789 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.553673 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.606380 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-scheduler-0"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.646749 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdsh5\" (UniqueName: \"kubernetes.io/projected/806654e6-e126-4cf9-a2c4-b0deb91bf75d-kube-api-access-tdsh5\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.649624 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/806654e6-e126-4cf9-a2c4-b0deb91bf75d-logs\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.649811 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-config-data\") pod \"nova-scheduler-0\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.649894 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-config-data\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.649981 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvqk7\" (UniqueName: \"kubernetes.io/projected/ad3621e8-de22-43d7-8777-4c4920277c13-kube-api-access-bvqk7\") pod \"nova-metadata-0\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.650244 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-config-data\") pod \"nova-metadata-0\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.650404 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9x74\" (UniqueName: \"kubernetes.io/projected/e111d211-eff3-4ec2-b648-16ec542c2900-kube-api-access-n9x74\") pod \"nova-scheduler-0\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.651237 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.651404 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad3621e8-de22-43d7-8777-4c4920277c13-logs\") pod \"nova-metadata-0\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.651518 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.651647 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.652770 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/806654e6-e126-4cf9-a2c4-b0deb91bf75d-logs\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.657885 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-config-data\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.668824 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.672077 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdsh5\" (UniqueName: \"kubernetes.io/projected/806654e6-e126-4cf9-a2c4-b0deb91bf75d-kube-api-access-tdsh5\") pod \"nova-api-0\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.692463 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.693831 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.701984 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.709902 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nznck"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.713061 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.732217 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.758296 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9x74\" (UniqueName: \"kubernetes.io/projected/e111d211-eff3-4ec2-b648-16ec542c2900-kube-api-access-n9x74\") pod \"nova-scheduler-0\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.758356 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nznck"] Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.758741 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.758774 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad3621e8-de22-43d7-8777-4c4920277c13-logs\") pod \"nova-metadata-0\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" 
Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.758805 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.758887 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-config-data\") pod \"nova-scheduler-0\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.758915 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvqk7\" (UniqueName: \"kubernetes.io/projected/ad3621e8-de22-43d7-8777-4c4920277c13-kube-api-access-bvqk7\") pod \"nova-metadata-0\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.758934 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-config-data\") pod \"nova-metadata-0\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.759341 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad3621e8-de22-43d7-8777-4c4920277c13-logs\") pod \"nova-metadata-0\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.836045 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.860226 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxm4g\" (UniqueName: \"kubernetes.io/projected/215dcec4-4856-4ae9-962f-dce89506fe6e-kube-api-access-pxm4g\") pod \"nova-cell1-novncproxy-0\" (UID: \"215dcec4-4856-4ae9-962f-dce89506fe6e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.860291 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-svc\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.860373 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.860399 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.860552 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"215dcec4-4856-4ae9-962f-dce89506fe6e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.860605 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-config\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.860642 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cgck\" (UniqueName: \"kubernetes.io/projected/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-kube-api-access-2cgck\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.860669 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.860686 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"215dcec4-4856-4ae9-962f-dce89506fe6e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.927419 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-config-data\") pod \"nova-metadata-0\" (UID: 
\"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.934315 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-config-data\") pod \"nova-scheduler-0\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.934682 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.934732 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9x74\" (UniqueName: \"kubernetes.io/projected/e111d211-eff3-4ec2-b648-16ec542c2900-kube-api-access-n9x74\") pod \"nova-scheduler-0\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.945389 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.952381 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvqk7\" (UniqueName: \"kubernetes.io/projected/ad3621e8-de22-43d7-8777-4c4920277c13-kube-api-access-bvqk7\") pod \"nova-metadata-0\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") " pod="openstack/nova-metadata-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.962746 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pxm4g\" (UniqueName: \"kubernetes.io/projected/215dcec4-4856-4ae9-962f-dce89506fe6e-kube-api-access-pxm4g\") pod \"nova-cell1-novncproxy-0\" (UID: \"215dcec4-4856-4ae9-962f-dce89506fe6e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.962825 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-svc\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.962883 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.962927 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.963000 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"215dcec4-4856-4ae9-962f-dce89506fe6e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.963078 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-config\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.963150 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cgck\" (UniqueName: \"kubernetes.io/projected/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-kube-api-access-2cgck\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.963186 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:09 crc kubenswrapper[5033]: I0226 16:05:09.963230 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"215dcec4-4856-4ae9-962f-dce89506fe6e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.074491 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-svc\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.074663 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-sb\") pod 
\"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.087647 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5gx5"] Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.089412 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.089439 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-config\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.089997 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.101811 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxm4g\" (UniqueName: \"kubernetes.io/projected/215dcec4-4856-4ae9-962f-dce89506fe6e-kube-api-access-pxm4g\") pod \"nova-cell1-novncproxy-0\" (UID: \"215dcec4-4856-4ae9-962f-dce89506fe6e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.101944 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"215dcec4-4856-4ae9-962f-dce89506fe6e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.102019 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"215dcec4-4856-4ae9-962f-dce89506fe6e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.103369 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cgck\" (UniqueName: \"kubernetes.io/projected/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-kube-api-access-2cgck\") pod \"dnsmasq-dns-bccf8f775-nznck\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.149889 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.151880 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5gx5" event={"ID":"c7261968-dd9e-4a9b-bdd9-58ee979d63ff","Type":"ContainerStarted","Data":"8ae922fbc3a01be6f95c7c6ca58195ebe09e078ea6fb887842f0cff3a9a74751"} Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.169883 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.183173 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mwzpm"] Feb 26 16:05:10 crc kubenswrapper[5033]: W0226 16:05:10.236373 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3e4fe96_44fe_4cb0_94b3_f1df3b376c61.slice/crio-bce7120dedfaf7bb44e21b79bdfe144a4b1dfadb032af7d3ae4cbe754e9db0f5 WatchSource:0}: Error finding container bce7120dedfaf7bb44e21b79bdfe144a4b1dfadb032af7d3ae4cbe754e9db0f5: Status 404 returned error can't find the container with id bce7120dedfaf7bb44e21b79bdfe144a4b1dfadb032af7d3ae4cbe754e9db0f5 Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.327055 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.343443 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.738200 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.912942 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.932452 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.943397 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8h5d9"] Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.944625 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.947924 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.948158 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 26 16:05:10 crc kubenswrapper[5033]: I0226 16:05:10.960477 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8h5d9"] Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.091051 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl5jm\" (UniqueName: \"kubernetes.io/projected/739b08e5-9e18-4d87-aded-724b5d2ab937-kube-api-access-sl5jm\") pod \"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.091134 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-scripts\") pod \"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.091231 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-config-data\") pod \"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.091268 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.176951 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e111d211-eff3-4ec2-b648-16ec542c2900","Type":"ContainerStarted","Data":"d82c75ab16f7dfe38a381f8ac7f4df84c6d9c02fa60bfc7a064b12566db8c5d2"} Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.180463 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwzpm" event={"ID":"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61","Type":"ContainerStarted","Data":"db677ade2c2920f8d4e4b57c04f72f18837b7b6e5a2611f3694ff447174f6ddc"} Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.180509 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwzpm" event={"ID":"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61","Type":"ContainerStarted","Data":"bce7120dedfaf7bb44e21b79bdfe144a4b1dfadb032af7d3ae4cbe754e9db0f5"} Feb 26 16:05:11 crc kubenswrapper[5033]: W0226 16:05:11.182167 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod215dcec4_4856_4ae9_962f_dce89506fe6e.slice/crio-d6c23cd2788bd3d2e4ee60c15f0d79b19c0439764191d12b72db183902dc2460 WatchSource:0}: Error finding container d6c23cd2788bd3d2e4ee60c15f0d79b19c0439764191d12b72db183902dc2460: Status 404 returned error can't find the container with id d6c23cd2788bd3d2e4ee60c15f0d79b19c0439764191d12b72db183902dc2460 Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.182923 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.184549 5033 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"806654e6-e126-4cf9-a2c4-b0deb91bf75d","Type":"ContainerStarted","Data":"ca5ec9afbde58cbe6b720a6c8d39d906173f07acc1bf6d872e21fad8501d5f8b"} Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.192744 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-config-data\") pod \"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.192912 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.192960 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl5jm\" (UniqueName: \"kubernetes.io/projected/739b08e5-9e18-4d87-aded-724b5d2ab937-kube-api-access-sl5jm\") pod \"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.193002 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad3621e8-de22-43d7-8777-4c4920277c13","Type":"ContainerStarted","Data":"719ae6415404013b9917fe728dd0acf979f9e47d6a31ac9eeea58b761d655bd8"} Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.193024 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-scripts\") pod 
\"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.198965 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-scripts\") pod \"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.203789 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-config-data\") pod \"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.207493 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.211843 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5gx5" event={"ID":"c7261968-dd9e-4a9b-bdd9-58ee979d63ff","Type":"ContainerStarted","Data":"0faa776ae6b63848c0d05a66870162df03f5366c75da8574c2561da35ab31883"} Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.218832 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl5jm\" (UniqueName: \"kubernetes.io/projected/739b08e5-9e18-4d87-aded-724b5d2ab937-kube-api-access-sl5jm\") pod \"nova-cell1-conductor-db-sync-8h5d9\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") " 
pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.234318 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-h5gx5" podStartSLOduration=2.234301223 podStartE2EDuration="2.234301223s" podCreationTimestamp="2026-02-26 16:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:11.226397385 +0000 UTC m=+1438.438996035" watchObservedRunningTime="2026-02-26 16:05:11.234301223 +0000 UTC m=+1438.446899873" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.288149 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8h5d9" Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.323055 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nznck"] Feb 26 16:05:11 crc kubenswrapper[5033]: I0226 16:05:11.840227 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8h5d9"] Feb 26 16:05:12 crc kubenswrapper[5033]: I0226 16:05:12.238122 5033 generic.go:334] "Generic (PLEG): container finished" podID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" containerID="db677ade2c2920f8d4e4b57c04f72f18837b7b6e5a2611f3694ff447174f6ddc" exitCode=0 Feb 26 16:05:12 crc kubenswrapper[5033]: I0226 16:05:12.238531 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwzpm" event={"ID":"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61","Type":"ContainerDied","Data":"db677ade2c2920f8d4e4b57c04f72f18837b7b6e5a2611f3694ff447174f6ddc"} Feb 26 16:05:12 crc kubenswrapper[5033]: I0226 16:05:12.249684 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"215dcec4-4856-4ae9-962f-dce89506fe6e","Type":"ContainerStarted","Data":"d6c23cd2788bd3d2e4ee60c15f0d79b19c0439764191d12b72db183902dc2460"} Feb 26 16:05:12 crc kubenswrapper[5033]: I0226 16:05:12.259407 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8h5d9" event={"ID":"739b08e5-9e18-4d87-aded-724b5d2ab937","Type":"ContainerStarted","Data":"a3eb7e03f8fefb3acf5770d5fe9e48de5bd74bdc5f586a4270cbdd4bdb99df39"} Feb 26 16:05:12 crc kubenswrapper[5033]: I0226 16:05:12.259456 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8h5d9" event={"ID":"739b08e5-9e18-4d87-aded-724b5d2ab937","Type":"ContainerStarted","Data":"421408f12463c67a202d5069b8adb195e8eae4b9ab71d9a5d36e1f249c5683f8"} Feb 26 16:05:12 crc kubenswrapper[5033]: I0226 16:05:12.269651 5033 generic.go:334] "Generic (PLEG): container finished" podID="1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" containerID="1521926132cf89bcf3a78d3cf72fa7e05c7325a78bb48002c8d3030ca6645460" exitCode=0 Feb 26 16:05:12 crc kubenswrapper[5033]: I0226 16:05:12.269752 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nznck" event={"ID":"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4","Type":"ContainerDied","Data":"1521926132cf89bcf3a78d3cf72fa7e05c7325a78bb48002c8d3030ca6645460"} Feb 26 16:05:12 crc kubenswrapper[5033]: I0226 16:05:12.269810 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nznck" event={"ID":"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4","Type":"ContainerStarted","Data":"549e66a07903658736d752f8cdbd9222a862c59dad83562f770c87c356072bce"} Feb 26 16:05:12 crc kubenswrapper[5033]: I0226 16:05:12.301362 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8h5d9" podStartSLOduration=2.301343568 podStartE2EDuration="2.301343568s" podCreationTimestamp="2026-02-26 16:05:10 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:12.27898163 +0000 UTC m=+1439.491580280" watchObservedRunningTime="2026-02-26 16:05:12.301343568 +0000 UTC m=+1439.513942218" Feb 26 16:05:13 crc kubenswrapper[5033]: I0226 16:05:13.287058 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nznck" event={"ID":"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4","Type":"ContainerStarted","Data":"7ae2a7fc7a368e984faab908fd0ee5c2e4b43b945d8192bc35974c1d60d1b3bc"} Feb 26 16:05:13 crc kubenswrapper[5033]: I0226 16:05:13.288230 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:13 crc kubenswrapper[5033]: I0226 16:05:13.317565 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-nznck" podStartSLOduration=4.317541234 podStartE2EDuration="4.317541234s" podCreationTimestamp="2026-02-26 16:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:13.310619141 +0000 UTC m=+1440.523217791" watchObservedRunningTime="2026-02-26 16:05:13.317541234 +0000 UTC m=+1440.530139884" Feb 26 16:05:14 crc kubenswrapper[5033]: I0226 16:05:14.131499 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:05:14 crc kubenswrapper[5033]: I0226 16:05:14.143986 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.332976 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"806654e6-e126-4cf9-a2c4-b0deb91bf75d","Type":"ContainerStarted","Data":"89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668"} Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.333551 
5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"806654e6-e126-4cf9-a2c4-b0deb91bf75d","Type":"ContainerStarted","Data":"68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb"} Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.334781 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad3621e8-de22-43d7-8777-4c4920277c13","Type":"ContainerStarted","Data":"e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8"} Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.334843 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad3621e8-de22-43d7-8777-4c4920277c13","Type":"ContainerStarted","Data":"f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9"} Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.334877 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ad3621e8-de22-43d7-8777-4c4920277c13" containerName="nova-metadata-log" containerID="cri-o://f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9" gracePeriod=30 Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.334904 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ad3621e8-de22-43d7-8777-4c4920277c13" containerName="nova-metadata-metadata" containerID="cri-o://e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8" gracePeriod=30 Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.339247 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e111d211-eff3-4ec2-b648-16ec542c2900","Type":"ContainerStarted","Data":"1a22ec2ad63a01a43096fbc6bd4a74e45118cd108bd521934915dc64777de96c"} Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.348890 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-mwzpm" event={"ID":"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61","Type":"ContainerStarted","Data":"0b0bbbeff38dea3622af4d23269b5487bb30760e0f9113b56f6a4223d923304c"} Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.350988 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"215dcec4-4856-4ae9-962f-dce89506fe6e","Type":"ContainerStarted","Data":"72486677dd69741aa30e6caf62f343575f1effe8d24ad0d1094a1dd84492a921"} Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.351128 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="215dcec4-4856-4ae9-962f-dce89506fe6e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://72486677dd69741aa30e6caf62f343575f1effe8d24ad0d1094a1dd84492a921" gracePeriod=30 Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.358142 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.811787927 podStartE2EDuration="8.358117323s" podCreationTimestamp="2026-02-26 16:05:09 +0000 UTC" firstStartedPulling="2026-02-26 16:05:10.797412067 +0000 UTC m=+1438.010010717" lastFinishedPulling="2026-02-26 16:05:16.343741463 +0000 UTC m=+1443.556340113" observedRunningTime="2026-02-26 16:05:17.354679247 +0000 UTC m=+1444.567277907" watchObservedRunningTime="2026-02-26 16:05:17.358117323 +0000 UTC m=+1444.570715973" Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.378099 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.931298682 podStartE2EDuration="8.3780031s" podCreationTimestamp="2026-02-26 16:05:09 +0000 UTC" firstStartedPulling="2026-02-26 16:05:10.912615553 +0000 UTC m=+1438.125214213" lastFinishedPulling="2026-02-26 16:05:16.359319981 +0000 UTC m=+1443.571918631" observedRunningTime="2026-02-26 16:05:17.369474177 +0000 
UTC m=+1444.582072837" watchObservedRunningTime="2026-02-26 16:05:17.3780031 +0000 UTC m=+1444.590601750" Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.458232 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.299377168 podStartE2EDuration="8.458212962s" podCreationTimestamp="2026-02-26 16:05:09 +0000 UTC" firstStartedPulling="2026-02-26 16:05:11.189912634 +0000 UTC m=+1438.402511284" lastFinishedPulling="2026-02-26 16:05:16.348748408 +0000 UTC m=+1443.561347078" observedRunningTime="2026-02-26 16:05:17.411755162 +0000 UTC m=+1444.624353822" watchObservedRunningTime="2026-02-26 16:05:17.458212962 +0000 UTC m=+1444.670811612" Feb 26 16:05:17 crc kubenswrapper[5033]: I0226 16:05:17.468244 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.048914037 podStartE2EDuration="8.468222602s" podCreationTimestamp="2026-02-26 16:05:09 +0000 UTC" firstStartedPulling="2026-02-26 16:05:10.929419362 +0000 UTC m=+1438.142018012" lastFinishedPulling="2026-02-26 16:05:16.348727927 +0000 UTC m=+1443.561326577" observedRunningTime="2026-02-26 16:05:17.439770072 +0000 UTC m=+1444.652368732" watchObservedRunningTime="2026-02-26 16:05:17.468222602 +0000 UTC m=+1444.680821252" Feb 26 16:05:18 crc kubenswrapper[5033]: I0226 16:05:18.362550 5033 generic.go:334] "Generic (PLEG): container finished" podID="ad3621e8-de22-43d7-8777-4c4920277c13" containerID="f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9" exitCode=143 Feb 26 16:05:18 crc kubenswrapper[5033]: I0226 16:05:18.362726 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad3621e8-de22-43d7-8777-4c4920277c13","Type":"ContainerDied","Data":"f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9"} Feb 26 16:05:18 crc kubenswrapper[5033]: I0226 16:05:18.366963 5033 generic.go:334] "Generic (PLEG): 
container finished" podID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" containerID="0b0bbbeff38dea3622af4d23269b5487bb30760e0f9113b56f6a4223d923304c" exitCode=0 Feb 26 16:05:18 crc kubenswrapper[5033]: I0226 16:05:18.367068 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwzpm" event={"ID":"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61","Type":"ContainerDied","Data":"0b0bbbeff38dea3622af4d23269b5487bb30760e0f9113b56f6a4223d923304c"} Feb 26 16:05:19 crc kubenswrapper[5033]: I0226 16:05:19.292573 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 26 16:05:19 crc kubenswrapper[5033]: I0226 16:05:19.836290 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 16:05:19 crc kubenswrapper[5033]: I0226 16:05:19.836771 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.150946 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.151005 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.171373 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.171430 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.198057 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.327292 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:20 crc 
kubenswrapper[5033]: I0226 16:05:20.345893 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.396056 5033 generic.go:334] "Generic (PLEG): container finished" podID="c7261968-dd9e-4a9b-bdd9-58ee979d63ff" containerID="0faa776ae6b63848c0d05a66870162df03f5366c75da8574c2561da35ab31883" exitCode=0 Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.396406 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5gx5" event={"ID":"c7261968-dd9e-4a9b-bdd9-58ee979d63ff","Type":"ContainerDied","Data":"0faa776ae6b63848c0d05a66870162df03f5366c75da8574c2561da35ab31883"} Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.433233 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dnr6c"] Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.433552 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" podUID="22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" containerName="dnsmasq-dns" containerID="cri-o://bac4d373cb8ce5d5ed040fd30c9e2f5715f564e4c50b091436c89ba2333feaac" gracePeriod=10 Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.438597 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.918461 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.918570 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" 
containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 16:05:20 crc kubenswrapper[5033]: I0226 16:05:20.976197 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" podUID="22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: connect: connection refused" Feb 26 16:05:21 crc kubenswrapper[5033]: I0226 16:05:21.412030 5033 generic.go:334] "Generic (PLEG): container finished" podID="22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" containerID="bac4d373cb8ce5d5ed040fd30c9e2f5715f564e4c50b091436c89ba2333feaac" exitCode=0 Feb 26 16:05:21 crc kubenswrapper[5033]: I0226 16:05:21.412098 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" event={"ID":"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0","Type":"ContainerDied","Data":"bac4d373cb8ce5d5ed040fd30c9e2f5715f564e4c50b091436c89ba2333feaac"} Feb 26 16:05:21 crc kubenswrapper[5033]: I0226 16:05:21.947015 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:21 crc kubenswrapper[5033]: I0226 16:05:21.957278 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.023855 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-sb\") pod \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.024143 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-config\") pod \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.024222 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-nb\") pod \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.024311 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-scripts\") pod \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.024330 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n4zn\" (UniqueName: \"kubernetes.io/projected/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-kube-api-access-6n4zn\") pod \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.024362 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-svc\") pod \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.024389 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p79l5\" (UniqueName: \"kubernetes.io/projected/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-kube-api-access-p79l5\") pod \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.024406 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-swift-storage-0\") pod \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\" (UID: \"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0\") " Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.024436 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-config-data\") pod \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.024468 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-combined-ca-bundle\") pod \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\" (UID: \"c7261968-dd9e-4a9b-bdd9-58ee979d63ff\") " Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.033672 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-kube-api-access-6n4zn" (OuterVolumeSpecName: "kube-api-access-6n4zn") pod "c7261968-dd9e-4a9b-bdd9-58ee979d63ff" (UID: "c7261968-dd9e-4a9b-bdd9-58ee979d63ff"). 
InnerVolumeSpecName "kube-api-access-6n4zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.035401 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-kube-api-access-p79l5" (OuterVolumeSpecName: "kube-api-access-p79l5") pod "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" (UID: "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0"). InnerVolumeSpecName "kube-api-access-p79l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.042037 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-scripts" (OuterVolumeSpecName: "scripts") pod "c7261968-dd9e-4a9b-bdd9-58ee979d63ff" (UID: "c7261968-dd9e-4a9b-bdd9-58ee979d63ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.077433 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7261968-dd9e-4a9b-bdd9-58ee979d63ff" (UID: "c7261968-dd9e-4a9b-bdd9-58ee979d63ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.105086 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" (UID: "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.112961 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" (UID: "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.116012 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-config" (OuterVolumeSpecName: "config") pod "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" (UID: "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.123958 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" (UID: "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.124729 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-config-data" (OuterVolumeSpecName: "config-data") pod "c7261968-dd9e-4a9b-bdd9-58ee979d63ff" (UID: "c7261968-dd9e-4a9b-bdd9-58ee979d63ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.127540 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.127584 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n4zn\" (UniqueName: \"kubernetes.io/projected/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-kube-api-access-6n4zn\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.127599 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.127613 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p79l5\" (UniqueName: \"kubernetes.io/projected/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-kube-api-access-p79l5\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.127625 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.127635 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.127645 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7261968-dd9e-4a9b-bdd9-58ee979d63ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.127656 5033 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.127666 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.140866 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" (UID: "22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.229139 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.424293 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.424326 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-dnr6c" event={"ID":"22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0","Type":"ContainerDied","Data":"95c3ba818d7cbb9c7d0b98c3e2f660f05d8b6d0e245a7a100ee13d8692b98e84"} Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.424949 5033 scope.go:117] "RemoveContainer" containerID="bac4d373cb8ce5d5ed040fd30c9e2f5715f564e4c50b091436c89ba2333feaac" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.426169 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5gx5" event={"ID":"c7261968-dd9e-4a9b-bdd9-58ee979d63ff","Type":"ContainerDied","Data":"8ae922fbc3a01be6f95c7c6ca58195ebe09e078ea6fb887842f0cff3a9a74751"} Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.426198 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ae922fbc3a01be6f95c7c6ca58195ebe09e078ea6fb887842f0cff3a9a74751" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.426249 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5gx5" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.433739 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwzpm" event={"ID":"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61","Type":"ContainerStarted","Data":"45be7f415ab0fee8f35208d38c1c974ead3ecabc533b9ab507a29fb9844f3d2b"} Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.475163 5033 scope.go:117] "RemoveContainer" containerID="d10243c2ce8cfd3b74cc4ecb2619ab97671f9b148f3e1a2df416a2662cd1934e" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.484175 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mwzpm" podStartSLOduration=4.273157925 podStartE2EDuration="13.484136107s" podCreationTimestamp="2026-02-26 16:05:09 +0000 UTC" firstStartedPulling="2026-02-26 16:05:12.244441677 +0000 UTC m=+1439.457040327" lastFinishedPulling="2026-02-26 16:05:21.455419869 +0000 UTC m=+1448.668018509" observedRunningTime="2026-02-26 16:05:22.469166434 +0000 UTC m=+1449.681765084" watchObservedRunningTime="2026-02-26 16:05:22.484136107 +0000 UTC m=+1449.696734757" Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.505306 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dnr6c"] Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.517584 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-dnr6c"] Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.620311 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.620560 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" containerName="nova-api-log" 
containerID="cri-o://68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb" gracePeriod=30 Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.620620 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" containerName="nova-api-api" containerID="cri-o://89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668" gracePeriod=30 Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.633054 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:05:22 crc kubenswrapper[5033]: I0226 16:05:22.633331 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e111d211-eff3-4ec2-b648-16ec542c2900" containerName="nova-scheduler-scheduler" containerID="cri-o://1a22ec2ad63a01a43096fbc6bd4a74e45118cd108bd521934915dc64777de96c" gracePeriod=30 Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.455081 5033 generic.go:334] "Generic (PLEG): container finished" podID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" containerID="68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb" exitCode=143 Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.455670 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"806654e6-e126-4cf9-a2c4-b0deb91bf75d","Type":"ContainerDied","Data":"68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb"} Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.459606 5033 generic.go:334] "Generic (PLEG): container finished" podID="e111d211-eff3-4ec2-b648-16ec542c2900" containerID="1a22ec2ad63a01a43096fbc6bd4a74e45118cd108bd521934915dc64777de96c" exitCode=0 Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.460861 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"e111d211-eff3-4ec2-b648-16ec542c2900","Type":"ContainerDied","Data":"1a22ec2ad63a01a43096fbc6bd4a74e45118cd108bd521934915dc64777de96c"} Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.609384 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.640723 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" path="/var/lib/kubelet/pods/22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0/volumes" Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.690664 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-combined-ca-bundle\") pod \"e111d211-eff3-4ec2-b648-16ec542c2900\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.690780 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9x74\" (UniqueName: \"kubernetes.io/projected/e111d211-eff3-4ec2-b648-16ec542c2900-kube-api-access-n9x74\") pod \"e111d211-eff3-4ec2-b648-16ec542c2900\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.690906 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-config-data\") pod \"e111d211-eff3-4ec2-b648-16ec542c2900\" (UID: \"e111d211-eff3-4ec2-b648-16ec542c2900\") " Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.698760 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e111d211-eff3-4ec2-b648-16ec542c2900-kube-api-access-n9x74" (OuterVolumeSpecName: "kube-api-access-n9x74") pod "e111d211-eff3-4ec2-b648-16ec542c2900" (UID: 
"e111d211-eff3-4ec2-b648-16ec542c2900"). InnerVolumeSpecName "kube-api-access-n9x74". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.725438 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e111d211-eff3-4ec2-b648-16ec542c2900" (UID: "e111d211-eff3-4ec2-b648-16ec542c2900"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.726237 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-config-data" (OuterVolumeSpecName: "config-data") pod "e111d211-eff3-4ec2-b648-16ec542c2900" (UID: "e111d211-eff3-4ec2-b648-16ec542c2900"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.792177 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.792220 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9x74\" (UniqueName: \"kubernetes.io/projected/e111d211-eff3-4ec2-b648-16ec542c2900-kube-api-access-n9x74\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:23 crc kubenswrapper[5033]: I0226 16:05:23.792235 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e111d211-eff3-4ec2-b648-16ec542c2900-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.470117 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"e111d211-eff3-4ec2-b648-16ec542c2900","Type":"ContainerDied","Data":"d82c75ab16f7dfe38a381f8ac7f4df84c6d9c02fa60bfc7a064b12566db8c5d2"} Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.470163 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.470179 5033 scope.go:117] "RemoveContainer" containerID="1a22ec2ad63a01a43096fbc6bd4a74e45118cd108bd521934915dc64777de96c" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.516377 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.540103 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.569371 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:05:24 crc kubenswrapper[5033]: E0226 16:05:24.569889 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" containerName="dnsmasq-dns" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.569911 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" containerName="dnsmasq-dns" Feb 26 16:05:24 crc kubenswrapper[5033]: E0226 16:05:24.569934 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7261968-dd9e-4a9b-bdd9-58ee979d63ff" containerName="nova-manage" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.569941 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7261968-dd9e-4a9b-bdd9-58ee979d63ff" containerName="nova-manage" Feb 26 16:05:24 crc kubenswrapper[5033]: E0226 16:05:24.569951 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" containerName="init" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.569959 5033 
state_mem.go:107] "Deleted CPUSet assignment" podUID="22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" containerName="init" Feb 26 16:05:24 crc kubenswrapper[5033]: E0226 16:05:24.569977 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e111d211-eff3-4ec2-b648-16ec542c2900" containerName="nova-scheduler-scheduler" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.569983 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e111d211-eff3-4ec2-b648-16ec542c2900" containerName="nova-scheduler-scheduler" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.570160 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7261968-dd9e-4a9b-bdd9-58ee979d63ff" containerName="nova-manage" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.570176 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="22ffc2b4-a76b-48a9-81ff-1afa3bf72bb0" containerName="dnsmasq-dns" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.570188 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e111d211-eff3-4ec2-b648-16ec542c2900" containerName="nova-scheduler-scheduler" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.570810 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.571953 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.573848 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.708332 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.708710 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-config-data\") pod \"nova-scheduler-0\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.708743 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbgxc\" (UniqueName: \"kubernetes.io/projected/4b962555-2817-45b2-88e3-7618e6d7c74b-kube-api-access-fbgxc\") pod \"nova-scheduler-0\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.810722 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.810825 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-config-data\") pod \"nova-scheduler-0\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.810846 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbgxc\" (UniqueName: \"kubernetes.io/projected/4b962555-2817-45b2-88e3-7618e6d7c74b-kube-api-access-fbgxc\") pod \"nova-scheduler-0\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.825310 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-config-data\") pod \"nova-scheduler-0\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.825374 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.830603 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbgxc\" (UniqueName: \"kubernetes.io/projected/4b962555-2817-45b2-88e3-7618e6d7c74b-kube-api-access-fbgxc\") pod \"nova-scheduler-0\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " pod="openstack/nova-scheduler-0" Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.848246 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.848507 5033 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="56172023-8cfb-4589-809f-d83c900f90b0" containerName="kube-state-metrics" containerID="cri-o://46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4" gracePeriod=30 Feb 26 16:05:24 crc kubenswrapper[5033]: I0226 16:05:24.895020 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.426241 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.498377 5033 generic.go:334] "Generic (PLEG): container finished" podID="56172023-8cfb-4589-809f-d83c900f90b0" containerID="46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4" exitCode=2 Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.498583 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56172023-8cfb-4589-809f-d83c900f90b0","Type":"ContainerDied","Data":"46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4"} Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.498734 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56172023-8cfb-4589-809f-d83c900f90b0","Type":"ContainerDied","Data":"92bdb30a43677f35f5aa40f7babb0dbbd052ce12c4a91c2bea8dd211c2ac8697"} Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.498755 5033 scope.go:117] "RemoveContainer" containerID="46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.498655 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.531454 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcht2\" (UniqueName: \"kubernetes.io/projected/56172023-8cfb-4589-809f-d83c900f90b0-kube-api-access-hcht2\") pod \"56172023-8cfb-4589-809f-d83c900f90b0\" (UID: \"56172023-8cfb-4589-809f-d83c900f90b0\") " Feb 26 16:05:25 crc kubenswrapper[5033]: W0226 16:05:25.537148 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b962555_2817_45b2_88e3_7618e6d7c74b.slice/crio-9628a6d80e2604137f5f4125c117002f5b3a51f039fc9b83f9ead186ab9f9e2e WatchSource:0}: Error finding container 9628a6d80e2604137f5f4125c117002f5b3a51f039fc9b83f9ead186ab9f9e2e: Status 404 returned error can't find the container with id 9628a6d80e2604137f5f4125c117002f5b3a51f039fc9b83f9ead186ab9f9e2e Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.538240 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.538692 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56172023-8cfb-4589-809f-d83c900f90b0-kube-api-access-hcht2" (OuterVolumeSpecName: "kube-api-access-hcht2") pod "56172023-8cfb-4589-809f-d83c900f90b0" (UID: "56172023-8cfb-4589-809f-d83c900f90b0"). InnerVolumeSpecName "kube-api-access-hcht2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.550114 5033 scope.go:117] "RemoveContainer" containerID="46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4" Feb 26 16:05:25 crc kubenswrapper[5033]: E0226 16:05:25.551102 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4\": container with ID starting with 46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4 not found: ID does not exist" containerID="46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.551128 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4"} err="failed to get container status \"46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4\": rpc error: code = NotFound desc = could not find container \"46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4\": container with ID starting with 46384b518a10671be8799482bdfd91bb0720fae542182ee8781741c12fde17f4 not found: ID does not exist" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.633759 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcht2\" (UniqueName: \"kubernetes.io/projected/56172023-8cfb-4589-809f-d83c900f90b0-kube-api-access-hcht2\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.641161 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e111d211-eff3-4ec2-b648-16ec542c2900" path="/var/lib/kubelet/pods/e111d211-eff3-4ec2-b648-16ec542c2900/volumes" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.822413 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:05:25 
crc kubenswrapper[5033]: I0226 16:05:25.833136 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.851600 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:05:25 crc kubenswrapper[5033]: E0226 16:05:25.852429 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56172023-8cfb-4589-809f-d83c900f90b0" containerName="kube-state-metrics" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.852450 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="56172023-8cfb-4589-809f-d83c900f90b0" containerName="kube-state-metrics" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.852675 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="56172023-8cfb-4589-809f-d83c900f90b0" containerName="kube-state-metrics" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.853391 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.856197 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.856930 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.865964 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.941480 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.941603 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.941774 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndlpk\" (UniqueName: \"kubernetes.io/projected/783187df-a438-4fb2-97be-3565e08e88ac-kube-api-access-ndlpk\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:25 crc kubenswrapper[5033]: I0226 16:05:25.941839 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.042874 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.043033 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.043054 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndlpk\" (UniqueName: \"kubernetes.io/projected/783187df-a438-4fb2-97be-3565e08e88ac-kube-api-access-ndlpk\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.043148 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.047703 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.047731 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.053207 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.059090 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndlpk\" (UniqueName: \"kubernetes.io/projected/783187df-a438-4fb2-97be-3565e08e88ac-kube-api-access-ndlpk\") pod \"kube-state-metrics-0\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") " pod="openstack/kube-state-metrics-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.240302 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.440546 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.452354 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-combined-ca-bundle\") pod \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.452754 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/806654e6-e126-4cf9-a2c4-b0deb91bf75d-logs\") pod \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.452847 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdsh5\" (UniqueName: \"kubernetes.io/projected/806654e6-e126-4cf9-a2c4-b0deb91bf75d-kube-api-access-tdsh5\") pod \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.452878 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-config-data\") pod \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\" (UID: \"806654e6-e126-4cf9-a2c4-b0deb91bf75d\") " Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.458046 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/806654e6-e126-4cf9-a2c4-b0deb91bf75d-logs" (OuterVolumeSpecName: "logs") pod "806654e6-e126-4cf9-a2c4-b0deb91bf75d" (UID: "806654e6-e126-4cf9-a2c4-b0deb91bf75d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.469239 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/806654e6-e126-4cf9-a2c4-b0deb91bf75d-kube-api-access-tdsh5" (OuterVolumeSpecName: "kube-api-access-tdsh5") pod "806654e6-e126-4cf9-a2c4-b0deb91bf75d" (UID: "806654e6-e126-4cf9-a2c4-b0deb91bf75d"). InnerVolumeSpecName "kube-api-access-tdsh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.509318 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "806654e6-e126-4cf9-a2c4-b0deb91bf75d" (UID: "806654e6-e126-4cf9-a2c4-b0deb91bf75d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.512446 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-config-data" (OuterVolumeSpecName: "config-data") pod "806654e6-e126-4cf9-a2c4-b0deb91bf75d" (UID: "806654e6-e126-4cf9-a2c4-b0deb91bf75d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.539080 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b962555-2817-45b2-88e3-7618e6d7c74b","Type":"ContainerStarted","Data":"f9f7328cb4211b752ac63a6e537eb7e195ecbfc52e91f427fc9cd85c2ff35cc2"} Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.539126 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b962555-2817-45b2-88e3-7618e6d7c74b","Type":"ContainerStarted","Data":"9628a6d80e2604137f5f4125c117002f5b3a51f039fc9b83f9ead186ab9f9e2e"} Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.548052 5033 generic.go:334] "Generic (PLEG): container finished" podID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" containerID="89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668" exitCode=0 Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.548102 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"806654e6-e126-4cf9-a2c4-b0deb91bf75d","Type":"ContainerDied","Data":"89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668"} Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.548163 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"806654e6-e126-4cf9-a2c4-b0deb91bf75d","Type":"ContainerDied","Data":"ca5ec9afbde58cbe6b720a6c8d39d906173f07acc1bf6d872e21fad8501d5f8b"} Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.548163 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.548185 5033 scope.go:117] "RemoveContainer" containerID="89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.557246 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.557337 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/806654e6-e126-4cf9-a2c4-b0deb91bf75d-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.557346 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdsh5\" (UniqueName: \"kubernetes.io/projected/806654e6-e126-4cf9-a2c4-b0deb91bf75d-kube-api-access-tdsh5\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.557355 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/806654e6-e126-4cf9-a2c4-b0deb91bf75d-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.562174 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5621360810000002 podStartE2EDuration="2.562136081s" podCreationTimestamp="2026-02-26 16:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:26.555199728 +0000 UTC m=+1453.767798378" watchObservedRunningTime="2026-02-26 16:05:26.562136081 +0000 UTC m=+1453.774734731" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.585445 5033 scope.go:117] "RemoveContainer" 
containerID="68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.607723 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.611681 5033 scope.go:117] "RemoveContainer" containerID="89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668" Feb 26 16:05:26 crc kubenswrapper[5033]: E0226 16:05:26.612154 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668\": container with ID starting with 89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668 not found: ID does not exist" containerID="89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.612179 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668"} err="failed to get container status \"89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668\": rpc error: code = NotFound desc = could not find container \"89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668\": container with ID starting with 89bce08fdf5a493de1c8bb0a708771859a0a7a1aa6edbb5cfeafcfa698bed668 not found: ID does not exist" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.612202 5033 scope.go:117] "RemoveContainer" containerID="68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb" Feb 26 16:05:26 crc kubenswrapper[5033]: E0226 16:05:26.612653 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb\": container with ID starting with 68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb 
not found: ID does not exist" containerID="68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.612676 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb"} err="failed to get container status \"68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb\": rpc error: code = NotFound desc = could not find container \"68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb\": container with ID starting with 68fc290df282466cbb6aa9169df8febad5c0928511ac8e4fc51acbc0acf5eecb not found: ID does not exist" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.627350 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.636928 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:26 crc kubenswrapper[5033]: E0226 16:05:26.637389 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" containerName="nova-api-api" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.637406 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" containerName="nova-api-api" Feb 26 16:05:26 crc kubenswrapper[5033]: E0226 16:05:26.637423 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" containerName="nova-api-log" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.637429 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" containerName="nova-api-log" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.637612 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" containerName="nova-api-api" Feb 26 16:05:26 crc 
kubenswrapper[5033]: I0226 16:05:26.637628 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" containerName="nova-api-log" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.638549 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.644314 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.644604 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.661613 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c553b0-b824-4e07-b793-5464f712f5b3-logs\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.661851 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.661923 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jq8c\" (UniqueName: \"kubernetes.io/projected/49c553b0-b824-4e07-b793-5464f712f5b3-kube-api-access-9jq8c\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.662210 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-config-data\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.763046 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c553b0-b824-4e07-b793-5464f712f5b3-logs\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.763170 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.763571 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c553b0-b824-4e07-b793-5464f712f5b3-logs\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.763768 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jq8c\" (UniqueName: \"kubernetes.io/projected/49c553b0-b824-4e07-b793-5464f712f5b3-kube-api-access-9jq8c\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.763863 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-config-data\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.767067 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.767705 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-config-data\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.769977 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.779013 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.787572 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jq8c\" (UniqueName: \"kubernetes.io/projected/49c553b0-b824-4e07-b793-5464f712f5b3-kube-api-access-9jq8c\") pod \"nova-api-0\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " pod="openstack/nova-api-0" Feb 26 16:05:26 crc kubenswrapper[5033]: I0226 16:05:26.954814 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.046851 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.047373 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="ceilometer-central-agent" containerID="cri-o://dee9fbd05301ab5f47afe0a7e479dc4bc0f902a25bd2626ead447affce416022" gracePeriod=30 Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.047598 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="ceilometer-notification-agent" containerID="cri-o://24305472fa0cdb496d359a39a84c66512285d81602f08916e066bc8324c14c1b" gracePeriod=30 Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.047682 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="sg-core" containerID="cri-o://8cc13573b7d9023710ae997b225e5ff23a3f72bde026d3f93ec9d6b11513d354" gracePeriod=30 Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.047742 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="proxy-httpd" containerID="cri-o://cbe9726cf4a217b2b29994ea6f8db6c5201d59ca21a2c14424b5e006d60cd949" gracePeriod=30 Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.549855 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:27 crc kubenswrapper[5033]: W0226 16:05:27.555437 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49c553b0_b824_4e07_b793_5464f712f5b3.slice/crio-7aeb6e33eeb9fb1853dfed50acd55b708546d83d573361fb2cab4bc1fb30d225 WatchSource:0}: Error finding container 7aeb6e33eeb9fb1853dfed50acd55b708546d83d573361fb2cab4bc1fb30d225: Status 404 returned error can't find the container with id 7aeb6e33eeb9fb1853dfed50acd55b708546d83d573361fb2cab4bc1fb30d225 Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.569385 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"783187df-a438-4fb2-97be-3565e08e88ac","Type":"ContainerStarted","Data":"6cf62275935f253d6f1c1359b42290db1153e8c9f13019cbba8b8a9eee589f3f"} Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.572959 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49c553b0-b824-4e07-b793-5464f712f5b3","Type":"ContainerStarted","Data":"7aeb6e33eeb9fb1853dfed50acd55b708546d83d573361fb2cab4bc1fb30d225"} Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.575581 5033 generic.go:334] "Generic (PLEG): container finished" podID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerID="cbe9726cf4a217b2b29994ea6f8db6c5201d59ca21a2c14424b5e006d60cd949" exitCode=0 Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.575624 5033 generic.go:334] "Generic (PLEG): container finished" podID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerID="8cc13573b7d9023710ae997b225e5ff23a3f72bde026d3f93ec9d6b11513d354" exitCode=2 Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.575656 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80cc4e45-9683-48f8-8790-6b54b42d8b7f","Type":"ContainerDied","Data":"cbe9726cf4a217b2b29994ea6f8db6c5201d59ca21a2c14424b5e006d60cd949"} Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.575706 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"80cc4e45-9683-48f8-8790-6b54b42d8b7f","Type":"ContainerDied","Data":"8cc13573b7d9023710ae997b225e5ff23a3f72bde026d3f93ec9d6b11513d354"} Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.640125 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56172023-8cfb-4589-809f-d83c900f90b0" path="/var/lib/kubelet/pods/56172023-8cfb-4589-809f-d83c900f90b0/volumes" Feb 26 16:05:27 crc kubenswrapper[5033]: I0226 16:05:27.640975 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="806654e6-e126-4cf9-a2c4-b0deb91bf75d" path="/var/lib/kubelet/pods/806654e6-e126-4cf9-a2c4-b0deb91bf75d/volumes" Feb 26 16:05:28 crc kubenswrapper[5033]: I0226 16:05:28.615739 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49c553b0-b824-4e07-b793-5464f712f5b3","Type":"ContainerStarted","Data":"92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e"} Feb 26 16:05:28 crc kubenswrapper[5033]: I0226 16:05:28.616967 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49c553b0-b824-4e07-b793-5464f712f5b3","Type":"ContainerStarted","Data":"fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481"} Feb 26 16:05:28 crc kubenswrapper[5033]: I0226 16:05:28.647156 5033 generic.go:334] "Generic (PLEG): container finished" podID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerID="dee9fbd05301ab5f47afe0a7e479dc4bc0f902a25bd2626ead447affce416022" exitCode=0 Feb 26 16:05:28 crc kubenswrapper[5033]: I0226 16:05:28.647243 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80cc4e45-9683-48f8-8790-6b54b42d8b7f","Type":"ContainerDied","Data":"dee9fbd05301ab5f47afe0a7e479dc4bc0f902a25bd2626ead447affce416022"} Feb 26 16:05:28 crc kubenswrapper[5033]: I0226 16:05:28.653227 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"783187df-a438-4fb2-97be-3565e08e88ac","Type":"ContainerStarted","Data":"33f8af65abd2984345edb8a627a3f3c55b944bdf81de100064126a81f6d0b224"} Feb 26 16:05:28 crc kubenswrapper[5033]: I0226 16:05:28.654753 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 26 16:05:28 crc kubenswrapper[5033]: I0226 16:05:28.665827 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.665807283 podStartE2EDuration="2.665807283s" podCreationTimestamp="2026-02-26 16:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:28.65047532 +0000 UTC m=+1455.863073960" watchObservedRunningTime="2026-02-26 16:05:28.665807283 +0000 UTC m=+1455.878405933" Feb 26 16:05:28 crc kubenswrapper[5033]: I0226 16:05:28.678115 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.186614701 podStartE2EDuration="3.678096539s" podCreationTimestamp="2026-02-26 16:05:25 +0000 UTC" firstStartedPulling="2026-02-26 16:05:26.778513763 +0000 UTC m=+1453.991112423" lastFinishedPulling="2026-02-26 16:05:27.269995611 +0000 UTC m=+1454.482594261" observedRunningTime="2026-02-26 16:05:28.671358181 +0000 UTC m=+1455.883956831" watchObservedRunningTime="2026-02-26 16:05:28.678096539 +0000 UTC m=+1455.890695189" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.498285 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.498697 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.575661 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.666084 5033 generic.go:334] "Generic (PLEG): container finished" podID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerID="24305472fa0cdb496d359a39a84c66512285d81602f08916e066bc8324c14c1b" exitCode=0 Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.666186 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80cc4e45-9683-48f8-8790-6b54b42d8b7f","Type":"ContainerDied","Data":"24305472fa0cdb496d359a39a84c66512285d81602f08916e066bc8324c14c1b"} Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.725029 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mwzpm" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.801197 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.833792 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-run-httpd\") pod \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.834598 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "80cc4e45-9683-48f8-8790-6b54b42d8b7f" (UID: "80cc4e45-9683-48f8-8790-6b54b42d8b7f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.849419 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mwzpm"] Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.895093 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.934749 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-combined-ca-bundle\") pod \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.934865 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-scripts\") pod \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.935494 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkvf4\" (UniqueName: \"kubernetes.io/projected/80cc4e45-9683-48f8-8790-6b54b42d8b7f-kube-api-access-bkvf4\") pod \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.937335 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-sg-core-conf-yaml\") pod \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.937483 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-log-httpd\") pod \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.937566 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-config-data\") pod \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\" (UID: \"80cc4e45-9683-48f8-8790-6b54b42d8b7f\") " Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.938209 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.939465 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "80cc4e45-9683-48f8-8790-6b54b42d8b7f" (UID: "80cc4e45-9683-48f8-8790-6b54b42d8b7f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.943222 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-scripts" (OuterVolumeSpecName: "scripts") pod "80cc4e45-9683-48f8-8790-6b54b42d8b7f" (UID: "80cc4e45-9683-48f8-8790-6b54b42d8b7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.943310 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80cc4e45-9683-48f8-8790-6b54b42d8b7f-kube-api-access-bkvf4" (OuterVolumeSpecName: "kube-api-access-bkvf4") pod "80cc4e45-9683-48f8-8790-6b54b42d8b7f" (UID: "80cc4e45-9683-48f8-8790-6b54b42d8b7f"). 
InnerVolumeSpecName "kube-api-access-bkvf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:29 crc kubenswrapper[5033]: I0226 16:05:29.977116 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "80cc4e45-9683-48f8-8790-6b54b42d8b7f" (UID: "80cc4e45-9683-48f8-8790-6b54b42d8b7f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.029144 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80cc4e45-9683-48f8-8790-6b54b42d8b7f" (UID: "80cc4e45-9683-48f8-8790-6b54b42d8b7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.041748 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.041782 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.041792 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkvf4\" (UniqueName: \"kubernetes.io/projected/80cc4e45-9683-48f8-8790-6b54b42d8b7f-kube-api-access-bkvf4\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.041802 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.041811 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cc4e45-9683-48f8-8790-6b54b42d8b7f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.045160 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-config-data" (OuterVolumeSpecName: "config-data") pod "80cc4e45-9683-48f8-8790-6b54b42d8b7f" (UID: "80cc4e45-9683-48f8-8790-6b54b42d8b7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.143799 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80cc4e45-9683-48f8-8790-6b54b42d8b7f-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.677353 5033 generic.go:334] "Generic (PLEG): container finished" podID="739b08e5-9e18-4d87-aded-724b5d2ab937" containerID="a3eb7e03f8fefb3acf5770d5fe9e48de5bd74bdc5f586a4270cbdd4bdb99df39" exitCode=0 Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.677454 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8h5d9" event={"ID":"739b08e5-9e18-4d87-aded-724b5d2ab937","Type":"ContainerDied","Data":"a3eb7e03f8fefb3acf5770d5fe9e48de5bd74bdc5f586a4270cbdd4bdb99df39"} Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.682337 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.687053 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80cc4e45-9683-48f8-8790-6b54b42d8b7f","Type":"ContainerDied","Data":"155c1a12324b17b5b7a5776f9b3c5ac70902bbe9fa58884acc14fccb6c8f8345"} Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.687435 5033 scope.go:117] "RemoveContainer" containerID="cbe9726cf4a217b2b29994ea6f8db6c5201d59ca21a2c14424b5e006d60cd949" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.716868 5033 scope.go:117] "RemoveContainer" containerID="8cc13573b7d9023710ae997b225e5ff23a3f72bde026d3f93ec9d6b11513d354" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.719200 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.727600 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.749861 5033 scope.go:117] "RemoveContainer" containerID="24305472fa0cdb496d359a39a84c66512285d81602f08916e066bc8324c14c1b" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.753034 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:30 crc kubenswrapper[5033]: E0226 16:05:30.753504 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="proxy-httpd" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.753519 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="proxy-httpd" Feb 26 16:05:30 crc kubenswrapper[5033]: E0226 16:05:30.753530 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="sg-core" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.753537 5033 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="sg-core" Feb 26 16:05:30 crc kubenswrapper[5033]: E0226 16:05:30.753553 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="ceilometer-notification-agent" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.753562 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="ceilometer-notification-agent" Feb 26 16:05:30 crc kubenswrapper[5033]: E0226 16:05:30.753581 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="ceilometer-central-agent" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.753588 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="ceilometer-central-agent" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.755471 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="sg-core" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.755507 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="ceilometer-central-agent" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.755519 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="proxy-httpd" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.755549 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" containerName="ceilometer-notification-agent" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.758209 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.763133 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.763581 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.766000 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.766541 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.804356 5033 scope.go:117] "RemoveContainer" containerID="dee9fbd05301ab5f47afe0a7e479dc4bc0f902a25bd2626ead447affce416022" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.861985 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-run-httpd\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.862049 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.862138 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " 
pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.862161 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-scripts\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.862368 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-config-data\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.862411 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcsmz\" (UniqueName: \"kubernetes.io/projected/c945adfe-427a-48ca-8a7a-035e620e2b6f-kube-api-access-wcsmz\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.862443 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-log-httpd\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.862492 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.964596 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-run-httpd\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.964641 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.964675 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.964692 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-scripts\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.964794 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-config-data\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.964819 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcsmz\" (UniqueName: \"kubernetes.io/projected/c945adfe-427a-48ca-8a7a-035e620e2b6f-kube-api-access-wcsmz\") pod \"ceilometer-0\" (UID: 
\"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.964845 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-log-httpd\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.964877 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.966824 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-run-httpd\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.967190 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-log-httpd\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.970537 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-scripts\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.973875 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.974439 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-config-data\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.975136 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.981513 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:30 crc kubenswrapper[5033]: I0226 16:05:30.986113 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcsmz\" (UniqueName: \"kubernetes.io/projected/c945adfe-427a-48ca-8a7a-035e620e2b6f-kube-api-access-wcsmz\") pod \"ceilometer-0\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " pod="openstack/ceilometer-0" Feb 26 16:05:31 crc kubenswrapper[5033]: I0226 16:05:31.088563 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 16:05:31 crc kubenswrapper[5033]: I0226 16:05:31.587975 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 16:05:31 crc kubenswrapper[5033]: I0226 16:05:31.662048 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80cc4e45-9683-48f8-8790-6b54b42d8b7f" path="/var/lib/kubelet/pods/80cc4e45-9683-48f8-8790-6b54b42d8b7f/volumes"
Feb 26 16:05:31 crc kubenswrapper[5033]: I0226 16:05:31.693852 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c945adfe-427a-48ca-8a7a-035e620e2b6f","Type":"ContainerStarted","Data":"0da37745bf528a4271c979da7156527b3b4452f9e33efa33c7e374812c83f154"}
Feb 26 16:05:31 crc kubenswrapper[5033]: I0226 16:05:31.693969 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mwzpm" podUID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" containerName="registry-server" containerID="cri-o://45be7f415ab0fee8f35208d38c1c974ead3ecabc533b9ab507a29fb9844f3d2b" gracePeriod=2
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.074322 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8h5d9"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.089772 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-scripts\") pod \"739b08e5-9e18-4d87-aded-724b5d2ab937\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") "
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.089840 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl5jm\" (UniqueName: \"kubernetes.io/projected/739b08e5-9e18-4d87-aded-724b5d2ab937-kube-api-access-sl5jm\") pod \"739b08e5-9e18-4d87-aded-724b5d2ab937\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") "
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.089882 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-combined-ca-bundle\") pod \"739b08e5-9e18-4d87-aded-724b5d2ab937\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") "
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.090087 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-config-data\") pod \"739b08e5-9e18-4d87-aded-724b5d2ab937\" (UID: \"739b08e5-9e18-4d87-aded-724b5d2ab937\") "
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.096586 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739b08e5-9e18-4d87-aded-724b5d2ab937-kube-api-access-sl5jm" (OuterVolumeSpecName: "kube-api-access-sl5jm") pod "739b08e5-9e18-4d87-aded-724b5d2ab937" (UID: "739b08e5-9e18-4d87-aded-724b5d2ab937"). InnerVolumeSpecName "kube-api-access-sl5jm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.102405 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-scripts" (OuterVolumeSpecName: "scripts") pod "739b08e5-9e18-4d87-aded-724b5d2ab937" (UID: "739b08e5-9e18-4d87-aded-724b5d2ab937"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.129563 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "739b08e5-9e18-4d87-aded-724b5d2ab937" (UID: "739b08e5-9e18-4d87-aded-724b5d2ab937"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.134580 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-config-data" (OuterVolumeSpecName: "config-data") pod "739b08e5-9e18-4d87-aded-724b5d2ab937" (UID: "739b08e5-9e18-4d87-aded-724b5d2ab937"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.192834 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.192879 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.192891 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl5jm\" (UniqueName: \"kubernetes.io/projected/739b08e5-9e18-4d87-aded-724b5d2ab937-kube-api-access-sl5jm\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.192905 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739b08e5-9e18-4d87-aded-724b5d2ab937-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.705726 5033 generic.go:334] "Generic (PLEG): container finished" podID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" containerID="45be7f415ab0fee8f35208d38c1c974ead3ecabc533b9ab507a29fb9844f3d2b" exitCode=0
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.705801 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwzpm" event={"ID":"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61","Type":"ContainerDied","Data":"45be7f415ab0fee8f35208d38c1c974ead3ecabc533b9ab507a29fb9844f3d2b"}
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.711080 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8h5d9" event={"ID":"739b08e5-9e18-4d87-aded-724b5d2ab937","Type":"ContainerDied","Data":"421408f12463c67a202d5069b8adb195e8eae4b9ab71d9a5d36e1f249c5683f8"}
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.711118 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="421408f12463c67a202d5069b8adb195e8eae4b9ab71d9a5d36e1f249c5683f8"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.711179 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8h5d9"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.791792 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 26 16:05:32 crc kubenswrapper[5033]: E0226 16:05:32.792252 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739b08e5-9e18-4d87-aded-724b5d2ab937" containerName="nova-cell1-conductor-db-sync"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.792286 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="739b08e5-9e18-4d87-aded-724b5d2ab937" containerName="nova-cell1-conductor-db-sync"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.792494 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="739b08e5-9e18-4d87-aded-724b5d2ab937" containerName="nova-cell1-conductor-db-sync"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.793228 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.795163 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.807578 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.830186 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mwzpm"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.905574 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-utilities\") pod \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") "
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.905679 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggjdt\" (UniqueName: \"kubernetes.io/projected/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-kube-api-access-ggjdt\") pod \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") "
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.905812 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-catalog-content\") pod \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\" (UID: \"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61\") "
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.906126 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.906172 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.906211 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w8q8\" (UniqueName: \"kubernetes.io/projected/17f57fe7-555b-4119-b03e-fafb151494a6-kube-api-access-2w8q8\") pod \"nova-cell1-conductor-0\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.907414 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-utilities" (OuterVolumeSpecName: "utilities") pod "b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" (UID: "b3e4fe96-44fe-4cb0-94b3-f1df3b376c61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:05:32 crc kubenswrapper[5033]: I0226 16:05:32.915631 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-kube-api-access-ggjdt" (OuterVolumeSpecName: "kube-api-access-ggjdt") pod "b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" (UID: "b3e4fe96-44fe-4cb0-94b3-f1df3b376c61"). InnerVolumeSpecName "kube-api-access-ggjdt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.007844 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.007926 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w8q8\" (UniqueName: \"kubernetes.io/projected/17f57fe7-555b-4119-b03e-fafb151494a6-kube-api-access-2w8q8\") pod \"nova-cell1-conductor-0\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.008078 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.008138 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.008154 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggjdt\" (UniqueName: \"kubernetes.io/projected/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-kube-api-access-ggjdt\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.011805 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.013115 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.028747 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w8q8\" (UniqueName: \"kubernetes.io/projected/17f57fe7-555b-4119-b03e-fafb151494a6-kube-api-access-2w8q8\") pod \"nova-cell1-conductor-0\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.049913 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" (UID: "b3e4fe96-44fe-4cb0-94b3-f1df3b376c61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.110515 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.158509 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.621050 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 26 16:05:33 crc kubenswrapper[5033]: W0226 16:05:33.624570 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17f57fe7_555b_4119_b03e_fafb151494a6.slice/crio-4b2513f16ab794f9288ec4293871e5005e244c973f84af173163d7308165c27c WatchSource:0}: Error finding container 4b2513f16ab794f9288ec4293871e5005e244c973f84af173163d7308165c27c: Status 404 returned error can't find the container with id 4b2513f16ab794f9288ec4293871e5005e244c973f84af173163d7308165c27c
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.721978 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"17f57fe7-555b-4119-b03e-fafb151494a6","Type":"ContainerStarted","Data":"4b2513f16ab794f9288ec4293871e5005e244c973f84af173163d7308165c27c"}
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.724032 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c945adfe-427a-48ca-8a7a-035e620e2b6f","Type":"ContainerStarted","Data":"67bec2447de95a6f4d02b5d04a911eb3249b53de745d14913307fd0c82293865"}
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.724065 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c945adfe-427a-48ca-8a7a-035e620e2b6f","Type":"ContainerStarted","Data":"1b5be0c22eca389b1ced070d1b6beb048f4cb5af28de9b78acfec76d2b7738ca"}
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.728144 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwzpm" event={"ID":"b3e4fe96-44fe-4cb0-94b3-f1df3b376c61","Type":"ContainerDied","Data":"bce7120dedfaf7bb44e21b79bdfe144a4b1dfadb032af7d3ae4cbe754e9db0f5"}
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.728195 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mwzpm"
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.728227 5033 scope.go:117] "RemoveContainer" containerID="45be7f415ab0fee8f35208d38c1c974ead3ecabc533b9ab507a29fb9844f3d2b"
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.765217 5033 scope.go:117] "RemoveContainer" containerID="0b0bbbeff38dea3622af4d23269b5487bb30760e0f9113b56f6a4223d923304c"
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.801146 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mwzpm"]
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.803866 5033 scope.go:117] "RemoveContainer" containerID="db677ade2c2920f8d4e4b57c04f72f18837b7b6e5a2611f3694ff447174f6ddc"
Feb 26 16:05:33 crc kubenswrapper[5033]: I0226 16:05:33.816301 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mwzpm"]
Feb 26 16:05:34 crc kubenswrapper[5033]: I0226 16:05:34.747834 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"17f57fe7-555b-4119-b03e-fafb151494a6","Type":"ContainerStarted","Data":"ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4"}
Feb 26 16:05:34 crc kubenswrapper[5033]: I0226 16:05:34.748586 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:34 crc kubenswrapper[5033]: I0226 16:05:34.754181 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c945adfe-427a-48ca-8a7a-035e620e2b6f","Type":"ContainerStarted","Data":"389dfb2b096768cbaa5921edcf156dea3bd1c534a671dddc196f9385beea0a50"}
Feb 26 16:05:34 crc kubenswrapper[5033]: I0226 16:05:34.778659 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.778634319 podStartE2EDuration="2.778634319s" podCreationTimestamp="2026-02-26 16:05:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:34.772173428 +0000 UTC m=+1461.984772108" watchObservedRunningTime="2026-02-26 16:05:34.778634319 +0000 UTC m=+1461.991232969"
Feb 26 16:05:34 crc kubenswrapper[5033]: I0226 16:05:34.895477 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 26 16:05:34 crc kubenswrapper[5033]: I0226 16:05:34.929441 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 26 16:05:35 crc kubenswrapper[5033]: I0226 16:05:35.640409 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" path="/var/lib/kubelet/pods/b3e4fe96-44fe-4cb0-94b3-f1df3b376c61/volumes"
Feb 26 16:05:35 crc kubenswrapper[5033]: I0226 16:05:35.803810 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 26 16:05:36 crc kubenswrapper[5033]: I0226 16:05:36.257183 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 26 16:05:36 crc kubenswrapper[5033]: I0226 16:05:36.779406 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c945adfe-427a-48ca-8a7a-035e620e2b6f","Type":"ContainerStarted","Data":"5d336d7e1dd8d2e587cb2b21bebb0119e31bd3673992ec6af254d08aad5202e9"}
Feb 26 16:05:36 crc kubenswrapper[5033]: I0226 16:05:36.809832 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.578848919 podStartE2EDuration="6.809812981s" podCreationTimestamp="2026-02-26 16:05:30 +0000 UTC" firstStartedPulling="2026-02-26 16:05:31.586032746 +0000 UTC m=+1458.798631396" lastFinishedPulling="2026-02-26 16:05:35.816996808 +0000 UTC m=+1463.029595458" observedRunningTime="2026-02-26 16:05:36.804689832 +0000 UTC m=+1464.017288482" watchObservedRunningTime="2026-02-26 16:05:36.809812981 +0000 UTC m=+1464.022411631"
Feb 26 16:05:36 crc kubenswrapper[5033]: I0226 16:05:36.955647 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 26 16:05:36 crc kubenswrapper[5033]: I0226 16:05:36.955940 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 26 16:05:37 crc kubenswrapper[5033]: I0226 16:05:37.788418 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 26 16:05:38 crc kubenswrapper[5033]: I0226 16:05:38.037588 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="49c553b0-b824-4e07-b793-5464f712f5b3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 26 16:05:38 crc kubenswrapper[5033]: I0226 16:05:38.037630 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="49c553b0-b824-4e07-b793-5464f712f5b3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 26 16:05:38 crc kubenswrapper[5033]: I0226 16:05:38.188087 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 26 16:05:46 crc kubenswrapper[5033]: I0226 16:05:46.959202 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 26 16:05:46 crc kubenswrapper[5033]: I0226 16:05:46.959787 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 26 16:05:46 crc kubenswrapper[5033]: I0226 16:05:46.960210 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 26 16:05:46 crc kubenswrapper[5033]: I0226 16:05:46.960506 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 26 16:05:46 crc kubenswrapper[5033]: I0226 16:05:46.963245 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 26 16:05:46 crc kubenswrapper[5033]: I0226 16:05:46.964804 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.164871 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-52gpj"]
Feb 26 16:05:47 crc kubenswrapper[5033]: E0226 16:05:47.169560 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" containerName="registry-server"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.169608 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" containerName="registry-server"
Feb 26 16:05:47 crc kubenswrapper[5033]: E0226 16:05:47.169627 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" containerName="extract-utilities"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.169636 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" containerName="extract-utilities"
Feb 26 16:05:47 crc kubenswrapper[5033]: E0226 16:05:47.169671 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" containerName="extract-content"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.169681 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" containerName="extract-content"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.169916 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e4fe96-44fe-4cb0-94b3-f1df3b376c61" containerName="registry-server"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.171102 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.188451 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-52gpj"]
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.289593 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-config\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.289667 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.289959 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.290144 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.290198 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhnph\" (UniqueName: \"kubernetes.io/projected/8a492ffb-344b-445f-a93c-bff302e85b06-kube-api-access-fhnph\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.290344 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.392400 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.392873 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.392906 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhnph\" (UniqueName: \"kubernetes.io/projected/8a492ffb-344b-445f-a93c-bff302e85b06-kube-api-access-fhnph\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.393791 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.393969 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.394114 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.394146 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-config\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.394200 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.394846 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-config\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.394887 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.395373 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.427472 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhnph\" (UniqueName: \"kubernetes.io/projected/8a492ffb-344b-445f-a93c-bff302e85b06-kube-api-access-fhnph\") pod \"dnsmasq-dns-cd5cbd7b9-52gpj\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.493198 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.884339 5033 generic.go:334] "Generic (PLEG): container finished" podID="215dcec4-4856-4ae9-962f-dce89506fe6e" containerID="72486677dd69741aa30e6caf62f343575f1effe8d24ad0d1094a1dd84492a921" exitCode=137
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.884428 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"215dcec4-4856-4ae9-962f-dce89506fe6e","Type":"ContainerDied","Data":"72486677dd69741aa30e6caf62f343575f1effe8d24ad0d1094a1dd84492a921"}
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.885710 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.887317 5033 generic.go:334] "Generic (PLEG): container finished" podID="ad3621e8-de22-43d7-8777-4c4920277c13" containerID="e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8" exitCode=137
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.887607 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad3621e8-de22-43d7-8777-4c4920277c13","Type":"ContainerDied","Data":"e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8"}
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.887644 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ad3621e8-de22-43d7-8777-4c4920277c13","Type":"ContainerDied","Data":"719ae6415404013b9917fe728dd0acf979f9e47d6a31ac9eeea58b761d655bd8"}
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.887665 5033 scope.go:117] "RemoveContainer" containerID="e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.894990 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.936533 5033 scope.go:117] "RemoveContainer" containerID="f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.978370 5033 scope.go:117] "RemoveContainer" containerID="e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8"
Feb 26 16:05:47 crc kubenswrapper[5033]: E0226 16:05:47.978968 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8\": container with ID starting with e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8 not found: ID does not exist" containerID="e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.979003 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8"} err="failed to get container status \"e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8\": rpc error: code = NotFound desc = could not find container \"e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8\": container with ID starting with e0e5732b1285db408a0324aac6856deff924f2bfe2fbb25da5a524e7aed81dd8 not found: ID does not exist"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.979031 5033 scope.go:117] "RemoveContainer" containerID="f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9"
Feb 26 16:05:47 crc kubenswrapper[5033]: E0226 16:05:47.979335 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9\": container with ID starting with f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9 not found: ID does not exist" containerID="f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9"
Feb 26 16:05:47 crc kubenswrapper[5033]: I0226 16:05:47.979405 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9"} err="failed to get container status \"f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9\": rpc error: code = NotFound desc = could not find container \"f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9\": container with ID starting with f6808b3f6b6ac7c84f9d487766c42503d909e4a3b0956271f3de5e9486ee87e9 not found: ID does not exist"
Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.004589 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-config-data\") pod \"ad3621e8-de22-43d7-8777-4c4920277c13\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") "
Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.004756 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-combined-ca-bundle\") pod \"215dcec4-4856-4ae9-962f-dce89506fe6e\" (UID: \"215dcec4-4856-4ae9-962f-dce89506fe6e\") "
Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.004850 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad3621e8-de22-43d7-8777-4c4920277c13-logs\") pod \"ad3621e8-de22-43d7-8777-4c4920277c13\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") "
Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.004905 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvqk7\" (UniqueName: \"kubernetes.io/projected/ad3621e8-de22-43d7-8777-4c4920277c13-kube-api-access-bvqk7\") pod \"ad3621e8-de22-43d7-8777-4c4920277c13\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") "
Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.005078 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxm4g\" (UniqueName: \"kubernetes.io/projected/215dcec4-4856-4ae9-962f-dce89506fe6e-kube-api-access-pxm4g\") pod \"215dcec4-4856-4ae9-962f-dce89506fe6e\" (UID: \"215dcec4-4856-4ae9-962f-dce89506fe6e\") "
Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.005145 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-config-data\") pod \"215dcec4-4856-4ae9-962f-dce89506fe6e\" (UID: \"215dcec4-4856-4ae9-962f-dce89506fe6e\") "
Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.005316 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-combined-ca-bundle\") pod \"ad3621e8-de22-43d7-8777-4c4920277c13\" (UID: \"ad3621e8-de22-43d7-8777-4c4920277c13\") "
Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.005800 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad3621e8-de22-43d7-8777-4c4920277c13-logs" (OuterVolumeSpecName: "logs") pod "ad3621e8-de22-43d7-8777-4c4920277c13" (UID: "ad3621e8-de22-43d7-8777-4c4920277c13"). InnerVolumeSpecName "logs".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.006011 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad3621e8-de22-43d7-8777-4c4920277c13-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.009971 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad3621e8-de22-43d7-8777-4c4920277c13-kube-api-access-bvqk7" (OuterVolumeSpecName: "kube-api-access-bvqk7") pod "ad3621e8-de22-43d7-8777-4c4920277c13" (UID: "ad3621e8-de22-43d7-8777-4c4920277c13"). InnerVolumeSpecName "kube-api-access-bvqk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.010110 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215dcec4-4856-4ae9-962f-dce89506fe6e-kube-api-access-pxm4g" (OuterVolumeSpecName: "kube-api-access-pxm4g") pod "215dcec4-4856-4ae9-962f-dce89506fe6e" (UID: "215dcec4-4856-4ae9-962f-dce89506fe6e"). InnerVolumeSpecName "kube-api-access-pxm4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.032411 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-config-data" (OuterVolumeSpecName: "config-data") pod "ad3621e8-de22-43d7-8777-4c4920277c13" (UID: "ad3621e8-de22-43d7-8777-4c4920277c13"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.038754 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "215dcec4-4856-4ae9-962f-dce89506fe6e" (UID: "215dcec4-4856-4ae9-962f-dce89506fe6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.039049 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad3621e8-de22-43d7-8777-4c4920277c13" (UID: "ad3621e8-de22-43d7-8777-4c4920277c13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.048483 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-config-data" (OuterVolumeSpecName: "config-data") pod "215dcec4-4856-4ae9-962f-dce89506fe6e" (UID: "215dcec4-4856-4ae9-962f-dce89506fe6e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.094275 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-52gpj"] Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.110967 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.111146 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.111292 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvqk7\" (UniqueName: \"kubernetes.io/projected/ad3621e8-de22-43d7-8777-4c4920277c13-kube-api-access-bvqk7\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.111410 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxm4g\" (UniqueName: \"kubernetes.io/projected/215dcec4-4856-4ae9-962f-dce89506fe6e-kube-api-access-pxm4g\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.111515 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215dcec4-4856-4ae9-962f-dce89506fe6e-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.111619 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3621e8-de22-43d7-8777-4c4920277c13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.898348 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="8a492ffb-344b-445f-a93c-bff302e85b06" containerID="fbd80b9c55c71002f8dc02d489c41145f1618aa92c65b9b62c3e9fe933c8071f" exitCode=0 Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.898476 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj" event={"ID":"8a492ffb-344b-445f-a93c-bff302e85b06","Type":"ContainerDied","Data":"fbd80b9c55c71002f8dc02d489c41145f1618aa92c65b9b62c3e9fe933c8071f"} Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.898746 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj" event={"ID":"8a492ffb-344b-445f-a93c-bff302e85b06","Type":"ContainerStarted","Data":"bd3def5d808e13012078d7e20e0c75fb0ec7250471d922654889b4f468274f6b"} Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.900280 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.902821 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.905752 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"215dcec4-4856-4ae9-962f-dce89506fe6e","Type":"ContainerDied","Data":"d6c23cd2788bd3d2e4ee60c15f0d79b19c0439764191d12b72db183902dc2460"} Feb 26 16:05:48 crc kubenswrapper[5033]: I0226 16:05:48.905815 5033 scope.go:117] "RemoveContainer" containerID="72486677dd69741aa30e6caf62f343575f1effe8d24ad0d1094a1dd84492a921" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.098034 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.108789 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.148928 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.165330 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.187157 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:05:49 crc kubenswrapper[5033]: E0226 16:05:49.196892 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215dcec4-4856-4ae9-962f-dce89506fe6e" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.196933 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="215dcec4-4856-4ae9-962f-dce89506fe6e" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 16:05:49 crc kubenswrapper[5033]: E0226 16:05:49.196951 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3621e8-de22-43d7-8777-4c4920277c13" containerName="nova-metadata-log" Feb 26 16:05:49 crc 
kubenswrapper[5033]: I0226 16:05:49.196962 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3621e8-de22-43d7-8777-4c4920277c13" containerName="nova-metadata-log" Feb 26 16:05:49 crc kubenswrapper[5033]: E0226 16:05:49.196972 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3621e8-de22-43d7-8777-4c4920277c13" containerName="nova-metadata-metadata" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.196980 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3621e8-de22-43d7-8777-4c4920277c13" containerName="nova-metadata-metadata" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.197173 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3621e8-de22-43d7-8777-4c4920277c13" containerName="nova-metadata-metadata" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.197199 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="215dcec4-4856-4ae9-962f-dce89506fe6e" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.197209 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3621e8-de22-43d7-8777-4c4920277c13" containerName="nova-metadata-log" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.197946 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.199359 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.199455 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.201992 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.202326 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.203938 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.204352 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.205049 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.205207 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.205413 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.251725 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.251851 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdltr\" (UniqueName: \"kubernetes.io/projected/93066a1a-6c0c-4b1e-a257-052a4db51d70-kube-api-access-tdltr\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.251914 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.251935 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-config-data\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.251966 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.252005 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2b25\" (UniqueName: \"kubernetes.io/projected/fcedbed9-f0d1-4c33-b601-50da4f37c971-kube-api-access-h2b25\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.252037 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.252096 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.252115 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93066a1a-6c0c-4b1e-a257-052a4db51d70-logs\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.252158 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.352919 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdltr\" (UniqueName: \"kubernetes.io/projected/93066a1a-6c0c-4b1e-a257-052a4db51d70-kube-api-access-tdltr\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.353002 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.353033 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-config-data\") pod \"nova-metadata-0\" (UID: 
\"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.353073 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.353114 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2b25\" (UniqueName: \"kubernetes.io/projected/fcedbed9-f0d1-4c33-b601-50da4f37c971-kube-api-access-h2b25\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.353151 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.353213 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.353246 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93066a1a-6c0c-4b1e-a257-052a4db51d70-logs\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 
16:05:49.353319 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.353368 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.354009 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93066a1a-6c0c-4b1e-a257-052a4db51d70-logs\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.358153 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.358324 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.359936 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.368093 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.368846 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.372040 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-config-data\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.372131 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.372527 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2b25\" (UniqueName: \"kubernetes.io/projected/fcedbed9-f0d1-4c33-b601-50da4f37c971-kube-api-access-h2b25\") pod \"nova-cell1-novncproxy-0\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") 
" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.373726 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdltr\" (UniqueName: \"kubernetes.io/projected/93066a1a-6c0c-4b1e-a257-052a4db51d70-kube-api-access-tdltr\") pod \"nova-metadata-0\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.551245 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.559362 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.643557 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215dcec4-4856-4ae9-962f-dce89506fe6e" path="/var/lib/kubelet/pods/215dcec4-4856-4ae9-962f-dce89506fe6e/volumes" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.644426 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad3621e8-de22-43d7-8777-4c4920277c13" path="/var/lib/kubelet/pods/ad3621e8-de22-43d7-8777-4c4920277c13/volumes" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.645630 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.645916 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="ceilometer-central-agent" containerID="cri-o://1b5be0c22eca389b1ced070d1b6beb048f4cb5af28de9b78acfec76d2b7738ca" gracePeriod=30 Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.645997 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" 
containerName="proxy-httpd" containerID="cri-o://5d336d7e1dd8d2e587cb2b21bebb0119e31bd3673992ec6af254d08aad5202e9" gracePeriod=30 Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.646077 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="sg-core" containerID="cri-o://389dfb2b096768cbaa5921edcf156dea3bd1c534a671dddc196f9385beea0a50" gracePeriod=30 Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.646007 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="ceilometer-notification-agent" containerID="cri-o://67bec2447de95a6f4d02b5d04a911eb3249b53de745d14913307fd0c82293865" gracePeriod=30 Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.654343 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.199:3000/\": EOF" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.813501 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.929670 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.940988 5033 generic.go:334] "Generic (PLEG): container finished" podID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerID="5d336d7e1dd8d2e587cb2b21bebb0119e31bd3673992ec6af254d08aad5202e9" exitCode=0 Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.941178 5033 generic.go:334] "Generic (PLEG): container finished" podID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerID="389dfb2b096768cbaa5921edcf156dea3bd1c534a671dddc196f9385beea0a50" exitCode=2 Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.941242 
5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c945adfe-427a-48ca-8a7a-035e620e2b6f","Type":"ContainerDied","Data":"5d336d7e1dd8d2e587cb2b21bebb0119e31bd3673992ec6af254d08aad5202e9"} Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.941339 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c945adfe-427a-48ca-8a7a-035e620e2b6f","Type":"ContainerDied","Data":"389dfb2b096768cbaa5921edcf156dea3bd1c534a671dddc196f9385beea0a50"} Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.954918 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="49c553b0-b824-4e07-b793-5464f712f5b3" containerName="nova-api-log" containerID="cri-o://fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481" gracePeriod=30 Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.955742 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj" event={"ID":"8a492ffb-344b-445f-a93c-bff302e85b06","Type":"ContainerStarted","Data":"e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781"} Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.955805 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="49c553b0-b824-4e07-b793-5464f712f5b3" containerName="nova-api-api" containerID="cri-o://92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e" gracePeriod=30 Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.956692 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj" Feb 26 16:05:49 crc kubenswrapper[5033]: I0226 16:05:49.998154 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:05:50 crc kubenswrapper[5033]: W0226 16:05:50.005495 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcedbed9_f0d1_4c33_b601_50da4f37c971.slice/crio-ecd9f15b749cf18ee568635a8a33c7bc28bd0df9ed382523cdeef7116405b0be WatchSource:0}: Error finding container ecd9f15b749cf18ee568635a8a33c7bc28bd0df9ed382523cdeef7116405b0be: Status 404 returned error can't find the container with id ecd9f15b749cf18ee568635a8a33c7bc28bd0df9ed382523cdeef7116405b0be Feb 26 16:05:50 crc kubenswrapper[5033]: I0226 16:05:50.014606 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj" podStartSLOduration=3.014583004 podStartE2EDuration="3.014583004s" podCreationTimestamp="2026-02-26 16:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:49.990727698 +0000 UTC m=+1477.203326348" watchObservedRunningTime="2026-02-26 16:05:50.014583004 +0000 UTC m=+1477.227181664" Feb 26 16:05:50 crc kubenswrapper[5033]: I0226 16:05:50.968333 5033 generic.go:334] "Generic (PLEG): container finished" podID="49c553b0-b824-4e07-b793-5464f712f5b3" containerID="fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481" exitCode=143 Feb 26 16:05:50 crc kubenswrapper[5033]: I0226 16:05:50.968641 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49c553b0-b824-4e07-b793-5464f712f5b3","Type":"ContainerDied","Data":"fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481"} Feb 26 16:05:50 crc kubenswrapper[5033]: I0226 16:05:50.971221 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fcedbed9-f0d1-4c33-b601-50da4f37c971","Type":"ContainerStarted","Data":"adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718"} Feb 26 16:05:50 crc kubenswrapper[5033]: I0226 16:05:50.971572 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fcedbed9-f0d1-4c33-b601-50da4f37c971","Type":"ContainerStarted","Data":"ecd9f15b749cf18ee568635a8a33c7bc28bd0df9ed382523cdeef7116405b0be"} Feb 26 16:05:50 crc kubenswrapper[5033]: I0226 16:05:50.975475 5033 generic.go:334] "Generic (PLEG): container finished" podID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerID="1b5be0c22eca389b1ced070d1b6beb048f4cb5af28de9b78acfec76d2b7738ca" exitCode=0 Feb 26 16:05:50 crc kubenswrapper[5033]: I0226 16:05:50.975554 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c945adfe-427a-48ca-8a7a-035e620e2b6f","Type":"ContainerDied","Data":"1b5be0c22eca389b1ced070d1b6beb048f4cb5af28de9b78acfec76d2b7738ca"} Feb 26 16:05:50 crc kubenswrapper[5033]: I0226 16:05:50.977524 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93066a1a-6c0c-4b1e-a257-052a4db51d70","Type":"ContainerStarted","Data":"dc6dc75920b601a897218ec03556e7703f8571b16b9f948992ac68c3a552a5b8"} Feb 26 16:05:50 crc kubenswrapper[5033]: I0226 16:05:50.977565 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93066a1a-6c0c-4b1e-a257-052a4db51d70","Type":"ContainerStarted","Data":"6f81ef7e60c36350746ebf868687502dc916cf71af0c8866c4d79814b51151f7"} Feb 26 16:05:50 crc kubenswrapper[5033]: I0226 16:05:50.977575 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93066a1a-6c0c-4b1e-a257-052a4db51d70","Type":"ContainerStarted","Data":"0a129f201fe5ecd7c2d443c221ac7cfab22854720817e9bcf8ccd396c379ec44"} Feb 26 16:05:50 crc kubenswrapper[5033]: I0226 16:05:50.996176 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.996156035 podStartE2EDuration="1.996156035s" podCreationTimestamp="2026-02-26 16:05:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:50.990855593 +0000 UTC m=+1478.203454263" watchObservedRunningTime="2026-02-26 16:05:50.996156035 +0000 UTC m=+1478.208754685" Feb 26 16:05:51 crc kubenswrapper[5033]: I0226 16:05:51.017761 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.017733925 podStartE2EDuration="2.017733925s" podCreationTimestamp="2026-02-26 16:05:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:51.011109999 +0000 UTC m=+1478.223708659" watchObservedRunningTime="2026-02-26 16:05:51.017733925 +0000 UTC m=+1478.230332585" Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.575358 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.740045 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-combined-ca-bundle\") pod \"49c553b0-b824-4e07-b793-5464f712f5b3\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.740164 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-config-data\") pod \"49c553b0-b824-4e07-b793-5464f712f5b3\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.740316 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jq8c\" (UniqueName: \"kubernetes.io/projected/49c553b0-b824-4e07-b793-5464f712f5b3-kube-api-access-9jq8c\") pod \"49c553b0-b824-4e07-b793-5464f712f5b3\" (UID: 
\"49c553b0-b824-4e07-b793-5464f712f5b3\") " Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.740341 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c553b0-b824-4e07-b793-5464f712f5b3-logs\") pod \"49c553b0-b824-4e07-b793-5464f712f5b3\" (UID: \"49c553b0-b824-4e07-b793-5464f712f5b3\") " Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.743296 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49c553b0-b824-4e07-b793-5464f712f5b3-logs" (OuterVolumeSpecName: "logs") pod "49c553b0-b824-4e07-b793-5464f712f5b3" (UID: "49c553b0-b824-4e07-b793-5464f712f5b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.749334 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c553b0-b824-4e07-b793-5464f712f5b3-kube-api-access-9jq8c" (OuterVolumeSpecName: "kube-api-access-9jq8c") pod "49c553b0-b824-4e07-b793-5464f712f5b3" (UID: "49c553b0-b824-4e07-b793-5464f712f5b3"). InnerVolumeSpecName "kube-api-access-9jq8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.781593 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-config-data" (OuterVolumeSpecName: "config-data") pod "49c553b0-b824-4e07-b793-5464f712f5b3" (UID: "49c553b0-b824-4e07-b793-5464f712f5b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.784252 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49c553b0-b824-4e07-b793-5464f712f5b3" (UID: "49c553b0-b824-4e07-b793-5464f712f5b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.843090 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.843442 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jq8c\" (UniqueName: \"kubernetes.io/projected/49c553b0-b824-4e07-b793-5464f712f5b3-kube-api-access-9jq8c\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.843533 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c553b0-b824-4e07-b793-5464f712f5b3-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:53 crc kubenswrapper[5033]: I0226 16:05:53.843602 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c553b0-b824-4e07-b793-5464f712f5b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.007942 5033 generic.go:334] "Generic (PLEG): container finished" podID="49c553b0-b824-4e07-b793-5464f712f5b3" containerID="92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e" exitCode=0 Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.008032 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"49c553b0-b824-4e07-b793-5464f712f5b3","Type":"ContainerDied","Data":"92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e"} Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.008088 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49c553b0-b824-4e07-b793-5464f712f5b3","Type":"ContainerDied","Data":"7aeb6e33eeb9fb1853dfed50acd55b708546d83d573361fb2cab4bc1fb30d225"} Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.008115 5033 scope.go:117] "RemoveContainer" containerID="92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.008846 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.054017 5033 scope.go:117] "RemoveContainer" containerID="fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.056964 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.079164 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.090087 5033 scope.go:117] "RemoveContainer" containerID="92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e" Feb 26 16:05:54 crc kubenswrapper[5033]: E0226 16:05:54.090578 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e\": container with ID starting with 92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e not found: ID does not exist" containerID="92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.090617 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e"} err="failed to get container status \"92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e\": rpc error: code = NotFound desc = could not find container \"92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e\": container with ID starting with 92614676091d40d94d55bd9287a65369fbf8125a99ebe2e210cd7a141bbdb13e not found: ID does not exist" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.090642 5033 scope.go:117] "RemoveContainer" containerID="fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481" Feb 26 16:05:54 crc kubenswrapper[5033]: E0226 16:05:54.090883 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481\": container with ID starting with fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481 not found: ID does not exist" containerID="fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.090911 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481"} err="failed to get container status \"fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481\": rpc error: code = NotFound desc = could not find container \"fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481\": container with ID starting with fc352fae6b8df43592911d99fd0e53ac3d9c475b556ad8509615d3b7f8b9d481 not found: ID does not exist" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.097643 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:54 crc kubenswrapper[5033]: E0226 16:05:54.098037 5033 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="49c553b0-b824-4e07-b793-5464f712f5b3" containerName="nova-api-api" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.098053 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c553b0-b824-4e07-b793-5464f712f5b3" containerName="nova-api-api" Feb 26 16:05:54 crc kubenswrapper[5033]: E0226 16:05:54.098067 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c553b0-b824-4e07-b793-5464f712f5b3" containerName="nova-api-log" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.098072 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c553b0-b824-4e07-b793-5464f712f5b3" containerName="nova-api-log" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.098289 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c553b0-b824-4e07-b793-5464f712f5b3" containerName="nova-api-api" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.098306 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c553b0-b824-4e07-b793-5464f712f5b3" containerName="nova-api-log" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.099175 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.101694 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.102354 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.108846 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.110972 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.251009 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.251062 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-config-data\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.251105 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lb62\" (UniqueName: \"kubernetes.io/projected/aeff5347-711b-419e-bc86-126ed1e45187-kube-api-access-2lb62\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.251147 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.251169 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-public-tls-certs\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.251294 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeff5347-711b-419e-bc86-126ed1e45187-logs\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.361725 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.361777 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-public-tls-certs\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.361893 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeff5347-711b-419e-bc86-126ed1e45187-logs\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 
crc kubenswrapper[5033]: I0226 16:05:54.362015 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.362051 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-config-data\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.362080 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lb62\" (UniqueName: \"kubernetes.io/projected/aeff5347-711b-419e-bc86-126ed1e45187-kube-api-access-2lb62\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.362833 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeff5347-711b-419e-bc86-126ed1e45187-logs\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.369084 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.369124 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-public-tls-certs\") pod \"nova-api-0\" 
(UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.374038 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-config-data\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.375772 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.383999 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lb62\" (UniqueName: \"kubernetes.io/projected/aeff5347-711b-419e-bc86-126ed1e45187-kube-api-access-2lb62\") pod \"nova-api-0\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.429600 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.552008 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.553368 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 16:05:54 crc kubenswrapper[5033]: I0226 16:05:54.559543 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:55 crc kubenswrapper[5033]: I0226 16:05:54.986880 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:05:55 crc kubenswrapper[5033]: I0226 16:05:55.110738 5033 generic.go:334] "Generic (PLEG): container finished" podID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerID="67bec2447de95a6f4d02b5d04a911eb3249b53de745d14913307fd0c82293865" exitCode=0 Feb 26 16:05:55 crc kubenswrapper[5033]: I0226 16:05:55.110808 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c945adfe-427a-48ca-8a7a-035e620e2b6f","Type":"ContainerDied","Data":"67bec2447de95a6f4d02b5d04a911eb3249b53de745d14913307fd0c82293865"} Feb 26 16:05:55 crc kubenswrapper[5033]: I0226 16:05:55.638387 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c553b0-b824-4e07-b793-5464f712f5b3" path="/var/lib/kubelet/pods/49c553b0-b824-4e07-b793-5464f712f5b3/volumes" Feb 26 16:05:55 crc kubenswrapper[5033]: I0226 16:05:55.860300 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.000164 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-log-httpd\") pod \"c945adfe-427a-48ca-8a7a-035e620e2b6f\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.000247 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-sg-core-conf-yaml\") pod \"c945adfe-427a-48ca-8a7a-035e620e2b6f\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.000309 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-run-httpd\") pod \"c945adfe-427a-48ca-8a7a-035e620e2b6f\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.000333 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-combined-ca-bundle\") pod \"c945adfe-427a-48ca-8a7a-035e620e2b6f\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.000388 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-ceilometer-tls-certs\") pod \"c945adfe-427a-48ca-8a7a-035e620e2b6f\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.000409 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-config-data\") pod \"c945adfe-427a-48ca-8a7a-035e620e2b6f\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.000491 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-scripts\") pod \"c945adfe-427a-48ca-8a7a-035e620e2b6f\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.000589 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcsmz\" (UniqueName: \"kubernetes.io/projected/c945adfe-427a-48ca-8a7a-035e620e2b6f-kube-api-access-wcsmz\") pod \"c945adfe-427a-48ca-8a7a-035e620e2b6f\" (UID: \"c945adfe-427a-48ca-8a7a-035e620e2b6f\") " Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.000969 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c945adfe-427a-48ca-8a7a-035e620e2b6f" (UID: "c945adfe-427a-48ca-8a7a-035e620e2b6f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.001215 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c945adfe-427a-48ca-8a7a-035e620e2b6f" (UID: "c945adfe-427a-48ca-8a7a-035e620e2b6f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.005626 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c945adfe-427a-48ca-8a7a-035e620e2b6f-kube-api-access-wcsmz" (OuterVolumeSpecName: "kube-api-access-wcsmz") pod "c945adfe-427a-48ca-8a7a-035e620e2b6f" (UID: "c945adfe-427a-48ca-8a7a-035e620e2b6f"). InnerVolumeSpecName "kube-api-access-wcsmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.006638 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-scripts" (OuterVolumeSpecName: "scripts") pod "c945adfe-427a-48ca-8a7a-035e620e2b6f" (UID: "c945adfe-427a-48ca-8a7a-035e620e2b6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.035950 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c945adfe-427a-48ca-8a7a-035e620e2b6f" (UID: "c945adfe-427a-48ca-8a7a-035e620e2b6f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.063978 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c945adfe-427a-48ca-8a7a-035e620e2b6f" (UID: "c945adfe-427a-48ca-8a7a-035e620e2b6f"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.082503 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c945adfe-427a-48ca-8a7a-035e620e2b6f" (UID: "c945adfe-427a-48ca-8a7a-035e620e2b6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.102902 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.102962 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.102972 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c945adfe-427a-48ca-8a7a-035e620e2b6f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.102980 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.103015 5033 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.103029 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.103041 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcsmz\" (UniqueName: \"kubernetes.io/projected/c945adfe-427a-48ca-8a7a-035e620e2b6f-kube-api-access-wcsmz\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.108695 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-config-data" (OuterVolumeSpecName: "config-data") pod "c945adfe-427a-48ca-8a7a-035e620e2b6f" (UID: "c945adfe-427a-48ca-8a7a-035e620e2b6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.134630 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aeff5347-711b-419e-bc86-126ed1e45187","Type":"ContainerStarted","Data":"6682204d782cfae0fde5aa67fe8994db1f41675cb1ffbfdf97c4c18d9b1f54ee"} Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.134680 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aeff5347-711b-419e-bc86-126ed1e45187","Type":"ContainerStarted","Data":"c15c13586ec73191c3332ec45d1c0c02fbd4f67cb0bcb3dc8a9cdee81b491f54"} Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.134690 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aeff5347-711b-419e-bc86-126ed1e45187","Type":"ContainerStarted","Data":"4d686e6ffda796368219ef628dd81fed9b6663d4578cbc556e73d69fb2d6c542"} Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.139418 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c945adfe-427a-48ca-8a7a-035e620e2b6f","Type":"ContainerDied","Data":"0da37745bf528a4271c979da7156527b3b4452f9e33efa33c7e374812c83f154"} 
Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.139464 5033 scope.go:117] "RemoveContainer" containerID="5d336d7e1dd8d2e587cb2b21bebb0119e31bd3673992ec6af254d08aad5202e9" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.139498 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.170516 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.170496936 podStartE2EDuration="2.170496936s" podCreationTimestamp="2026-02-26 16:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:56.162284341 +0000 UTC m=+1483.374882991" watchObservedRunningTime="2026-02-26 16:05:56.170496936 +0000 UTC m=+1483.383095586" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.176468 5033 scope.go:117] "RemoveContainer" containerID="389dfb2b096768cbaa5921edcf156dea3bd1c534a671dddc196f9385beea0a50" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.196884 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.206587 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c945adfe-427a-48ca-8a7a-035e620e2b6f-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.213151 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.224582 5033 scope.go:117] "RemoveContainer" containerID="67bec2447de95a6f4d02b5d04a911eb3249b53de745d14913307fd0c82293865" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.227958 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:56 crc 
kubenswrapper[5033]: E0226 16:05:56.228462 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="ceilometer-central-agent" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.228481 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="ceilometer-central-agent" Feb 26 16:05:56 crc kubenswrapper[5033]: E0226 16:05:56.228502 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="proxy-httpd" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.228512 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="proxy-httpd" Feb 26 16:05:56 crc kubenswrapper[5033]: E0226 16:05:56.228552 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="sg-core" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.228563 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="sg-core" Feb 26 16:05:56 crc kubenswrapper[5033]: E0226 16:05:56.228580 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="ceilometer-notification-agent" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.228586 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="ceilometer-notification-agent" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.229576 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="ceilometer-central-agent" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.229594 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="sg-core" 
Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.229620 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="ceilometer-notification-agent" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.229641 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" containerName="proxy-httpd" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.233443 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.240406 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.240741 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.240930 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.245398 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.271310 5033 scope.go:117] "RemoveContainer" containerID="1b5be0c22eca389b1ced070d1b6beb048f4cb5af28de9b78acfec76d2b7738ca" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.307911 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-run-httpd\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.308095 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-scripts\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.308124 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkp8w\" (UniqueName: \"kubernetes.io/projected/fd331de5-997f-4bb2-a032-d174a0bf059b-kube-api-access-dkp8w\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.308179 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.308225 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-config-data\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.308272 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.308328 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-log-httpd\") pod \"ceilometer-0\" (UID: 
\"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.308359 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.410225 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-run-httpd\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.410677 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-scripts\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.410759 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkp8w\" (UniqueName: \"kubernetes.io/projected/fd331de5-997f-4bb2-a032-d174a0bf059b-kube-api-access-dkp8w\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.410823 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.410913 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-config-data\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.410979 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.411063 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-log-httpd\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.411139 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.411411 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-run-httpd\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.411709 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-log-httpd\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " 
pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.418541 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.425797 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.425857 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-scripts\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.426139 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.426422 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-config-data\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.432868 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkp8w\" (UniqueName: 
\"kubernetes.io/projected/fd331de5-997f-4bb2-a032-d174a0bf059b-kube-api-access-dkp8w\") pod \"ceilometer-0\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " pod="openstack/ceilometer-0" Feb 26 16:05:56 crc kubenswrapper[5033]: I0226 16:05:56.567511 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:57 crc kubenswrapper[5033]: I0226 16:05:57.051332 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:57 crc kubenswrapper[5033]: W0226 16:05:57.065517 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd331de5_997f_4bb2_a032_d174a0bf059b.slice/crio-e16de7507f7e8a40917c116a17c4fb708d68c18d495c54e4b52aec253773b77b WatchSource:0}: Error finding container e16de7507f7e8a40917c116a17c4fb708d68c18d495c54e4b52aec253773b77b: Status 404 returned error can't find the container with id e16de7507f7e8a40917c116a17c4fb708d68c18d495c54e4b52aec253773b77b Feb 26 16:05:57 crc kubenswrapper[5033]: I0226 16:05:57.150766 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd331de5-997f-4bb2-a032-d174a0bf059b","Type":"ContainerStarted","Data":"e16de7507f7e8a40917c116a17c4fb708d68c18d495c54e4b52aec253773b77b"} Feb 26 16:05:57 crc kubenswrapper[5033]: I0226 16:05:57.495606 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj" Feb 26 16:05:57 crc kubenswrapper[5033]: I0226 16:05:57.557120 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nznck"] Feb 26 16:05:57 crc kubenswrapper[5033]: I0226 16:05:57.557443 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-nznck" podUID="1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" containerName="dnsmasq-dns" 
containerID="cri-o://7ae2a7fc7a368e984faab908fd0ee5c2e4b43b945d8192bc35974c1d60d1b3bc" gracePeriod=10 Feb 26 16:05:57 crc kubenswrapper[5033]: I0226 16:05:57.642378 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c945adfe-427a-48ca-8a7a-035e620e2b6f" path="/var/lib/kubelet/pods/c945adfe-427a-48ca-8a7a-035e620e2b6f/volumes" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.194847 5033 generic.go:334] "Generic (PLEG): container finished" podID="1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" containerID="7ae2a7fc7a368e984faab908fd0ee5c2e4b43b945d8192bc35974c1d60d1b3bc" exitCode=0 Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.195008 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nznck" event={"ID":"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4","Type":"ContainerDied","Data":"7ae2a7fc7a368e984faab908fd0ee5c2e4b43b945d8192bc35974c1d60d1b3bc"} Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.198036 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd331de5-997f-4bb2-a032-d174a0bf059b","Type":"ContainerStarted","Data":"efe61e5ba101f92c1e8b4416d9d0570b31c5948ae215ef72723140bf034dbde5"} Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.567079 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.768393 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-sb\") pod \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.768468 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-svc\") pod \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.768570 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cgck\" (UniqueName: \"kubernetes.io/projected/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-kube-api-access-2cgck\") pod \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.768648 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-nb\") pod \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.768732 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-config\") pod \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.768753 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-swift-storage-0\") pod \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\" (UID: \"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4\") " Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.783498 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-kube-api-access-2cgck" (OuterVolumeSpecName: "kube-api-access-2cgck") pod "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" (UID: "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4"). InnerVolumeSpecName "kube-api-access-2cgck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.860352 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" (UID: "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.860360 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" (UID: "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.872655 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cgck\" (UniqueName: \"kubernetes.io/projected/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-kube-api-access-2cgck\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.872710 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.872723 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.885248 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" (UID: "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.926787 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-config" (OuterVolumeSpecName: "config") pod "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" (UID: "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.944759 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" (UID: "1398e63b-2d24-47ab-a400-8a8b3d3d8cd4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.975012 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.975120 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:58 crc kubenswrapper[5033]: I0226 16:05:58.975175 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.207823 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd331de5-997f-4bb2-a032-d174a0bf059b","Type":"ContainerStarted","Data":"7c2836b72b99580d0853eb6d7ae0794bc4651ce530042e155abe1f4de07321f5"} Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.210629 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-nznck" event={"ID":"1398e63b-2d24-47ab-a400-8a8b3d3d8cd4","Type":"ContainerDied","Data":"549e66a07903658736d752f8cdbd9222a862c59dad83562f770c87c356072bce"} Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.210718 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-nznck" Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.210760 5033 scope.go:117] "RemoveContainer" containerID="7ae2a7fc7a368e984faab908fd0ee5c2e4b43b945d8192bc35974c1d60d1b3bc" Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.242864 5033 scope.go:117] "RemoveContainer" containerID="1521926132cf89bcf3a78d3cf72fa7e05c7325a78bb48002c8d3030ca6645460" Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.258441 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nznck"] Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.270844 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-nznck"] Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.552312 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.554122 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.560383 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.581776 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:05:59 crc kubenswrapper[5033]: I0226 16:05:59.640440 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" path="/var/lib/kubelet/pods/1398e63b-2d24-47ab-a400-8a8b3d3d8cd4/volumes" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.132744 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535366-kk997"] Feb 26 16:06:00 crc kubenswrapper[5033]: E0226 16:06:00.133231 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" containerName="dnsmasq-dns" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.133283 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" containerName="dnsmasq-dns" Feb 26 16:06:00 crc kubenswrapper[5033]: E0226 16:06:00.133297 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" containerName="init" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.133305 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" containerName="init" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.133626 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1398e63b-2d24-47ab-a400-8a8b3d3d8cd4" containerName="dnsmasq-dns" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.134476 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535366-kk997" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.137537 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.137745 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.137668 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.143492 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535366-kk997"] Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.259272 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.298990 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbcwv\" (UniqueName: \"kubernetes.io/projected/e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f-kube-api-access-sbcwv\") pod \"auto-csr-approver-29535366-kk997\" (UID: \"e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f\") " pod="openshift-infra/auto-csr-approver-29535366-kk997" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.401412 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbcwv\" (UniqueName: \"kubernetes.io/projected/e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f-kube-api-access-sbcwv\") pod \"auto-csr-approver-29535366-kk997\" (UID: \"e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f\") " pod="openshift-infra/auto-csr-approver-29535366-kk997" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.419139 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-x8mq4"] Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.420720 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.424646 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.426891 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbcwv\" (UniqueName: \"kubernetes.io/projected/e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f-kube-api-access-sbcwv\") pod \"auto-csr-approver-29535366-kk997\" (UID: \"e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f\") " pod="openshift-infra/auto-csr-approver-29535366-kk997" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.444663 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.453758 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535366-kk997" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.473637 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-x8mq4"] Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.567457 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.567538 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.607431 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4r4m\" (UniqueName: \"kubernetes.io/projected/ec28e69b-de68-42a0-bc60-c41a9d87c90e-kube-api-access-s4r4m\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.607750 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-scripts\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.607785 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-config-data\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.608194 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.710325 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-config-data\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.710486 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.710577 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4r4m\" (UniqueName: \"kubernetes.io/projected/ec28e69b-de68-42a0-bc60-c41a9d87c90e-kube-api-access-s4r4m\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.710605 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-scripts\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.718500 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-config-data\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.720127 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.731723 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-scripts\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.735953 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4r4m\" (UniqueName: \"kubernetes.io/projected/ec28e69b-de68-42a0-bc60-c41a9d87c90e-kube-api-access-s4r4m\") pod \"nova-cell1-cell-mapping-x8mq4\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.802574 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:00 crc kubenswrapper[5033]: I0226 16:06:00.844055 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535366-kk997"] Feb 26 16:06:00 crc kubenswrapper[5033]: W0226 16:06:00.847738 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9e2e282_a5ab_4fa8_89e9_b7eee6275f4f.slice/crio-a39a04422845fe87a2a2defd77d4c323caa288ba7b9fc7f693397df736323972 WatchSource:0}: Error finding container a39a04422845fe87a2a2defd77d4c323caa288ba7b9fc7f693397df736323972: Status 404 returned error can't find the container with id a39a04422845fe87a2a2defd77d4c323caa288ba7b9fc7f693397df736323972 Feb 26 16:06:01 crc kubenswrapper[5033]: I0226 16:06:01.250459 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535366-kk997" event={"ID":"e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f","Type":"ContainerStarted","Data":"a39a04422845fe87a2a2defd77d4c323caa288ba7b9fc7f693397df736323972"} Feb 26 16:06:01 crc kubenswrapper[5033]: I0226 16:06:01.254425 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd331de5-997f-4bb2-a032-d174a0bf059b","Type":"ContainerStarted","Data":"274baa6f0fac074022c217f76e1ef56693c6f2d00d433a2f9b4042c158da447b"} Feb 26 16:06:01 crc kubenswrapper[5033]: I0226 16:06:01.305029 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-x8mq4"] Feb 26 16:06:01 crc kubenswrapper[5033]: W0226 16:06:01.321182 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec28e69b_de68_42a0_bc60_c41a9d87c90e.slice/crio-05f1f427c6f915dc8d4efc377047411755d9214aac58f746fa05b76f3f74f0b0 WatchSource:0}: Error finding container 05f1f427c6f915dc8d4efc377047411755d9214aac58f746fa05b76f3f74f0b0: Status 
404 returned error can't find the container with id 05f1f427c6f915dc8d4efc377047411755d9214aac58f746fa05b76f3f74f0b0 Feb 26 16:06:02 crc kubenswrapper[5033]: I0226 16:06:02.270816 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535366-kk997" event={"ID":"e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f","Type":"ContainerStarted","Data":"5023613edb03d8f6e62ea75ac5a8258a7c708083278aa3e2673f05799f53cdb1"} Feb 26 16:06:02 crc kubenswrapper[5033]: I0226 16:06:02.275875 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x8mq4" event={"ID":"ec28e69b-de68-42a0-bc60-c41a9d87c90e","Type":"ContainerStarted","Data":"ea8b8c6bf7109310286c3358cc85b56b065d8fb055d796d627b747c0847016e4"} Feb 26 16:06:02 crc kubenswrapper[5033]: I0226 16:06:02.275913 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x8mq4" event={"ID":"ec28e69b-de68-42a0-bc60-c41a9d87c90e","Type":"ContainerStarted","Data":"05f1f427c6f915dc8d4efc377047411755d9214aac58f746fa05b76f3f74f0b0"} Feb 26 16:06:02 crc kubenswrapper[5033]: I0226 16:06:02.300086 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535366-kk997" podStartSLOduration=1.241221058 podStartE2EDuration="2.300065778s" podCreationTimestamp="2026-02-26 16:06:00 +0000 UTC" firstStartedPulling="2026-02-26 16:06:00.850364543 +0000 UTC m=+1488.062963193" lastFinishedPulling="2026-02-26 16:06:01.909209263 +0000 UTC m=+1489.121807913" observedRunningTime="2026-02-26 16:06:02.285307642 +0000 UTC m=+1489.497906292" watchObservedRunningTime="2026-02-26 16:06:02.300065778 +0000 UTC m=+1489.512664438" Feb 26 16:06:02 crc kubenswrapper[5033]: I0226 16:06:02.309989 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-x8mq4" podStartSLOduration=2.309970953 podStartE2EDuration="2.309970953s" podCreationTimestamp="2026-02-26 
16:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:06:02.30825057 +0000 UTC m=+1489.520849230" watchObservedRunningTime="2026-02-26 16:06:02.309970953 +0000 UTC m=+1489.522569613" Feb 26 16:06:03 crc kubenswrapper[5033]: I0226 16:06:03.287493 5033 generic.go:334] "Generic (PLEG): container finished" podID="e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f" containerID="5023613edb03d8f6e62ea75ac5a8258a7c708083278aa3e2673f05799f53cdb1" exitCode=0 Feb 26 16:06:03 crc kubenswrapper[5033]: I0226 16:06:03.287674 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535366-kk997" event={"ID":"e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f","Type":"ContainerDied","Data":"5023613edb03d8f6e62ea75ac5a8258a7c708083278aa3e2673f05799f53cdb1"} Feb 26 16:06:03 crc kubenswrapper[5033]: I0226 16:06:03.294191 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd331de5-997f-4bb2-a032-d174a0bf059b","Type":"ContainerStarted","Data":"fd787498335d14e2033ee5b470f24eb5998e703815bbba943d72e487ec251728"} Feb 26 16:06:03 crc kubenswrapper[5033]: I0226 16:06:03.294239 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 16:06:03 crc kubenswrapper[5033]: I0226 16:06:03.332675 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.197517981 podStartE2EDuration="7.332655289s" podCreationTimestamp="2026-02-26 16:05:56 +0000 UTC" firstStartedPulling="2026-02-26 16:05:57.068401539 +0000 UTC m=+1484.281000189" lastFinishedPulling="2026-02-26 16:06:02.203538847 +0000 UTC m=+1489.416137497" observedRunningTime="2026-02-26 16:06:03.326820915 +0000 UTC m=+1490.539419595" watchObservedRunningTime="2026-02-26 16:06:03.332655289 +0000 UTC m=+1490.545253959" Feb 26 16:06:03 crc kubenswrapper[5033]: I0226 16:06:03.744319 
5033 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podb3e4fe96-44fe-4cb0-94b3-f1df3b376c61"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podb3e4fe96-44fe-4cb0-94b3-f1df3b376c61] : Timed out while waiting for systemd to remove kubepods-burstable-podb3e4fe96_44fe_4cb0_94b3_f1df3b376c61.slice" Feb 26 16:06:04 crc kubenswrapper[5033]: I0226 16:06:04.429837 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 16:06:04 crc kubenswrapper[5033]: I0226 16:06:04.430578 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 16:06:04 crc kubenswrapper[5033]: I0226 16:06:04.683888 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535366-kk997" Feb 26 16:06:04 crc kubenswrapper[5033]: I0226 16:06:04.790844 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbcwv\" (UniqueName: \"kubernetes.io/projected/e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f-kube-api-access-sbcwv\") pod \"e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f\" (UID: \"e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f\") " Feb 26 16:06:04 crc kubenswrapper[5033]: I0226 16:06:04.805540 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f-kube-api-access-sbcwv" (OuterVolumeSpecName: "kube-api-access-sbcwv") pod "e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f" (UID: "e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f"). InnerVolumeSpecName "kube-api-access-sbcwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:04 crc kubenswrapper[5033]: I0226 16:06:04.893065 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbcwv\" (UniqueName: \"kubernetes.io/projected/e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f-kube-api-access-sbcwv\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:05 crc kubenswrapper[5033]: I0226 16:06:05.313612 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535366-kk997" event={"ID":"e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f","Type":"ContainerDied","Data":"a39a04422845fe87a2a2defd77d4c323caa288ba7b9fc7f693397df736323972"} Feb 26 16:06:05 crc kubenswrapper[5033]: I0226 16:06:05.313652 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a39a04422845fe87a2a2defd77d4c323caa288ba7b9fc7f693397df736323972" Feb 26 16:06:05 crc kubenswrapper[5033]: I0226 16:06:05.313672 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535366-kk997" Feb 26 16:06:05 crc kubenswrapper[5033]: I0226 16:06:05.380220 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535360-m5lw2"] Feb 26 16:06:05 crc kubenswrapper[5033]: I0226 16:06:05.389527 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535360-m5lw2"] Feb 26 16:06:05 crc kubenswrapper[5033]: I0226 16:06:05.447461 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aeff5347-711b-419e-bc86-126ed1e45187" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 16:06:05 crc kubenswrapper[5033]: I0226 16:06:05.447857 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aeff5347-711b-419e-bc86-126ed1e45187" 
containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 16:06:05 crc kubenswrapper[5033]: I0226 16:06:05.639759 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3350667e-a84c-4d48-8839-aecaa3b75a2a" path="/var/lib/kubelet/pods/3350667e-a84c-4d48-8839-aecaa3b75a2a/volumes" Feb 26 16:06:07 crc kubenswrapper[5033]: I0226 16:06:07.331473 5033 generic.go:334] "Generic (PLEG): container finished" podID="ec28e69b-de68-42a0-bc60-c41a9d87c90e" containerID="ea8b8c6bf7109310286c3358cc85b56b065d8fb055d796d627b747c0847016e4" exitCode=0 Feb 26 16:06:07 crc kubenswrapper[5033]: I0226 16:06:07.331552 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x8mq4" event={"ID":"ec28e69b-de68-42a0-bc60-c41a9d87c90e","Type":"ContainerDied","Data":"ea8b8c6bf7109310286c3358cc85b56b065d8fb055d796d627b747c0847016e4"} Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.706506 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.864433 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4r4m\" (UniqueName: \"kubernetes.io/projected/ec28e69b-de68-42a0-bc60-c41a9d87c90e-kube-api-access-s4r4m\") pod \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.864527 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-scripts\") pod \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.864582 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-config-data\") pod \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.864772 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-combined-ca-bundle\") pod \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\" (UID: \"ec28e69b-de68-42a0-bc60-c41a9d87c90e\") " Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.871157 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-scripts" (OuterVolumeSpecName: "scripts") pod "ec28e69b-de68-42a0-bc60-c41a9d87c90e" (UID: "ec28e69b-de68-42a0-bc60-c41a9d87c90e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.871488 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec28e69b-de68-42a0-bc60-c41a9d87c90e-kube-api-access-s4r4m" (OuterVolumeSpecName: "kube-api-access-s4r4m") pod "ec28e69b-de68-42a0-bc60-c41a9d87c90e" (UID: "ec28e69b-de68-42a0-bc60-c41a9d87c90e"). InnerVolumeSpecName "kube-api-access-s4r4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.894909 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-config-data" (OuterVolumeSpecName: "config-data") pod "ec28e69b-de68-42a0-bc60-c41a9d87c90e" (UID: "ec28e69b-de68-42a0-bc60-c41a9d87c90e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.899959 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec28e69b-de68-42a0-bc60-c41a9d87c90e" (UID: "ec28e69b-de68-42a0-bc60-c41a9d87c90e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.967398 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.967437 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.967448 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4r4m\" (UniqueName: \"kubernetes.io/projected/ec28e69b-de68-42a0-bc60-c41a9d87c90e-kube-api-access-s4r4m\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:08 crc kubenswrapper[5033]: I0226 16:06:08.967458 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec28e69b-de68-42a0-bc60-c41a9d87c90e-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.358238 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x8mq4" event={"ID":"ec28e69b-de68-42a0-bc60-c41a9d87c90e","Type":"ContainerDied","Data":"05f1f427c6f915dc8d4efc377047411755d9214aac58f746fa05b76f3f74f0b0"} Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.358633 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05f1f427c6f915dc8d4efc377047411755d9214aac58f746fa05b76f3f74f0b0" Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.358725 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x8mq4" Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.533313 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.533636 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aeff5347-711b-419e-bc86-126ed1e45187" containerName="nova-api-api" containerID="cri-o://6682204d782cfae0fde5aa67fe8994db1f41675cb1ffbfdf97c4c18d9b1f54ee" gracePeriod=30 Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.533573 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aeff5347-711b-419e-bc86-126ed1e45187" containerName="nova-api-log" containerID="cri-o://c15c13586ec73191c3332ec45d1c0c02fbd4f67cb0bcb3dc8a9cdee81b491f54" gracePeriod=30 Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.549210 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.549486 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4b962555-2817-45b2-88e3-7618e6d7c74b" containerName="nova-scheduler-scheduler" containerID="cri-o://f9f7328cb4211b752ac63a6e537eb7e195ecbfc52e91f427fc9cd85c2ff35cc2" gracePeriod=30 Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.563934 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.564634 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-log" containerID="cri-o://6f81ef7e60c36350746ebf868687502dc916cf71af0c8866c4d79814b51151f7" gracePeriod=30 Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.564413 5033 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-metadata" containerID="cri-o://dc6dc75920b601a897218ec03556e7703f8571b16b9f948992ac68c3a552a5b8" gracePeriod=30 Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.580233 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": EOF" Feb 26 16:06:09 crc kubenswrapper[5033]: I0226 16:06:09.580876 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": EOF" Feb 26 16:06:09 crc kubenswrapper[5033]: E0226 16:06:09.896940 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f9f7328cb4211b752ac63a6e537eb7e195ecbfc52e91f427fc9cd85c2ff35cc2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 16:06:09 crc kubenswrapper[5033]: E0226 16:06:09.898444 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f9f7328cb4211b752ac63a6e537eb7e195ecbfc52e91f427fc9cd85c2ff35cc2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 16:06:09 crc kubenswrapper[5033]: E0226 16:06:09.902515 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="f9f7328cb4211b752ac63a6e537eb7e195ecbfc52e91f427fc9cd85c2ff35cc2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 16:06:09 crc kubenswrapper[5033]: E0226 16:06:09.902546 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4b962555-2817-45b2-88e3-7618e6d7c74b" containerName="nova-scheduler-scheduler" Feb 26 16:06:10 crc kubenswrapper[5033]: I0226 16:06:10.382376 5033 generic.go:334] "Generic (PLEG): container finished" podID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerID="6f81ef7e60c36350746ebf868687502dc916cf71af0c8866c4d79814b51151f7" exitCode=143 Feb 26 16:06:10 crc kubenswrapper[5033]: I0226 16:06:10.382563 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93066a1a-6c0c-4b1e-a257-052a4db51d70","Type":"ContainerDied","Data":"6f81ef7e60c36350746ebf868687502dc916cf71af0c8866c4d79814b51151f7"} Feb 26 16:06:10 crc kubenswrapper[5033]: I0226 16:06:10.385615 5033 generic.go:334] "Generic (PLEG): container finished" podID="aeff5347-711b-419e-bc86-126ed1e45187" containerID="c15c13586ec73191c3332ec45d1c0c02fbd4f67cb0bcb3dc8a9cdee81b491f54" exitCode=143 Feb 26 16:06:10 crc kubenswrapper[5033]: I0226 16:06:10.385653 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aeff5347-711b-419e-bc86-126ed1e45187","Type":"ContainerDied","Data":"c15c13586ec73191c3332ec45d1c0c02fbd4f67cb0bcb3dc8a9cdee81b491f54"} Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.419640 5033 generic.go:334] "Generic (PLEG): container finished" podID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerID="dc6dc75920b601a897218ec03556e7703f8571b16b9f948992ac68c3a552a5b8" exitCode=0 Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.420174 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"93066a1a-6c0c-4b1e-a257-052a4db51d70","Type":"ContainerDied","Data":"dc6dc75920b601a897218ec03556e7703f8571b16b9f948992ac68c3a552a5b8"} Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.420203 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93066a1a-6c0c-4b1e-a257-052a4db51d70","Type":"ContainerDied","Data":"0a129f201fe5ecd7c2d443c221ac7cfab22854720817e9bcf8ccd396c379ec44"} Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.420214 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a129f201fe5ecd7c2d443c221ac7cfab22854720817e9bcf8ccd396c379ec44" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.422328 5033 generic.go:334] "Generic (PLEG): container finished" podID="aeff5347-711b-419e-bc86-126ed1e45187" containerID="6682204d782cfae0fde5aa67fe8994db1f41675cb1ffbfdf97c4c18d9b1f54ee" exitCode=0 Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.422378 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aeff5347-711b-419e-bc86-126ed1e45187","Type":"ContainerDied","Data":"6682204d782cfae0fde5aa67fe8994db1f41675cb1ffbfdf97c4c18d9b1f54ee"} Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.450628 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.537415 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.559945 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-config-data\") pod \"93066a1a-6c0c-4b1e-a257-052a4db51d70\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.560105 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdltr\" (UniqueName: \"kubernetes.io/projected/93066a1a-6c0c-4b1e-a257-052a4db51d70-kube-api-access-tdltr\") pod \"93066a1a-6c0c-4b1e-a257-052a4db51d70\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.563007 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-combined-ca-bundle\") pod \"93066a1a-6c0c-4b1e-a257-052a4db51d70\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.563050 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93066a1a-6c0c-4b1e-a257-052a4db51d70-logs\") pod \"93066a1a-6c0c-4b1e-a257-052a4db51d70\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.563211 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-nova-metadata-tls-certs\") pod \"93066a1a-6c0c-4b1e-a257-052a4db51d70\" (UID: \"93066a1a-6c0c-4b1e-a257-052a4db51d70\") " Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.584397 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/93066a1a-6c0c-4b1e-a257-052a4db51d70-kube-api-access-tdltr" (OuterVolumeSpecName: "kube-api-access-tdltr") pod "93066a1a-6c0c-4b1e-a257-052a4db51d70" (UID: "93066a1a-6c0c-4b1e-a257-052a4db51d70"). InnerVolumeSpecName "kube-api-access-tdltr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.598567 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93066a1a-6c0c-4b1e-a257-052a4db51d70-logs" (OuterVolumeSpecName: "logs") pod "93066a1a-6c0c-4b1e-a257-052a4db51d70" (UID: "93066a1a-6c0c-4b1e-a257-052a4db51d70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.603482 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93066a1a-6c0c-4b1e-a257-052a4db51d70" (UID: "93066a1a-6c0c-4b1e-a257-052a4db51d70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.606205 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-config-data" (OuterVolumeSpecName: "config-data") pod "93066a1a-6c0c-4b1e-a257-052a4db51d70" (UID: "93066a1a-6c0c-4b1e-a257-052a4db51d70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.625918 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "93066a1a-6c0c-4b1e-a257-052a4db51d70" (UID: "93066a1a-6c0c-4b1e-a257-052a4db51d70"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.667320 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-combined-ca-bundle\") pod \"aeff5347-711b-419e-bc86-126ed1e45187\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.668413 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lb62\" (UniqueName: \"kubernetes.io/projected/aeff5347-711b-419e-bc86-126ed1e45187-kube-api-access-2lb62\") pod \"aeff5347-711b-419e-bc86-126ed1e45187\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.668593 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-config-data\") pod \"aeff5347-711b-419e-bc86-126ed1e45187\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.668639 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-internal-tls-certs\") pod \"aeff5347-711b-419e-bc86-126ed1e45187\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.668847 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeff5347-711b-419e-bc86-126ed1e45187-logs\") pod \"aeff5347-711b-419e-bc86-126ed1e45187\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.668944 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-public-tls-certs\") pod \"aeff5347-711b-419e-bc86-126ed1e45187\" (UID: \"aeff5347-711b-419e-bc86-126ed1e45187\") " Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.669550 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeff5347-711b-419e-bc86-126ed1e45187-logs" (OuterVolumeSpecName: "logs") pod "aeff5347-711b-419e-bc86-126ed1e45187" (UID: "aeff5347-711b-419e-bc86-126ed1e45187"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.669869 5033 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.669912 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.669932 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeff5347-711b-419e-bc86-126ed1e45187-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.669944 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdltr\" (UniqueName: \"kubernetes.io/projected/93066a1a-6c0c-4b1e-a257-052a4db51d70-kube-api-access-tdltr\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.669957 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93066a1a-6c0c-4b1e-a257-052a4db51d70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc 
kubenswrapper[5033]: I0226 16:06:13.669981 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93066a1a-6c0c-4b1e-a257-052a4db51d70-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.672166 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeff5347-711b-419e-bc86-126ed1e45187-kube-api-access-2lb62" (OuterVolumeSpecName: "kube-api-access-2lb62") pod "aeff5347-711b-419e-bc86-126ed1e45187" (UID: "aeff5347-711b-419e-bc86-126ed1e45187"). InnerVolumeSpecName "kube-api-access-2lb62". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.698669 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeff5347-711b-419e-bc86-126ed1e45187" (UID: "aeff5347-711b-419e-bc86-126ed1e45187"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.700871 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-config-data" (OuterVolumeSpecName: "config-data") pod "aeff5347-711b-419e-bc86-126ed1e45187" (UID: "aeff5347-711b-419e-bc86-126ed1e45187"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.717656 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "aeff5347-711b-419e-bc86-126ed1e45187" (UID: "aeff5347-711b-419e-bc86-126ed1e45187"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.719517 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "aeff5347-711b-419e-bc86-126ed1e45187" (UID: "aeff5347-711b-419e-bc86-126ed1e45187"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.772757 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.772798 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lb62\" (UniqueName: \"kubernetes.io/projected/aeff5347-711b-419e-bc86-126ed1e45187-kube-api-access-2lb62\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.772812 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.772823 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[5033]: I0226 16:06:13.772832 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeff5347-711b-419e-bc86-126ed1e45187-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.434858 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"aeff5347-711b-419e-bc86-126ed1e45187","Type":"ContainerDied","Data":"4d686e6ffda796368219ef628dd81fed9b6663d4578cbc556e73d69fb2d6c542"} Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.434930 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.435214 5033 scope.go:117] "RemoveContainer" containerID="6682204d782cfae0fde5aa67fe8994db1f41675cb1ffbfdf97c4c18d9b1f54ee" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.439728 5033 generic.go:334] "Generic (PLEG): container finished" podID="4b962555-2817-45b2-88e3-7618e6d7c74b" containerID="f9f7328cb4211b752ac63a6e537eb7e195ecbfc52e91f427fc9cd85c2ff35cc2" exitCode=0 Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.439797 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.439815 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b962555-2817-45b2-88e3-7618e6d7c74b","Type":"ContainerDied","Data":"f9f7328cb4211b752ac63a6e537eb7e195ecbfc52e91f427fc9cd85c2ff35cc2"} Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.475683 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.484017 5033 scope.go:117] "RemoveContainer" containerID="c15c13586ec73191c3332ec45d1c0c02fbd4f67cb0bcb3dc8a9cdee81b491f54" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.495897 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.521616 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.551021 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-0"] Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.558964 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:06:14 crc kubenswrapper[5033]: E0226 16:06:14.559326 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-log" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.559342 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-log" Feb 26 16:06:14 crc kubenswrapper[5033]: E0226 16:06:14.559367 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeff5347-711b-419e-bc86-126ed1e45187" containerName="nova-api-api" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.559374 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeff5347-711b-419e-bc86-126ed1e45187" containerName="nova-api-api" Feb 26 16:06:14 crc kubenswrapper[5033]: E0226 16:06:14.559384 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec28e69b-de68-42a0-bc60-c41a9d87c90e" containerName="nova-manage" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.559390 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec28e69b-de68-42a0-bc60-c41a9d87c90e" containerName="nova-manage" Feb 26 16:06:14 crc kubenswrapper[5033]: E0226 16:06:14.559407 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeff5347-711b-419e-bc86-126ed1e45187" containerName="nova-api-log" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.559413 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeff5347-711b-419e-bc86-126ed1e45187" containerName="nova-api-log" Feb 26 16:06:14 crc kubenswrapper[5033]: E0226 16:06:14.559418 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f" containerName="oc" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 
16:06:14.559423 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f" containerName="oc" Feb 26 16:06:14 crc kubenswrapper[5033]: E0226 16:06:14.559443 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-metadata" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.559449 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-metadata" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.559595 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-metadata" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.559612 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec28e69b-de68-42a0-bc60-c41a9d87c90e" containerName="nova-manage" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.559624 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeff5347-711b-419e-bc86-126ed1e45187" containerName="nova-api-api" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.559645 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f" containerName="oc" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.559658 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" containerName="nova-metadata-log" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.559671 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeff5347-711b-419e-bc86-126ed1e45187" containerName="nova-api-log" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.560535 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.563128 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.563485 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.571865 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.574243 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.578238 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.578522 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.580104 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.581829 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.589950 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.592515 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.687003 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-config-data\") pod \"4b962555-2817-45b2-88e3-7618e6d7c74b\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.687212 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-combined-ca-bundle\") pod \"4b962555-2817-45b2-88e3-7618e6d7c74b\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.687268 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbgxc\" (UniqueName: \"kubernetes.io/projected/4b962555-2817-45b2-88e3-7618e6d7c74b-kube-api-access-fbgxc\") pod \"4b962555-2817-45b2-88e3-7618e6d7c74b\" (UID: \"4b962555-2817-45b2-88e3-7618e6d7c74b\") " Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.687499 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.687550 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-logs\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " 
pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.687574 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9jd7\" (UniqueName: \"kubernetes.io/projected/fb2a5173-ce4c-4d4a-a136-70356adabbb9-kube-api-access-z9jd7\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.687588 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-config-data\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.687610 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frc4t\" (UniqueName: \"kubernetes.io/projected/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-kube-api-access-frc4t\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.687631 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-config-data\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.687872 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.687984 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb2a5173-ce4c-4d4a-a136-70356adabbb9-logs\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.688096 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.688204 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.688359 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.698401 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b962555-2817-45b2-88e3-7618e6d7c74b-kube-api-access-fbgxc" (OuterVolumeSpecName: "kube-api-access-fbgxc") pod "4b962555-2817-45b2-88e3-7618e6d7c74b" (UID: "4b962555-2817-45b2-88e3-7618e6d7c74b"). InnerVolumeSpecName "kube-api-access-fbgxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.715068 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-config-data" (OuterVolumeSpecName: "config-data") pod "4b962555-2817-45b2-88e3-7618e6d7c74b" (UID: "4b962555-2817-45b2-88e3-7618e6d7c74b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.722602 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b962555-2817-45b2-88e3-7618e6d7c74b" (UID: "4b962555-2817-45b2-88e3-7618e6d7c74b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.789579 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-logs\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.789632 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9jd7\" (UniqueName: \"kubernetes.io/projected/fb2a5173-ce4c-4d4a-a136-70356adabbb9-kube-api-access-z9jd7\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.789653 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-config-data\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 
26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.789677 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frc4t\" (UniqueName: \"kubernetes.io/projected/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-kube-api-access-frc4t\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.789701 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-config-data\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.789740 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.789761 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb2a5173-ce4c-4d4a-a136-70356adabbb9-logs\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.789794 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.789826 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.789861 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.789893 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.790036 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbgxc\" (UniqueName: \"kubernetes.io/projected/4b962555-2817-45b2-88e3-7618e6d7c74b-kube-api-access-fbgxc\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.790085 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.790095 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b962555-2817-45b2-88e3-7618e6d7c74b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.790221 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb2a5173-ce4c-4d4a-a136-70356adabbb9-logs\") pod \"nova-api-0\" (UID: 
\"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.790572 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-logs\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.795030 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.795035 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.795084 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-config-data\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.795072 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.795924 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.796467 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.796687 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-config-data\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.807015 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frc4t\" (UniqueName: \"kubernetes.io/projected/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-kube-api-access-frc4t\") pod \"nova-metadata-0\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.808314 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9jd7\" (UniqueName: \"kubernetes.io/projected/fb2a5173-ce4c-4d4a-a136-70356adabbb9-kube-api-access-z9jd7\") pod \"nova-api-0\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " pod="openstack/nova-api-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.905690 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:06:14 crc kubenswrapper[5033]: I0226 16:06:14.916075 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.454540 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:06:15 crc kubenswrapper[5033]: W0226 16:06:15.456318 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea4c6e17_4b51_4387_98c8_d071c1fc7c5a.slice/crio-908ad1a560cc30affda4c6aead0dd22bd4e7d2a13900651cbf3603b49754f852 WatchSource:0}: Error finding container 908ad1a560cc30affda4c6aead0dd22bd4e7d2a13900651cbf3603b49754f852: Status 404 returned error can't find the container with id 908ad1a560cc30affda4c6aead0dd22bd4e7d2a13900651cbf3603b49754f852 Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.458524 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b962555-2817-45b2-88e3-7618e6d7c74b","Type":"ContainerDied","Data":"9628a6d80e2604137f5f4125c117002f5b3a51f039fc9b83f9ead186ab9f9e2e"} Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.458590 5033 scope.go:117] "RemoveContainer" containerID="f9f7328cb4211b752ac63a6e537eb7e195ecbfc52e91f427fc9cd85c2ff35cc2" Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.458753 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.514373 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.526929 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.539946 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:06:15 crc kubenswrapper[5033]: E0226 16:06:15.540613 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b962555-2817-45b2-88e3-7618e6d7c74b" containerName="nova-scheduler-scheduler" Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.540699 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b962555-2817-45b2-88e3-7618e6d7c74b" containerName="nova-scheduler-scheduler" Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.540944 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b962555-2817-45b2-88e3-7618e6d7c74b" containerName="nova-scheduler-scheduler" Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.541723 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.544605 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.550752 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.565502 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.639948 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b962555-2817-45b2-88e3-7618e6d7c74b" path="/var/lib/kubelet/pods/4b962555-2817-45b2-88e3-7618e6d7c74b/volumes"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.641299 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93066a1a-6c0c-4b1e-a257-052a4db51d70" path="/var/lib/kubelet/pods/93066a1a-6c0c-4b1e-a257-052a4db51d70/volumes"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.643542 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeff5347-711b-419e-bc86-126ed1e45187" path="/var/lib/kubelet/pods/aeff5347-711b-419e-bc86-126ed1e45187/volumes"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.710867 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") " pod="openstack/nova-scheduler-0"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.711074 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpf4t\" (UniqueName: \"kubernetes.io/projected/d989fd27-f36c-4503-a400-e09bb4e70398-kube-api-access-vpf4t\") pod \"nova-scheduler-0\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") " pod="openstack/nova-scheduler-0"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.711151 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-config-data\") pod \"nova-scheduler-0\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") " pod="openstack/nova-scheduler-0"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.812980 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") " pod="openstack/nova-scheduler-0"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.813062 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpf4t\" (UniqueName: \"kubernetes.io/projected/d989fd27-f36c-4503-a400-e09bb4e70398-kube-api-access-vpf4t\") pod \"nova-scheduler-0\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") " pod="openstack/nova-scheduler-0"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.813096 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-config-data\") pod \"nova-scheduler-0\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") " pod="openstack/nova-scheduler-0"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.820414 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-config-data\") pod \"nova-scheduler-0\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") " pod="openstack/nova-scheduler-0"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.820421 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") " pod="openstack/nova-scheduler-0"
Feb 26 16:06:15 crc kubenswrapper[5033]: I0226 16:06:15.833575 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpf4t\" (UniqueName: \"kubernetes.io/projected/d989fd27-f36c-4503-a400-e09bb4e70398-kube-api-access-vpf4t\") pod \"nova-scheduler-0\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") " pod="openstack/nova-scheduler-0"
Feb 26 16:06:16 crc kubenswrapper[5033]: I0226 16:06:16.017739 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 16:06:16 crc kubenswrapper[5033]: I0226 16:06:16.452871 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 16:06:16 crc kubenswrapper[5033]: W0226 16:06:16.454688 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd989fd27_f36c_4503_a400_e09bb4e70398.slice/crio-0800dcfb803da44e95f2e664091861b1ee91295b19444a1da50a57c7fc5f4635 WatchSource:0}: Error finding container 0800dcfb803da44e95f2e664091861b1ee91295b19444a1da50a57c7fc5f4635: Status 404 returned error can't find the container with id 0800dcfb803da44e95f2e664091861b1ee91295b19444a1da50a57c7fc5f4635
Feb 26 16:06:16 crc kubenswrapper[5033]: I0226 16:06:16.469103 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d989fd27-f36c-4503-a400-e09bb4e70398","Type":"ContainerStarted","Data":"0800dcfb803da44e95f2e664091861b1ee91295b19444a1da50a57c7fc5f4635"}
Feb 26 16:06:16 crc kubenswrapper[5033]: I0226 16:06:16.475028 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a","Type":"ContainerStarted","Data":"4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166"}
Feb 26 16:06:16 crc kubenswrapper[5033]: I0226 16:06:16.475203 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a","Type":"ContainerStarted","Data":"bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520"}
Feb 26 16:06:16 crc kubenswrapper[5033]: I0226 16:06:16.475340 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a","Type":"ContainerStarted","Data":"908ad1a560cc30affda4c6aead0dd22bd4e7d2a13900651cbf3603b49754f852"}
Feb 26 16:06:16 crc kubenswrapper[5033]: I0226 16:06:16.478800 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb2a5173-ce4c-4d4a-a136-70356adabbb9","Type":"ContainerStarted","Data":"b6eace0b2103a031f250c85a90a9746babaf3bcd0afd8cd89a42175fe2fb3877"}
Feb 26 16:06:16 crc kubenswrapper[5033]: I0226 16:06:16.478929 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb2a5173-ce4c-4d4a-a136-70356adabbb9","Type":"ContainerStarted","Data":"761688171a3882c19e60e75384d7494e2eaf7d87bf59d1ddf87fe813049c4ba5"}
Feb 26 16:06:16 crc kubenswrapper[5033]: I0226 16:06:16.479048 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb2a5173-ce4c-4d4a-a136-70356adabbb9","Type":"ContainerStarted","Data":"8d76e3a5afef1c114c747943f3f2a6468ea7e8c016af89ad72742973d77f7844"}
Feb 26 16:06:16 crc kubenswrapper[5033]: I0226 16:06:16.496483 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.496465255 podStartE2EDuration="2.496465255s" podCreationTimestamp="2026-02-26 16:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:06:16.490342122 +0000 UTC m=+1503.702940772" watchObservedRunningTime="2026-02-26 16:06:16.496465255 +0000 UTC m=+1503.709063905"
Feb 26 16:06:16 crc kubenswrapper[5033]: I0226 16:06:16.516035 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.516019149 podStartE2EDuration="2.516019149s" podCreationTimestamp="2026-02-26 16:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:06:16.513166789 +0000 UTC m=+1503.725765449" watchObservedRunningTime="2026-02-26 16:06:16.516019149 +0000 UTC m=+1503.728617799"
Feb 26 16:06:17 crc kubenswrapper[5033]: E0226 16:06:17.414536 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93066a1a_6c0c_4b1e_a257_052a4db51d70.slice\": RecentStats: unable to find data in memory cache]"
Feb 26 16:06:17 crc kubenswrapper[5033]: I0226 16:06:17.489456 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d989fd27-f36c-4503-a400-e09bb4e70398","Type":"ContainerStarted","Data":"8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5"}
Feb 26 16:06:17 crc kubenswrapper[5033]: I0226 16:06:17.508220 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.508201029 podStartE2EDuration="2.508201029s" podCreationTimestamp="2026-02-26 16:06:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:06:17.503318219 +0000 UTC m=+1504.715916879" watchObservedRunningTime="2026-02-26 16:06:17.508201029 +0000 UTC m=+1504.720799679"
Feb 26 16:06:19 crc kubenswrapper[5033]: I0226 16:06:19.906317 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 26 16:06:19 crc kubenswrapper[5033]: I0226 16:06:19.906371 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 26 16:06:21 crc kubenswrapper[5033]: I0226 16:06:21.018652 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 26 16:06:24 crc kubenswrapper[5033]: I0226 16:06:24.906968 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 26 16:06:24 crc kubenswrapper[5033]: I0226 16:06:24.907063 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 26 16:06:24 crc kubenswrapper[5033]: I0226 16:06:24.917544 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 26 16:06:24 crc kubenswrapper[5033]: I0226 16:06:24.917662 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 26 16:06:25 crc kubenswrapper[5033]: I0226 16:06:25.921444 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 26 16:06:25 crc kubenswrapper[5033]: I0226 16:06:25.921484 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 26 16:06:25 crc kubenswrapper[5033]: I0226 16:06:25.940529 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 26 16:06:25 crc kubenswrapper[5033]: I0226 16:06:25.940565 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 26 16:06:26 crc kubenswrapper[5033]: I0226 16:06:26.017978 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 26 16:06:26 crc kubenswrapper[5033]: I0226 16:06:26.045270 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 26 16:06:26 crc kubenswrapper[5033]: I0226 16:06:26.575282 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 26 16:06:26 crc kubenswrapper[5033]: I0226 16:06:26.654559 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 26 16:06:27 crc kubenswrapper[5033]: I0226 16:06:27.012197 5033 scope.go:117] "RemoveContainer" containerID="45897f3d33a2eee6754b7aa0b054d288b06ca6f2b6ac01bafc31387f6118566d"
Feb 26 16:06:27 crc kubenswrapper[5033]: E0226 16:06:27.644798 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93066a1a_6c0c_4b1e_a257_052a4db51d70.slice\": RecentStats: unable to find data in memory cache]"
Feb 26 16:06:34 crc kubenswrapper[5033]: I0226 16:06:34.910428 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 26 16:06:34 crc kubenswrapper[5033]: I0226 16:06:34.911078 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 26 16:06:34 crc kubenswrapper[5033]: I0226 16:06:34.916541 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 26 16:06:34 crc kubenswrapper[5033]: I0226 16:06:34.921458 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 26 16:06:34 crc kubenswrapper[5033]: I0226 16:06:34.921759 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 26 16:06:34 crc kubenswrapper[5033]: I0226 16:06:34.922666 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 26 16:06:34 crc kubenswrapper[5033]: I0226 16:06:34.923067 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 26 16:06:34 crc kubenswrapper[5033]: I0226 16:06:34.935797 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 26 16:06:35 crc kubenswrapper[5033]: I0226 16:06:35.676249 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 26 16:06:35 crc kubenswrapper[5033]: I0226 16:06:35.706594 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 26 16:06:37 crc kubenswrapper[5033]: E0226 16:06:37.878440 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93066a1a_6c0c_4b1e_a257_052a4db51d70.slice\": RecentStats: unable to find data in memory cache]"
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.403613 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-25q4r"]
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.406666 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.426580 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25q4r"]
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.505335 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8n64\" (UniqueName: \"kubernetes.io/projected/a4dc26b2-5d4e-4d28-af34-67307512a4cd-kube-api-access-s8n64\") pod \"community-operators-25q4r\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.505751 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-catalog-content\") pod \"community-operators-25q4r\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.505803 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-utilities\") pod \"community-operators-25q4r\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.607969 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-catalog-content\") pod \"community-operators-25q4r\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.608100 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-utilities\") pod \"community-operators-25q4r\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.608270 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8n64\" (UniqueName: \"kubernetes.io/projected/a4dc26b2-5d4e-4d28-af34-67307512a4cd-kube-api-access-s8n64\") pod \"community-operators-25q4r\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.608672 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-catalog-content\") pod \"community-operators-25q4r\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.608728 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-utilities\") pod \"community-operators-25q4r\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.627381 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8n64\" (UniqueName: \"kubernetes.io/projected/a4dc26b2-5d4e-4d28-af34-67307512a4cd-kube-api-access-s8n64\") pod \"community-operators-25q4r\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:43 crc kubenswrapper[5033]: I0226 16:06:43.726775 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:44 crc kubenswrapper[5033]: I0226 16:06:44.242146 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25q4r"]
Feb 26 16:06:44 crc kubenswrapper[5033]: I0226 16:06:44.759873 5033 generic.go:334] "Generic (PLEG): container finished" podID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" containerID="1958b4cdd155f234eab178140592d44cbf4eeb304eaea0b1e9bbaca583aeb3ef" exitCode=0
Feb 26 16:06:44 crc kubenswrapper[5033]: I0226 16:06:44.759944 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25q4r" event={"ID":"a4dc26b2-5d4e-4d28-af34-67307512a4cd","Type":"ContainerDied","Data":"1958b4cdd155f234eab178140592d44cbf4eeb304eaea0b1e9bbaca583aeb3ef"}
Feb 26 16:06:44 crc kubenswrapper[5033]: I0226 16:06:44.760336 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25q4r" event={"ID":"a4dc26b2-5d4e-4d28-af34-67307512a4cd","Type":"ContainerStarted","Data":"8f4b838f3609b6a475bf6418fba140844bc4de677dbe662bfb096a849d92ae66"}
Feb 26 16:06:45 crc kubenswrapper[5033]: I0226 16:06:45.771635 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25q4r" event={"ID":"a4dc26b2-5d4e-4d28-af34-67307512a4cd","Type":"ContainerStarted","Data":"5a4e87ec9aaac70ec455a5b5d1fdd7b241dbfd64106d9d905d511b4441d4a558"}
Feb 26 16:06:46 crc kubenswrapper[5033]: I0226 16:06:46.791297 5033 generic.go:334] "Generic (PLEG): container finished" podID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" containerID="5a4e87ec9aaac70ec455a5b5d1fdd7b241dbfd64106d9d905d511b4441d4a558" exitCode=0
Feb 26 16:06:46 crc kubenswrapper[5033]: I0226 16:06:46.791845 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25q4r" event={"ID":"a4dc26b2-5d4e-4d28-af34-67307512a4cd","Type":"ContainerDied","Data":"5a4e87ec9aaac70ec455a5b5d1fdd7b241dbfd64106d9d905d511b4441d4a558"}
Feb 26 16:06:47 crc kubenswrapper[5033]: I0226 16:06:47.806784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25q4r" event={"ID":"a4dc26b2-5d4e-4d28-af34-67307512a4cd","Type":"ContainerStarted","Data":"fb66e2834db2c23f74d5cee1c9b1bd88ffb6e98af3223011fc7d948fdd813f2f"}
Feb 26 16:06:47 crc kubenswrapper[5033]: I0226 16:06:47.843808 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-25q4r" podStartSLOduration=2.362494687 podStartE2EDuration="4.843785529s" podCreationTimestamp="2026-02-26 16:06:43 +0000 UTC" firstStartedPulling="2026-02-26 16:06:44.762587155 +0000 UTC m=+1531.975185805" lastFinishedPulling="2026-02-26 16:06:47.243877977 +0000 UTC m=+1534.456476647" observedRunningTime="2026-02-26 16:06:47.822911583 +0000 UTC m=+1535.035510303" watchObservedRunningTime="2026-02-26 16:06:47.843785529 +0000 UTC m=+1535.056384179"
Feb 26 16:06:48 crc kubenswrapper[5033]: E0226 16:06:48.138887 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93066a1a_6c0c_4b1e_a257_052a4db51d70.slice\": RecentStats: unable to find data in memory cache]"
Feb 26 16:06:50 crc kubenswrapper[5033]: I0226 16:06:50.198453 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:06:50 crc kubenswrapper[5033]: I0226 16:06:50.198882 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:06:53 crc kubenswrapper[5033]: I0226 16:06:53.727704 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:53 crc kubenswrapper[5033]: I0226 16:06:53.728023 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:53 crc kubenswrapper[5033]: I0226 16:06:53.768618 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:53 crc kubenswrapper[5033]: I0226 16:06:53.905250 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-25q4r"
Feb 26 16:06:54 crc kubenswrapper[5033]: I0226 16:06:54.002965 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25q4r"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.493757 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9702-account-create-update-89h9d"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.494988 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9702-account-create-update-89h9d"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.507696 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.542777 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9702-account-create-update-89h9d"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.580025 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9702-account-create-update-rfpn9"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.594839 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9702-account-create-update-rfpn9"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.603843 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.604140 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="c0f0a198-dc99-4384-8ab0-b978bf9275b5" containerName="openstackclient" containerID="cri-o://f4a8a899d4825519a257a55cdd9c58e86c24aafa460378cd50a7461c75a3be78" gracePeriod=2
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.620343 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.642297 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6" path="/var/lib/kubelet/pods/a5ec00a5-fb3e-4c6f-8d99-8f8df43e6ac6/volumes"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.657360 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-959s5"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.662669 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96dks\" (UniqueName: \"kubernetes.io/projected/907a1311-765c-4cd1-8e92-ed545e59c736-kube-api-access-96dks\") pod \"barbican-9702-account-create-update-89h9d\" (UID: \"907a1311-765c-4cd1-8e92-ed545e59c736\") " pod="openstack/barbican-9702-account-create-update-89h9d"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.662806 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/907a1311-765c-4cd1-8e92-ed545e59c736-operator-scripts\") pod \"barbican-9702-account-create-update-89h9d\" (UID: \"907a1311-765c-4cd1-8e92-ed545e59c736\") " pod="openstack/barbican-9702-account-create-update-89h9d"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.668926 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-959s5"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.707701 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-czhv8"]
Feb 26 16:06:55 crc kubenswrapper[5033]: E0226 16:06:55.714390 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f0a198-dc99-4384-8ab0-b978bf9275b5" containerName="openstackclient"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.714423 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f0a198-dc99-4384-8ab0-b978bf9275b5" containerName="openstackclient"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.714659 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f0a198-dc99-4384-8ab0-b978bf9275b5" containerName="openstackclient"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.715621 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-czhv8"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.729658 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.737615 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-czhv8"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.768429 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96dks\" (UniqueName: \"kubernetes.io/projected/907a1311-765c-4cd1-8e92-ed545e59c736-kube-api-access-96dks\") pod \"barbican-9702-account-create-update-89h9d\" (UID: \"907a1311-765c-4cd1-8e92-ed545e59c736\") " pod="openstack/barbican-9702-account-create-update-89h9d"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.768562 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/907a1311-765c-4cd1-8e92-ed545e59c736-operator-scripts\") pod \"barbican-9702-account-create-update-89h9d\" (UID: \"907a1311-765c-4cd1-8e92-ed545e59c736\") " pod="openstack/barbican-9702-account-create-update-89h9d"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.770991 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/907a1311-765c-4cd1-8e92-ed545e59c736-operator-scripts\") pod \"barbican-9702-account-create-update-89h9d\" (UID: \"907a1311-765c-4cd1-8e92-ed545e59c736\") " pod="openstack/barbican-9702-account-create-update-89h9d"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.821234 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96dks\" (UniqueName: \"kubernetes.io/projected/907a1311-765c-4cd1-8e92-ed545e59c736-kube-api-access-96dks\") pod \"barbican-9702-account-create-update-89h9d\" (UID: \"907a1311-765c-4cd1-8e92-ed545e59c736\") " pod="openstack/barbican-9702-account-create-update-89h9d"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.873006 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fdf9-account-create-update-lshpf"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.874518 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fdf9-account-create-update-lshpf"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.875035 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts\") pod \"root-account-create-update-czhv8\" (UID: \"bd95e3d9-8fc5-4f14-a009-cd1749743679\") " pod="openstack/root-account-create-update-czhv8"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.875104 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpcrc\" (UniqueName: \"kubernetes.io/projected/bd95e3d9-8fc5-4f14-a009-cd1749743679-kube-api-access-bpcrc\") pod \"root-account-create-update-czhv8\" (UID: \"bd95e3d9-8fc5-4f14-a009-cd1749743679\") " pod="openstack/root-account-create-update-czhv8"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.887185 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.896861 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-25q4r" podUID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" containerName="registry-server" containerID="cri-o://fb66e2834db2c23f74d5cee1c9b1bd88ffb6e98af3223011fc7d948fdd813f2f" gracePeriod=2
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.903312 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3699-account-create-update-9vvpr"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.904531 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3699-account-create-update-9vvpr"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.907504 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.931327 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3699-account-create-update-4dzzm"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.970518 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3699-account-create-update-4dzzm"]
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.977676 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpcrc\" (UniqueName: \"kubernetes.io/projected/bd95e3d9-8fc5-4f14-a009-cd1749743679-kube-api-access-bpcrc\") pod \"root-account-create-update-czhv8\" (UID: \"bd95e3d9-8fc5-4f14-a009-cd1749743679\") " pod="openstack/root-account-create-update-czhv8"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.977844 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs9bq\" (UniqueName: \"kubernetes.io/projected/540a469d-81f3-40f5-9826-27384a03ce2d-kube-api-access-bs9bq\") pod \"placement-fdf9-account-create-update-lshpf\" (UID: \"540a469d-81f3-40f5-9826-27384a03ce2d\") " pod="openstack/placement-fdf9-account-create-update-lshpf"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.978012 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/540a469d-81f3-40f5-9826-27384a03ce2d-operator-scripts\") pod \"placement-fdf9-account-create-update-lshpf\" (UID: \"540a469d-81f3-40f5-9826-27384a03ce2d\") " pod="openstack/placement-fdf9-account-create-update-lshpf"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.978044 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts\") pod \"root-account-create-update-czhv8\" (UID: \"bd95e3d9-8fc5-4f14-a009-cd1749743679\") " pod="openstack/root-account-create-update-czhv8"
Feb 26 16:06:55 crc kubenswrapper[5033]: I0226 16:06:55.987790 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts\") pod \"root-account-create-update-czhv8\" (UID: \"bd95e3d9-8fc5-4f14-a009-cd1749743679\") " pod="openstack/root-account-create-update-czhv8"
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.002867 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpcrc\" (UniqueName: \"kubernetes.io/projected/bd95e3d9-8fc5-4f14-a009-cd1749743679-kube-api-access-bpcrc\") pod \"root-account-create-update-czhv8\" (UID: \"bd95e3d9-8fc5-4f14-a009-cd1749743679\") " pod="openstack/root-account-create-update-czhv8"
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.002927 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fdf9-account-create-update-lshpf"]
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.015166 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.049987 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3699-account-create-update-9vvpr"]
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.050920 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-czhv8"
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.079611 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs9bq\" (UniqueName: \"kubernetes.io/projected/540a469d-81f3-40f5-9826-27384a03ce2d-kube-api-access-bs9bq\") pod \"placement-fdf9-account-create-update-lshpf\" (UID: \"540a469d-81f3-40f5-9826-27384a03ce2d\") " pod="openstack/placement-fdf9-account-create-update-lshpf"
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.079690 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8b6t\" (UniqueName: \"kubernetes.io/projected/c13ba380-e3f6-4322-a434-7b69da9e4e3f-kube-api-access-t8b6t\") pod \"glance-3699-account-create-update-9vvpr\" (UID: \"c13ba380-e3f6-4322-a434-7b69da9e4e3f\") " pod="openstack/glance-3699-account-create-update-9vvpr"
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.079751 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/540a469d-81f3-40f5-9826-27384a03ce2d-operator-scripts\") pod \"placement-fdf9-account-create-update-lshpf\" (UID: \"540a469d-81f3-40f5-9826-27384a03ce2d\") " pod="openstack/placement-fdf9-account-create-update-lshpf"
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.079808 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13ba380-e3f6-4322-a434-7b69da9e4e3f-operator-scripts\") pod \"glance-3699-account-create-update-9vvpr\" (UID: \"c13ba380-e3f6-4322-a434-7b69da9e4e3f\") " pod="openstack/glance-3699-account-create-update-9vvpr"
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.080773 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/540a469d-81f3-40f5-9826-27384a03ce2d-operator-scripts\") pod \"placement-fdf9-account-create-update-lshpf\" (UID: \"540a469d-81f3-40f5-9826-27384a03ce2d\") " pod="openstack/placement-fdf9-account-create-update-lshpf"
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.128489 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9702-account-create-update-89h9d"
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.166533 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fdf9-account-create-update-hcvt7"]
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.178456 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fdf9-account-create-update-hcvt7"]
Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.182329 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8b6t\" (UniqueName: \"kubernetes.io/projected/c13ba380-e3f6-4322-a434-7b69da9e4e3f-kube-api-access-t8b6t\") pod \"glance-3699-account-create-update-9vvpr\" (UID: \"c13ba380-e3f6-4322-a434-7b69da9e4e3f\") " pod="openstack/glance-3699-account-create-update-9vvpr"
Feb 26 16:06:56 crc kubenswrapper[5033]: E0226 16:06:56.182401 5033 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Feb 26 16:06:56 crc kubenswrapper[5033]: E0226 16:06:56.182490 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data podName:443e562a-aefe-46f3-a86f-59205f5849c7 nodeName:}" failed. No retries permitted until 2026-02-26 16:06:56.682460426 +0000 UTC m=+1543.895059076 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data") pod "rabbitmq-server-0" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7") : configmap "rabbitmq-config-data" not found Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.182686 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13ba380-e3f6-4322-a434-7b69da9e4e3f-operator-scripts\") pod \"glance-3699-account-create-update-9vvpr\" (UID: \"c13ba380-e3f6-4322-a434-7b69da9e4e3f\") " pod="openstack/glance-3699-account-create-update-9vvpr" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.186480 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs9bq\" (UniqueName: \"kubernetes.io/projected/540a469d-81f3-40f5-9826-27384a03ce2d-kube-api-access-bs9bq\") pod \"placement-fdf9-account-create-update-lshpf\" (UID: \"540a469d-81f3-40f5-9826-27384a03ce2d\") " pod="openstack/placement-fdf9-account-create-update-lshpf" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.200197 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6a31-account-create-update-gbdhh"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.208831 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13ba380-e3f6-4322-a434-7b69da9e4e3f-operator-scripts\") pod \"glance-3699-account-create-update-9vvpr\" (UID: \"c13ba380-e3f6-4322-a434-7b69da9e4e3f\") " pod="openstack/glance-3699-account-create-update-9vvpr" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.218823 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6a31-account-create-update-gbdhh" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.236350 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fdf9-account-create-update-lshpf" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.296611 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.297440 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.297704 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="00a5a420-8682-4780-8349-3da44c8d7f1a" containerName="ovn-northd" containerID="cri-o://65abe18a8fd80fc9112bf0561b66f917755c231568bc655d18e082d960532c5a" gracePeriod=30 Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.298003 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="00a5a420-8682-4780-8349-3da44c8d7f1a" containerName="openstack-network-exporter" containerID="cri-o://0781d5f97d7bdfedb1b4ca988b8219b988b2353f19930fed09bd026d7a0f175c" gracePeriod=30 Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.300086 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8b6t\" (UniqueName: \"kubernetes.io/projected/c13ba380-e3f6-4322-a434-7b69da9e4e3f-kube-api-access-t8b6t\") pod \"glance-3699-account-create-update-9vvpr\" (UID: \"c13ba380-e3f6-4322-a434-7b69da9e4e3f\") " pod="openstack/glance-3699-account-create-update-9vvpr" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.353327 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-bb44-account-create-update-xkl5l"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.384503 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6a31-account-create-update-gbdhh"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.410418 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-bb44-account-create-update-xkl5l"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.438359 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90168acb-f862-4f73-bb8a-dd5b308977e1-operator-scripts\") pod \"neutron-6a31-account-create-update-gbdhh\" (UID: \"90168acb-f862-4f73-bb8a-dd5b308977e1\") " pod="openstack/neutron-6a31-account-create-update-gbdhh" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.438427 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfn2j\" (UniqueName: \"kubernetes.io/projected/90168acb-f862-4f73-bb8a-dd5b308977e1-kube-api-access-jfn2j\") pod \"neutron-6a31-account-create-update-gbdhh\" (UID: \"90168acb-f862-4f73-bb8a-dd5b308977e1\") " pod="openstack/neutron-6a31-account-create-update-gbdhh" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.482153 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.482990 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" containerName="openstack-network-exporter" containerID="cri-o://7d15896c86ef74e743d6d11de6e7eb42b79dd557583bbade3356774639583f07" gracePeriod=300 Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.513326 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6a31-account-create-update-5x6jh"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.522596 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6a31-account-create-update-5x6jh"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.540749 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/90168acb-f862-4f73-bb8a-dd5b308977e1-operator-scripts\") pod \"neutron-6a31-account-create-update-gbdhh\" (UID: \"90168acb-f862-4f73-bb8a-dd5b308977e1\") " pod="openstack/neutron-6a31-account-create-update-gbdhh" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.540842 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfn2j\" (UniqueName: \"kubernetes.io/projected/90168acb-f862-4f73-bb8a-dd5b308977e1-kube-api-access-jfn2j\") pod \"neutron-6a31-account-create-update-gbdhh\" (UID: \"90168acb-f862-4f73-bb8a-dd5b308977e1\") " pod="openstack/neutron-6a31-account-create-update-gbdhh" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.541050 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3699-account-create-update-9vvpr" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.542316 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90168acb-f862-4f73-bb8a-dd5b308977e1-operator-scripts\") pod \"neutron-6a31-account-create-update-gbdhh\" (UID: \"90168acb-f862-4f73-bb8a-dd5b308977e1\") " pod="openstack/neutron-6a31-account-create-update-gbdhh" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.555096 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-kptqh"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.612312 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-kptqh"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.615339 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-lkf9s"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.633159 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfn2j\" (UniqueName: 
\"kubernetes.io/projected/90168acb-f862-4f73-bb8a-dd5b308977e1-kube-api-access-jfn2j\") pod \"neutron-6a31-account-create-update-gbdhh\" (UID: \"90168acb-f862-4f73-bb8a-dd5b308977e1\") " pod="openstack/neutron-6a31-account-create-update-gbdhh" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.654102 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9jqk7"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.678732 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-lkf9s"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.706311 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6a31-account-create-update-gbdhh" Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.737784 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9jqk7"] Feb 26 16:06:56 crc kubenswrapper[5033]: E0226 16:06:56.744600 5033 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 26 16:06:56 crc kubenswrapper[5033]: E0226 16:06:56.744667 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data podName:443e562a-aefe-46f3-a86f-59205f5849c7 nodeName:}" failed. No retries permitted until 2026-02-26 16:06:57.744653955 +0000 UTC m=+1544.957252605 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data") pod "rabbitmq-server-0" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7") : configmap "rabbitmq-config-data" not found Feb 26 16:06:56 crc kubenswrapper[5033]: E0226 16:06:56.745899 5033 configmap.go:193] Couldn't get configMap openstack/ovndbcluster-nb-scripts: configmap "ovndbcluster-nb-scripts" not found Feb 26 16:06:56 crc kubenswrapper[5033]: E0226 16:06:56.745930 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts podName:28081635-21dd-4dbf-ae6a-354bbad0dbdd nodeName:}" failed. No retries permitted until 2026-02-26 16:06:57.245922456 +0000 UTC m=+1544.458521106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts") pod "ovsdbserver-nb-0" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd") : configmap "ovndbcluster-nb-scripts" not found Feb 26 16:06:56 crc kubenswrapper[5033]: E0226 16:06:56.748320 5033 configmap.go:193] Couldn't get configMap openstack/ovndbcluster-nb-config: configmap "ovndbcluster-nb-config" not found Feb 26 16:06:56 crc kubenswrapper[5033]: E0226 16:06:56.748361 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config podName:28081635-21dd-4dbf-ae6a-354bbad0dbdd nodeName:}" failed. No retries permitted until 2026-02-26 16:06:57.248350725 +0000 UTC m=+1544.460949375 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config") pod "ovsdbserver-nb-0" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd") : configmap "ovndbcluster-nb-config" not found Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.908787 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.958207 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-041e-account-create-update-6zjnq"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.960894 5033 generic.go:334] "Generic (PLEG): container finished" podID="00a5a420-8682-4780-8349-3da44c8d7f1a" containerID="0781d5f97d7bdfedb1b4ca988b8219b988b2353f19930fed09bd026d7a0f175c" exitCode=2 Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.960977 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00a5a420-8682-4780-8349-3da44c8d7f1a","Type":"ContainerDied","Data":"0781d5f97d7bdfedb1b4ca988b8219b988b2353f19930fed09bd026d7a0f175c"} Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.980160 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-hb64j"] Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.990913 5033 generic.go:334] "Generic (PLEG): container finished" podID="de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" containerID="7d15896c86ef74e743d6d11de6e7eb42b79dd557583bbade3356774639583f07" exitCode=2 Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.991020 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef","Type":"ContainerDied","Data":"7d15896c86ef74e743d6d11de6e7eb42b79dd557583bbade3356774639583f07"} Feb 26 16:06:56 crc kubenswrapper[5033]: I0226 16:06:56.994136 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-43b7-account-create-update-gbdhs"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.011287 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-43b7-account-create-update-gbdhs"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.039414 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-hb64j"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.053447 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-041e-account-create-update-6zjnq"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.073778 5033 generic.go:334] "Generic (PLEG): container finished" podID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" containerID="fb66e2834db2c23f74d5cee1c9b1bd88ffb6e98af3223011fc7d948fdd813f2f" exitCode=0 Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.074128 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25q4r" event={"ID":"a4dc26b2-5d4e-4d28-af34-67307512a4cd","Type":"ContainerDied","Data":"fb66e2834db2c23f74d5cee1c9b1bd88ffb6e98af3223011fc7d948fdd813f2f"} Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.074569 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" containerName="openstack-network-exporter" containerID="cri-o://806513972f7ac7f35e48dccd32d3d8681e9469c01f754dd9f30afc6dcbc44290" gracePeriod=300 Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.105321 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.157003 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9bknn"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.202411 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-56c7-account-create-update-dqbl5"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.244205 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-56c7-account-create-update-dqbl5"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.265137 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" containerName="ovsdbserver-sb" containerID="cri-o://6a0a05a28e34cceca86396fc370a3c3c2d613869b745809e63d4581ad8187860" gracePeriod=300 Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.265373 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9bknn"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.273726 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-84k4b"] Feb 26 16:06:57 crc kubenswrapper[5033]: E0226 16:06:57.276298 5033 configmap.go:193] Couldn't get configMap openstack/ovndbcluster-nb-config: configmap "ovndbcluster-nb-config" not found Feb 26 16:06:57 crc kubenswrapper[5033]: E0226 16:06:57.276379 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config podName:28081635-21dd-4dbf-ae6a-354bbad0dbdd nodeName:}" failed. No retries permitted until 2026-02-26 16:06:58.276348046 +0000 UTC m=+1545.488946696 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config") pod "ovsdbserver-nb-0" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd") : configmap "ovndbcluster-nb-config" not found Feb 26 16:06:57 crc kubenswrapper[5033]: E0226 16:06:57.276428 5033 configmap.go:193] Couldn't get configMap openstack/ovndbcluster-nb-scripts: configmap "ovndbcluster-nb-scripts" not found Feb 26 16:06:57 crc kubenswrapper[5033]: E0226 16:06:57.276451 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts podName:28081635-21dd-4dbf-ae6a-354bbad0dbdd nodeName:}" failed. No retries permitted until 2026-02-26 16:06:58.276443308 +0000 UTC m=+1545.489042048 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts") pod "ovsdbserver-nb-0" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd") : configmap "ovndbcluster-nb-scripts" not found Feb 26 16:06:57 crc kubenswrapper[5033]: E0226 16:06:57.277976 5033 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 26 16:06:57 crc kubenswrapper[5033]: E0226 16:06:57.278055 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data podName:94562cbd-1b98-4b30-9a97-f26fb103bb72 nodeName:}" failed. No retries permitted until 2026-02-26 16:06:57.778035669 +0000 UTC m=+1544.990634389 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data") pod "rabbitmq-cell1-server-0" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72") : configmap "rabbitmq-cell1-config-data" not found Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.285907 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" containerName="ovsdbserver-nb" containerID="cri-o://014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323" gracePeriod=300 Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.304390 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-9ts2z"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.304671 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-9ts2z" podUID="45a330af-c614-4321-8534-060b7e5fb93f" containerName="openstack-network-exporter" containerID="cri-o://f0481a06ae138ce8c43ec40c4c5ac9624c819724a7da33627fecd75b0cbe6925" gracePeriod=30 Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.316588 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fxrwp"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.339504 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.339966 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="801b1613-3ed1-423d-85cf-6b87f6711034" containerName="cinder-scheduler" containerID="cri-o://594bc92b21cdaf12c49af3fc339f3186f938691726d83e69d1538a692373df92" gracePeriod=30 Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.340650 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" 
podUID="801b1613-3ed1-423d-85cf-6b87f6711034" containerName="probe" containerID="cri-o://81b00d0b1c13dc1cda83736d47b6fac79d1887078858c3ef5eaedcb3b16e0883" gracePeriod=30 Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.588320 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5gx5"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.710607 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dac4b71-5c96-482d-b6d2-656990c1d873" path="/var/lib/kubelet/pods/0dac4b71-5c96-482d-b6d2-656990c1d873/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.711239 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="143852fb-6ab2-485f-bc1f-e269f407f221" path="/var/lib/kubelet/pods/143852fb-6ab2-485f-bc1f-e269f407f221/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.711779 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33d9c0d5-6eff-4216-a63e-cdc21eaacf42" path="/var/lib/kubelet/pods/33d9c0d5-6eff-4216-a63e-cdc21eaacf42/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.721623 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40664161-2e8b-4145-8b7a-5ec3723150bf" path="/var/lib/kubelet/pods/40664161-2e8b-4145-8b7a-5ec3723150bf/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.722591 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6410a07e-efb2-4773-8007-688038a9898e" path="/var/lib/kubelet/pods/6410a07e-efb2-4773-8007-688038a9898e/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.733427 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d" path="/var/lib/kubelet/pods/6d1d3eb9-d4fb-45e6-a12f-2a4823dee95d/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.736660 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8e6df4ba-4af0-46df-a43a-ff8737d7de9c" path="/var/lib/kubelet/pods/8e6df4ba-4af0-46df-a43a-ff8737d7de9c/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.738238 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0528d63-bfdb-4036-af0f-4900569f04c8" path="/var/lib/kubelet/pods/a0528d63-bfdb-4036-af0f-4900569f04c8/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: E0226 16:06:57.749732 5033 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 26 16:06:57 crc kubenswrapper[5033]: E0226 16:06:57.749794 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data podName:443e562a-aefe-46f3-a86f-59205f5849c7 nodeName:}" failed. No retries permitted until 2026-02-26 16:06:59.749778725 +0000 UTC m=+1546.962377375 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data") pod "rabbitmq-server-0" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7") : configmap "rabbitmq-config-data" not found Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.761492 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1a1bbea-b471-4f30-abe1-de75f9641b24" path="/var/lib/kubelet/pods/b1a1bbea-b471-4f30-abe1-de75f9641b24/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.762159 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc977b82-241c-4c92-9755-877433f3e4de" path="/var/lib/kubelet/pods/bc977b82-241c-4c92-9755-877433f3e4de/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.762836 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf479687-fa77-4c8b-95ee-23b30c858b35" path="/var/lib/kubelet/pods/cf479687-fa77-4c8b-95ee-23b30c858b35/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 
16:06:57.776581 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded3d8f6-4b47-4d1d-af78-b295f7b729a7" path="/var/lib/kubelet/pods/ded3d8f6-4b47-4d1d-af78-b295f7b729a7/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.777408 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5" path="/var/lib/kubelet/pods/ff9e7793-b9b3-4a3a-8b74-4c50b3c2c9a5/volumes" Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.778900 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5gx5"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.815714 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-x8mq4"] Feb 26 16:06:57 crc kubenswrapper[5033]: E0226 16:06:57.853248 5033 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 26 16:06:57 crc kubenswrapper[5033]: E0226 16:06:57.853325 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data podName:94562cbd-1b98-4b30-9a97-f26fb103bb72 nodeName:}" failed. No retries permitted until 2026-02-26 16:06:58.85331153 +0000 UTC m=+1546.065910170 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data") pod "rabbitmq-cell1-server-0" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72") : configmap "rabbitmq-cell1-config-data" not found Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.910123 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-x8mq4"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.940673 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.940982 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" containerName="cinder-api-log" containerID="cri-o://aa1ee9867435cb326cccdc9dafbd435a111a0e341d91a8b67afb8b92f8d1ca85" gracePeriod=30 Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.941116 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" containerName="cinder-api" containerID="cri-o://063b18146099b5100ca56d54d72414c86d85a242467f6339a74297d196991539" gracePeriod=30 Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.988322 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6d648bbbdd-g6xz5"] Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.988671 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6d648bbbdd-g6xz5" podUID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerName="placement-log" containerID="cri-o://e957c4c3c0035ebbddff5e5a96c325ad85985b3bd7c2e72eaa4ed32e100f9146" gracePeriod=30 Feb 26 16:06:57 crc kubenswrapper[5033]: I0226 16:06:57.989154 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6d648bbbdd-g6xz5" 
podUID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerName="placement-api" containerID="cri-o://37b6556400506b1236e7114115d1a4566edf6e5a76d32b1374c1c673791271cf" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.016590 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25q4r" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.034336 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.034632 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bf0b01b9-0085-4552-8a7d-1e7359dad79b" containerName="glance-log" containerID="cri-o://a46ddf9c39fa3f9620a8c8a0c8752e122ad6d05de939c29de0a166524be100e8" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.034815 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bf0b01b9-0085-4552-8a7d-1e7359dad79b" containerName="glance-httpd" containerID="cri-o://e00abe8f8586a8c6516438836f22e8cd867774bd04666cb5631a9962be629702" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.059931 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323 is running failed: container process not found" containerID="014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.067705 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-utilities\") pod 
\"a4dc26b2-5d4e-4d28-af34-67307512a4cd\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.067738 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-catalog-content\") pod \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.067963 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8n64\" (UniqueName: \"kubernetes.io/projected/a4dc26b2-5d4e-4d28-af34-67307512a4cd-kube-api-access-s8n64\") pod \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\" (UID: \"a4dc26b2-5d4e-4d28-af34-67307512a4cd\") " Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.068446 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6kvjv"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.069894 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-utilities" (OuterVolumeSpecName: "utilities") pod "a4dc26b2-5d4e-4d28-af34-67307512a4cd" (UID: "a4dc26b2-5d4e-4d28-af34-67307512a4cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.085898 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4dc26b2-5d4e-4d28-af34-67307512a4cd-kube-api-access-s8n64" (OuterVolumeSpecName: "kube-api-access-s8n64") pod "a4dc26b2-5d4e-4d28-af34-67307512a4cd" (UID: "a4dc26b2-5d4e-4d28-af34-67307512a4cd"). InnerVolumeSpecName "kube-api-access-s8n64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.087207 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323 is running failed: container process not found" containerID="014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.088426 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6kvjv"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.102593 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.102933 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" containerName="glance-log" containerID="cri-o://3a534aa8128f826e879da52f7e79951a6fad47306bfecf8692f8608dfd09639b" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.103198 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" containerName="glance-httpd" containerID="cri-o://2f2bce733fa63476566ab5bf20d7a8fb1718f0f35fccb3db14100b8bf61575d0" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.112835 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323 is running failed: container process not found" containerID="014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323" cmd=["/usr/bin/pidof","ovsdb-server"] Feb 26 
16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.112907 5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" containerName="ovsdbserver-nb" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.145379 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.145945 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-server" containerID="cri-o://7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.146402 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="swift-recon-cron" containerID="cri-o://9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.146473 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="rsync" containerID="cri-o://0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.146543 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-expirer" containerID="cri-o://273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085" gracePeriod=30 Feb 26 16:06:58 crc 
kubenswrapper[5033]: I0226 16:06:58.146589 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-updater" containerID="cri-o://e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.146638 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-auditor" containerID="cri-o://cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.146687 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-replicator" containerID="cri-o://63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.146730 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-server" containerID="cri-o://16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.146779 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-updater" containerID="cri-o://876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.146828 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-auditor" 
containerID="cri-o://d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.146889 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-replicator" containerID="cri-o://994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.146942 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-server" containerID="cri-o://f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.146986 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-reaper" containerID="cri-o://ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.147035 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-auditor" containerID="cri-o://fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.147076 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-replicator" containerID="cri-o://66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.171005 5033 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-s8n64\" (UniqueName: \"kubernetes.io/projected/a4dc26b2-5d4e-4d28-af34-67307512a4cd-kube-api-access-s8n64\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.171412 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.193521 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_de38b113-f8c9-4c5d-913e-f0efc7a7b5ef/ovsdbserver-sb/0.log" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.193578 5033 generic.go:334] "Generic (PLEG): container finished" podID="de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" containerID="6a0a05a28e34cceca86396fc370a3c3c2d613869b745809e63d4581ad8187860" exitCode=143 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.193831 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef","Type":"ContainerDied","Data":"6a0a05a28e34cceca86396fc370a3c3c2d613869b745809e63d4581ad8187860"} Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.207617 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25q4r" event={"ID":"a4dc26b2-5d4e-4d28-af34-67307512a4cd","Type":"ContainerDied","Data":"8f4b838f3609b6a475bf6418fba140844bc4de677dbe662bfb096a849d92ae66"} Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.208013 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-25q4r" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.211500 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ckn6b"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.211554 5033 scope.go:117] "RemoveContainer" containerID="fb66e2834db2c23f74d5cee1c9b1bd88ffb6e98af3223011fc7d948fdd813f2f" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.253639 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-ckn6b"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.265074 5033 generic.go:334] "Generic (PLEG): container finished" podID="c0f0a198-dc99-4384-8ab0-b978bf9275b5" containerID="f4a8a899d4825519a257a55cdd9c58e86c24aafa460378cd50a7461c75a3be78" exitCode=137 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.279657 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-84cln"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.293544 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-84cln"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.306099 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9ts2z_45a330af-c614-4321-8534-060b7e5fb93f/openstack-network-exporter/0.log" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.306606 5033 generic.go:334] "Generic (PLEG): container finished" podID="45a330af-c614-4321-8534-060b7e5fb93f" containerID="f0481a06ae138ce8c43ec40c4c5ac9624c819724a7da33627fecd75b0cbe6925" exitCode=2 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.306732 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9ts2z" event={"ID":"45a330af-c614-4321-8534-060b7e5fb93f","Type":"ContainerDied","Data":"f0481a06ae138ce8c43ec40c4c5ac9624c819724a7da33627fecd75b0cbe6925"} Feb 26 16:06:58 crc 
kubenswrapper[5033]: I0226 16:06:58.312933 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6d9577fb9-j9mmc"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.313491 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6d9577fb9-j9mmc" podUID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" containerName="neutron-api" containerID="cri-o://6f4850a9a8f812df409a9a6fb3e745a63952c033253853ae16812e130cbe9138" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.315852 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6d9577fb9-j9mmc" podUID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" containerName="neutron-httpd" containerID="cri-o://fb11d0b59395826f637bc346a58329527565a9575674426c6ee93c2cfdc8dac0" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.342661 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4dc26b2-5d4e-4d28-af34-67307512a4cd" (UID: "a4dc26b2-5d4e-4d28-af34-67307512a4cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.359390 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9702-account-create-update-89h9d"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.378971 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-h9h8w"] Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.385569 5033 configmap.go:193] Couldn't get configMap openstack/ovndbcluster-nb-scripts: configmap "ovndbcluster-nb-scripts" not found Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.385659 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts podName:28081635-21dd-4dbf-ae6a-354bbad0dbdd nodeName:}" failed. No retries permitted until 2026-02-26 16:07:00.385635769 +0000 UTC m=+1547.598234419 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts") pod "ovsdbserver-nb-0" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd") : configmap "ovndbcluster-nb-scripts" not found Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.389410 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_28081635-21dd-4dbf-ae6a-354bbad0dbdd/ovsdbserver-nb/0.log" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.389488 5033 generic.go:334] "Generic (PLEG): container finished" podID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" containerID="806513972f7ac7f35e48dccd32d3d8681e9469c01f754dd9f30afc6dcbc44290" exitCode=2 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.389514 5033 generic.go:334] "Generic (PLEG): container finished" podID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" containerID="014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323" exitCode=143 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.389634 
5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"28081635-21dd-4dbf-ae6a-354bbad0dbdd","Type":"ContainerDied","Data":"806513972f7ac7f35e48dccd32d3d8681e9469c01f754dd9f30afc6dcbc44290"} Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.389666 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"28081635-21dd-4dbf-ae6a-354bbad0dbdd","Type":"ContainerDied","Data":"014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323"} Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.390148 5033 configmap.go:193] Couldn't get configMap openstack/ovndbcluster-nb-config: configmap "ovndbcluster-nb-config" not found Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.390215 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config podName:28081635-21dd-4dbf-ae6a-354bbad0dbdd nodeName:}" failed. No retries permitted until 2026-02-26 16:07:00.390196252 +0000 UTC m=+1547.602794912 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config") pod "ovsdbserver-nb-0" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd") : configmap "ovndbcluster-nb-config" not found Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.392326 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dc26b2-5d4e-4d28-af34-67307512a4cd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.399653 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-52gpj"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.399918 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj" podUID="8a492ffb-344b-445f-a93c-bff302e85b06" containerName="dnsmasq-dns" containerID="cri-o://e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781" gracePeriod=10 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.421729 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-h9h8w"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.427346 5033 generic.go:334] "Generic (PLEG): container finished" podID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" containerID="aa1ee9867435cb326cccdc9dafbd435a111a0e341d91a8b67afb8b92f8d1ca85" exitCode=143 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.427400 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ab4047a-07ab-4c15-88eb-7dbefe5bea78","Type":"ContainerDied","Data":"aa1ee9867435cb326cccdc9dafbd435a111a0e341d91a8b67afb8b92f8d1ca85"} Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.441121 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5c49db967-q8hjf"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.441493 5033 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5c49db967-q8hjf" podUID="50dfde4d-180f-4679-b02b-62a9592eb237" containerName="barbican-worker-log" containerID="cri-o://f144e91d04c79983ad9d2426e8a65c28484d0f2b51c249d12afc1f6ee1a0c14a" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.441750 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5c49db967-q8hjf" podUID="50dfde4d-180f-4679-b02b-62a9592eb237" containerName="barbican-worker" containerID="cri-o://e792002563a7c6234a552b17da91049d9907ca9b6088ef5dc670bcb69af229ae" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.455404 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-564dc86f78-j9p7h"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.455612 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" podUID="eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" containerName="barbican-keystone-listener-log" containerID="cri-o://154a195b0bbcd9352b7c0c53282fb8b79ead74d9d7ef469042e66e9b52b0dc36" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.455750 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" podUID="eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" containerName="barbican-keystone-listener" containerID="cri-o://21a665deef8b6a56264f1e3d28a3709293482a49500e908c4c8d0cd127477c4b" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.476147 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3699-account-create-update-9vvpr"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.487407 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86984d6b66-tg6tl"] Feb 26 16:06:58 crc 
kubenswrapper[5033]: I0226 16:06:58.488077 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86984d6b66-tg6tl" podUID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerName="barbican-api-log" containerID="cri-o://09653e26a9486bd66b87542be3842434fcfe613ab05437a4b825e241e39e81f0" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.488551 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86984d6b66-tg6tl" podUID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerName="barbican-api" containerID="cri-o://31c65a36bdf508f77701ec37bf337b3a58e6c854126c11d769248ce82adecb01" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.521913 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-cgff2"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.530378 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-cgff2"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.541501 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.561043 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.575743 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.576098 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-log" containerID="cri-o://bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.576364 5033 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-metadata" containerID="cri-o://4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.599576 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fdf9-account-create-update-lshpf"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.624436 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6d9577fb9-j9mmc" podUID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.170:9696/\": dial tcp 10.217.0.170:9696: connect: connection refused" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.624851 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovs-vswitchd" containerID="cri-o://10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" gracePeriod=29 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.650398 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-6sfh6"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.664239 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-6sfh6"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.695502 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zt7xr"] Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.718672 5033 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 26 16:06:58 crc kubenswrapper[5033]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 26 16:06:58 crc kubenswrapper[5033]: + source 
/usr/local/bin/container-scripts/functions Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNBridge=br-int Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNRemote=tcp:localhost:6642 Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNEncapType=geneve Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNAvailabilityZones= Feb 26 16:06:58 crc kubenswrapper[5033]: ++ EnableChassisAsGateway=true Feb 26 16:06:58 crc kubenswrapper[5033]: ++ PhysicalNetworks= Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNHostName= Feb 26 16:06:58 crc kubenswrapper[5033]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 26 16:06:58 crc kubenswrapper[5033]: ++ ovs_dir=/var/lib/openvswitch Feb 26 16:06:58 crc kubenswrapper[5033]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 26 16:06:58 crc kubenswrapper[5033]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 26 16:06:58 crc kubenswrapper[5033]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 26 16:06:58 crc kubenswrapper[5033]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 26 16:06:58 crc kubenswrapper[5033]: + sleep 0.5 Feb 26 16:06:58 crc kubenswrapper[5033]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 26 16:06:58 crc kubenswrapper[5033]: + sleep 0.5 Feb 26 16:06:58 crc kubenswrapper[5033]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 26 16:06:58 crc kubenswrapper[5033]: + cleanup_ovsdb_server_semaphore Feb 26 16:06:58 crc kubenswrapper[5033]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 26 16:06:58 crc kubenswrapper[5033]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 26 16:06:58 crc kubenswrapper[5033]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-84k4b" message=< Feb 26 16:06:58 crc kubenswrapper[5033]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 26 16:06:58 crc kubenswrapper[5033]: + source /usr/local/bin/container-scripts/functions Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNBridge=br-int Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNRemote=tcp:localhost:6642 Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNEncapType=geneve Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNAvailabilityZones= Feb 26 16:06:58 crc kubenswrapper[5033]: ++ EnableChassisAsGateway=true Feb 26 16:06:58 crc kubenswrapper[5033]: ++ PhysicalNetworks= Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNHostName= Feb 26 16:06:58 crc kubenswrapper[5033]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 26 16:06:58 crc kubenswrapper[5033]: ++ ovs_dir=/var/lib/openvswitch Feb 26 16:06:58 crc kubenswrapper[5033]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 26 16:06:58 crc kubenswrapper[5033]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 26 16:06:58 crc kubenswrapper[5033]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 26 16:06:58 crc kubenswrapper[5033]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 26 16:06:58 crc kubenswrapper[5033]: + sleep 0.5 Feb 26 16:06:58 crc kubenswrapper[5033]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 26 16:06:58 crc kubenswrapper[5033]: + sleep 0.5 Feb 26 16:06:58 crc kubenswrapper[5033]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 26 16:06:58 crc kubenswrapper[5033]: + cleanup_ovsdb_server_semaphore Feb 26 16:06:58 crc kubenswrapper[5033]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 26 16:06:58 crc kubenswrapper[5033]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 26 16:06:58 crc kubenswrapper[5033]: > Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.718750 5033 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 26 16:06:58 crc kubenswrapper[5033]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 26 16:06:58 crc kubenswrapper[5033]: + source /usr/local/bin/container-scripts/functions Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNBridge=br-int Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNRemote=tcp:localhost:6642 Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNEncapType=geneve Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNAvailabilityZones= Feb 26 16:06:58 crc kubenswrapper[5033]: ++ EnableChassisAsGateway=true Feb 26 16:06:58 crc kubenswrapper[5033]: ++ PhysicalNetworks= Feb 26 16:06:58 crc kubenswrapper[5033]: ++ OVNHostName= Feb 26 16:06:58 crc kubenswrapper[5033]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 26 16:06:58 crc kubenswrapper[5033]: ++ ovs_dir=/var/lib/openvswitch Feb 26 16:06:58 crc kubenswrapper[5033]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 26 16:06:58 crc kubenswrapper[5033]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 26 16:06:58 crc kubenswrapper[5033]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 26 16:06:58 crc kubenswrapper[5033]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 26 16:06:58 crc kubenswrapper[5033]: + sleep 0.5 Feb 26 16:06:58 crc kubenswrapper[5033]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 26 16:06:58 crc kubenswrapper[5033]: + sleep 0.5 Feb 26 16:06:58 crc kubenswrapper[5033]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 26 16:06:58 crc kubenswrapper[5033]: + cleanup_ovsdb_server_semaphore Feb 26 16:06:58 crc kubenswrapper[5033]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 26 16:06:58 crc kubenswrapper[5033]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 26 16:06:58 crc kubenswrapper[5033]: > pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server" containerID="cri-o://c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.719360 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server" containerID="cri-o://c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" gracePeriod=29 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.731516 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zt7xr"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.751429 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.751874 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerName="nova-api-api" containerID="cri-o://b6eace0b2103a031f250c85a90a9746babaf3bcd0afd8cd89a42175fe2fb3877" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.751876 5033 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerName="nova-api-log" containerID="cri-o://761688171a3882c19e60e75384d7494e2eaf7d87bf59d1ddf87fe813049c4ba5" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.781473 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6a31-account-create-update-gbdhh"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.824396 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-5ntp4"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.842780 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-5ntp4"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.867164 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-k9qcs"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.879808 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-k9qcs"] Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.888508 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 16:06:58 crc kubenswrapper[5033]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Feb 26 16:06:58 crc kubenswrapper[5033]: Feb 26 16:06:58 crc kubenswrapper[5033]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 26 16:06:58 crc kubenswrapper[5033]: Feb 26 16:06:58 crc kubenswrapper[5033]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 26 16:06:58 crc kubenswrapper[5033]: Feb 26 16:06:58 crc kubenswrapper[5033]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 26 16:06:58 crc kubenswrapper[5033]: Feb 26 16:06:58 crc kubenswrapper[5033]: if [ -n "barbican" ]; then Feb 26 16:06:58 crc 
kubenswrapper[5033]: GRANT_DATABASE="barbican" Feb 26 16:06:58 crc kubenswrapper[5033]: else Feb 26 16:06:58 crc kubenswrapper[5033]: GRANT_DATABASE="*" Feb 26 16:06:58 crc kubenswrapper[5033]: fi Feb 26 16:06:58 crc kubenswrapper[5033]: Feb 26 16:06:58 crc kubenswrapper[5033]: # going for maximum compatibility here: Feb 26 16:06:58 crc kubenswrapper[5033]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 26 16:06:58 crc kubenswrapper[5033]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 26 16:06:58 crc kubenswrapper[5033]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 26 16:06:58 crc kubenswrapper[5033]: # support updates Feb 26 16:06:58 crc kubenswrapper[5033]: Feb 26 16:06:58 crc kubenswrapper[5033]: $MYSQL_CMD < logger="UnhandledError" Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.890315 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-9702-account-create-update-89h9d" podUID="907a1311-765c-4cd1-8e92-ed545e59c736" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.905020 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.905248 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fcedbed9-f0d1-4c33-b601-50da4f37c971" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718" gracePeriod=30 Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.915744 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="443e562a-aefe-46f3-a86f-59205f5849c7" containerName="rabbitmq" 
containerID="cri-o://118f64c4209e08f3dc281ec9601aaee1f783225640fc7b6f23d85a36174417c3" gracePeriod=604800 Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.917358 5033 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 26 16:06:58 crc kubenswrapper[5033]: E0226 16:06:58.917430 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data podName:94562cbd-1b98-4b30-9a97-f26fb103bb72 nodeName:}" failed. No retries permitted until 2026-02-26 16:07:00.917411313 +0000 UTC m=+1548.130009963 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data") pod "rabbitmq-cell1-server-0" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72") : configmap "rabbitmq-cell1-config-data" not found Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.950765 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.955147 5033 scope.go:117] "RemoveContainer" containerID="5a4e87ec9aaac70ec455a5b5d1fdd7b241dbfd64106d9d905d511b4441d4a558" Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.970711 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:06:58 crc kubenswrapper[5033]: I0226 16:06:58.971083 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d989fd27-f36c-4503-a400-e09bb4e70398" containerName="nova-scheduler-scheduler" containerID="cri-o://8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5" gracePeriod=30 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:58.998022 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9702-account-create-update-89h9d"] Feb 26 16:06:59 crc 
kubenswrapper[5033]: I0226 16:06:59.001204 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9ts2z_45a330af-c614-4321-8534-060b7e5fb93f/openstack-network-exporter/0.log" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.001493 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.016868 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_de38b113-f8c9-4c5d-913e-f0efc7a7b5ef/ovsdbserver-sb/0.log" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.016969 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.026823 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.027064 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="17f57fe7-555b-4119-b03e-fafb151494a6" containerName="nova-cell1-conductor-conductor" containerID="cri-o://ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4" gracePeriod=30 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.028459 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.034368 5033 scope.go:117] "RemoveContainer" containerID="1958b4cdd155f234eab178140592d44cbf4eeb304eaea0b1e9bbaca583aeb3ef" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.047334 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8h5d9"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.076216 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="94562cbd-1b98-4b30-9a97-f26fb103bb72" containerName="rabbitmq" containerID="cri-o://4ad1039e51c54abd4ca6cc3d8acfd102de20d99c0d488b4055beccf49a6c1a6d" gracePeriod=604800 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.104642 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8h5d9"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.113461 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lkrpq"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.121678 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-scripts\") pod \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.121736 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-combined-ca-bundle\") pod \"45a330af-c614-4321-8534-060b7e5fb93f\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.121770 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw46j\" (UniqueName: 
\"kubernetes.io/projected/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-kube-api-access-mw46j\") pod \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.121854 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssv4n\" (UniqueName: \"kubernetes.io/projected/45a330af-c614-4321-8534-060b7e5fb93f-kube-api-access-ssv4n\") pod \"45a330af-c614-4321-8534-060b7e5fb93f\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.121890 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdb-rundir\") pod \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.121916 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdbserver-sb-tls-certs\") pod \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.121943 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovs-rundir\") pod \"45a330af-c614-4321-8534-060b7e5fb93f\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.122004 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwqk6\" (UniqueName: \"kubernetes.io/projected/c0f0a198-dc99-4384-8ab0-b978bf9275b5-kube-api-access-bwqk6\") pod \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " Feb 26 
16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.122062 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config-secret\") pod \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.122090 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.122109 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-metrics-certs-tls-certs\") pod \"45a330af-c614-4321-8534-060b7e5fb93f\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.122149 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a330af-c614-4321-8534-060b7e5fb93f-config\") pod \"45a330af-c614-4321-8534-060b7e5fb93f\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.122192 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-combined-ca-bundle\") pod \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.122225 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-combined-ca-bundle\") pod \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.122269 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-metrics-certs-tls-certs\") pod \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.122296 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovn-rundir\") pod \"45a330af-c614-4321-8534-060b7e5fb93f\" (UID: \"45a330af-c614-4321-8534-060b7e5fb93f\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.122328 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-config\") pod \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\" (UID: \"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.122360 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config\") pod \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\" (UID: \"c0f0a198-dc99-4384-8ab0-b978bf9275b5\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.124118 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-scripts" (OuterVolumeSpecName: "scripts") pod "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" (UID: "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.127374 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "45a330af-c614-4321-8534-060b7e5fb93f" (UID: "45a330af-c614-4321-8534-060b7e5fb93f"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.131140 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.131404 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="0dcb38b4-0144-434f-adc1-63a2b63565ee" containerName="nova-cell0-conductor-conductor" containerID="cri-o://b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea" gracePeriod=30 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.136023 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a330af-c614-4321-8534-060b7e5fb93f-config" (OuterVolumeSpecName: "config") pod "45a330af-c614-4321-8534-060b7e5fb93f" (UID: "45a330af-c614-4321-8534-060b7e5fb93f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.142838 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-config" (OuterVolumeSpecName: "config") pod "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" (UID: "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.142919 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "45a330af-c614-4321-8534-060b7e5fb93f" (UID: "45a330af-c614-4321-8534-060b7e5fb93f"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.143545 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" (UID: "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.144697 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lkrpq"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.159801 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" (UID: "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.160035 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-kube-api-access-mw46j" (OuterVolumeSpecName: "kube-api-access-mw46j") pod "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" (UID: "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef"). InnerVolumeSpecName "kube-api-access-mw46j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.171900 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25q4r"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.172921 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45a330af-c614-4321-8534-060b7e5fb93f-kube-api-access-ssv4n" (OuterVolumeSpecName: "kube-api-access-ssv4n") pod "45a330af-c614-4321-8534-060b7e5fb93f" (UID: "45a330af-c614-4321-8534-060b7e5fb93f"). InnerVolumeSpecName "kube-api-access-ssv4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: E0226 16:06:59.181952 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-conmon-fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-conmon-e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-conmon-273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb24a7f0_3a2d_4337_83aa_3c244c2bd3c8.slice/crio-conmon-154a195b0bbcd9352b7c0c53282fb8b79ead74d9d7ef469042e66e9b52b0dc36.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb7073ea_074d_41d7_b114_9d6b8e02c3a0.slice/crio-conmon-fb11d0b59395826f637bc346a58329527565a9575674426c6ee93c2cfdc8dac0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93066a1a_6c0c_4b1e_a257_052a4db51d70.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a492ffb_344b_445f_a93c_bff302e85b06.slice/crio-e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-conmon-994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801b1613_3ed1_423d_85cf_6b87f6711034.slice/crio-conmon-81b00d0b1c13dc1cda83736d47b6fac79d1887078858c3ef5eaedcb3b16e0883.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb2a5173_ce4c_4d4a_a136_70356adabbb9.slice/crio-761688171a3882c19e60e75384d7494e2eaf7d87bf59d1ddf87fe813049c4ba5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-conmon-cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4dc26b2_5d4e_4d28_af34_67307512a4cd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9646add7_5da6_4590_9394_b4325e8ebd2d.slice/crio-conmon-09653e26a9486bd66b87542be3842434fcfe613ab05437a4b825e241e39e81f0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d8f31df_b55c_44c6_88e4_3b0d0fb6c1e6.slice/crio-c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50dfde4d_180f_4679_b02b_62a9592eb237.slice/crio-f144e91d04c79983ad9d2426e8a65c28484d0f2b51c249d12afc1f6ee1a0c14a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-conmon-63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-conmon-16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9646add7_5da6_4590_9394_b4325e8ebd2d.slice/crio-09653e26a9486bd66b87542be3842434fcfe613ab05437a4b825e241e39e81f0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-conmon-d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a492ffb_344b_445f_a93c_bff302e85b06.slice/crio-conmon-e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781.scope\": RecentStats: unable to find data in 
memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea4c6e17_4b51_4387_98c8_d071c1fc7c5a.slice/crio-bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-conmon-ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd63f58_35b3_4eeb_a6af_6d1226ee2fb9.slice/crio-f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37.scope\": RecentStats: unable to find data in memory cache]" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.192599 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c0f0a198-dc99-4384-8ab0-b978bf9275b5" (UID: "c0f0a198-dc99-4384-8ab0-b978bf9275b5"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.194074 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="a1a35ad9-a937-4d01-961b-a3488303610e" containerName="galera" containerID="cri-o://364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9" gracePeriod=30 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.199775 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f0a198-dc99-4384-8ab0-b978bf9275b5-kube-api-access-bwqk6" (OuterVolumeSpecName: "kube-api-access-bwqk6") pod "c0f0a198-dc99-4384-8ab0-b978bf9275b5" (UID: "c0f0a198-dc99-4384-8ab0-b978bf9275b5"). InnerVolumeSpecName "kube-api-access-bwqk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.199927 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-25q4r"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.201326 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45a330af-c614-4321-8534-060b7e5fb93f" (UID: "45a330af-c614-4321-8534-060b7e5fb93f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.231460 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.231919 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.231934 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.231947 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw46j\" (UniqueName: \"kubernetes.io/projected/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-kube-api-access-mw46j\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.231961 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssv4n\" (UniqueName: \"kubernetes.io/projected/45a330af-c614-4321-8534-060b7e5fb93f-kube-api-access-ssv4n\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.231974 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.231986 5033 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 
16:06:59.231997 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwqk6\" (UniqueName: \"kubernetes.io/projected/c0f0a198-dc99-4384-8ab0-b978bf9275b5-kube-api-access-bwqk6\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.232035 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.232051 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a330af-c614-4321-8534-060b7e5fb93f-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.232063 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/45a330af-c614-4321-8534-060b7e5fb93f-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.232076 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.262473 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.299845 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" (UID: "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.334156 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.334293 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.335804 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c0f0a198-dc99-4384-8ab0-b978bf9275b5" (UID: "c0f0a198-dc99-4384-8ab0-b978bf9275b5"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.381696 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0f0a198-dc99-4384-8ab0-b978bf9275b5" (UID: "c0f0a198-dc99-4384-8ab0-b978bf9275b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.436876 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.437055 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f0a198-dc99-4384-8ab0-b978bf9275b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.452641 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9ts2z_45a330af-c614-4321-8534-060b7e5fb93f/openstack-network-exporter/0.log" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.452740 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9ts2z" event={"ID":"45a330af-c614-4321-8534-060b7e5fb93f","Type":"ContainerDied","Data":"573e660c9a39f1b32a6850206b5db2c131d9a2979b4c49f52b53d3cab6204546"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.452791 5033 scope.go:117] "RemoveContainer" containerID="f0481a06ae138ce8c43ec40c4c5ac9624c819724a7da33627fecd75b0cbe6925" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.452904 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-9ts2z" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.459171 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9702-account-create-update-89h9d" event={"ID":"907a1311-765c-4cd1-8e92-ed545e59c736","Type":"ContainerStarted","Data":"750ef9747624e6b358021328914a35ef482b6362831d07b1e96bc487f1becf1b"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.476901 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" (UID: "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.489193 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-c4c558dd7-x5jl5"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.489892 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-c4c558dd7-x5jl5" podUID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerName="proxy-httpd" containerID="cri-o://6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765" gracePeriod=30 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.489932 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-c4c558dd7-x5jl5" podUID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerName="proxy-server" containerID="cri-o://9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa" gracePeriod=30 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.490915 5033 generic.go:334] "Generic (PLEG): container finished" podID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" exitCode=0 
Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.491057 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-84k4b" event={"ID":"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6","Type":"ContainerDied","Data":"c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.498312 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" (UID: "de38b113-f8c9-4c5d-913e-f0efc7a7b5ef"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.505047 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.506090 5033 generic.go:334] "Generic (PLEG): container finished" podID="50dfde4d-180f-4679-b02b-62a9592eb237" containerID="e792002563a7c6234a552b17da91049d9907ca9b6088ef5dc670bcb69af229ae" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.506121 5033 generic.go:334] "Generic (PLEG): container finished" podID="50dfde4d-180f-4679-b02b-62a9592eb237" containerID="f144e91d04c79983ad9d2426e8a65c28484d0f2b51c249d12afc1f6ee1a0c14a" exitCode=143 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.506162 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c49db967-q8hjf" event={"ID":"50dfde4d-180f-4679-b02b-62a9592eb237","Type":"ContainerDied","Data":"e792002563a7c6234a552b17da91049d9907ca9b6088ef5dc670bcb69af229ae"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.506190 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c49db967-q8hjf" 
event={"ID":"50dfde4d-180f-4679-b02b-62a9592eb237","Type":"ContainerDied","Data":"f144e91d04c79983ad9d2426e8a65c28484d0f2b51c249d12afc1f6ee1a0c14a"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.510650 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_28081635-21dd-4dbf-ae6a-354bbad0dbdd/ovsdbserver-nb/0.log" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.510936 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.535033 5033 generic.go:334] "Generic (PLEG): container finished" podID="801b1613-3ed1-423d-85cf-6b87f6711034" containerID="81b00d0b1c13dc1cda83736d47b6fac79d1887078858c3ef5eaedcb3b16e0883" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.535122 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"801b1613-3ed1-423d-85cf-6b87f6711034","Type":"ContainerDied","Data":"81b00d0b1c13dc1cda83736d47b6fac79d1887078858c3ef5eaedcb3b16e0883"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538182 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdb-rundir\") pod \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538240 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538285 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts\") pod \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538305 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-metrics-certs-tls-certs\") pod \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538345 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-combined-ca-bundle\") pod \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538380 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-sb\") pod \"8a492ffb-344b-445f-a93c-bff302e85b06\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538437 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zczdr\" (UniqueName: \"kubernetes.io/projected/28081635-21dd-4dbf-ae6a-354bbad0dbdd-kube-api-access-zczdr\") pod \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538475 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-nb\") pod \"8a492ffb-344b-445f-a93c-bff302e85b06\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " Feb 26 16:06:59 crc 
kubenswrapper[5033]: I0226 16:06:59.538501 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhnph\" (UniqueName: \"kubernetes.io/projected/8a492ffb-344b-445f-a93c-bff302e85b06-kube-api-access-fhnph\") pod \"8a492ffb-344b-445f-a93c-bff302e85b06\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538528 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-config\") pod \"8a492ffb-344b-445f-a93c-bff302e85b06\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538562 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config\") pod \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538581 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-svc\") pod \"8a492ffb-344b-445f-a93c-bff302e85b06\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538598 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-swift-storage-0\") pod \"8a492ffb-344b-445f-a93c-bff302e85b06\" (UID: \"8a492ffb-344b-445f-a93c-bff302e85b06\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.538641 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdbserver-nb-tls-certs\") pod \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\" (UID: \"28081635-21dd-4dbf-ae6a-354bbad0dbdd\") " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.539072 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.539085 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.539993 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts" (OuterVolumeSpecName: "scripts") pod "28081635-21dd-4dbf-ae6a-354bbad0dbdd" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.540502 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "28081635-21dd-4dbf-ae6a-354bbad0dbdd" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.546409 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "45a330af-c614-4321-8534-060b7e5fb93f" (UID: "45a330af-c614-4321-8534-060b7e5fb93f"). 
InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.552873 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28081635-21dd-4dbf-ae6a-354bbad0dbdd-kube-api-access-zczdr" (OuterVolumeSpecName: "kube-api-access-zczdr") pod "28081635-21dd-4dbf-ae6a-354bbad0dbdd" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd"). InnerVolumeSpecName "kube-api-access-zczdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.558212 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "28081635-21dd-4dbf-ae6a-354bbad0dbdd" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.563310 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config" (OuterVolumeSpecName: "config") pod "28081635-21dd-4dbf-ae6a-354bbad0dbdd" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.565998 5033 generic.go:334] "Generic (PLEG): container finished" podID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerID="761688171a3882c19e60e75384d7494e2eaf7d87bf59d1ddf87fe813049c4ba5" exitCode=143 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.566098 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb2a5173-ce4c-4d4a-a136-70356adabbb9","Type":"ContainerDied","Data":"761688171a3882c19e60e75384d7494e2eaf7d87bf59d1ddf87fe813049c4ba5"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.568225 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="fcedbed9-f0d1-4c33-b601-50da4f37c971" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.202:6080/vnc_lite.html\": dial tcp 10.217.0.202:6080: connect: connection refused" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.568291 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a492ffb-344b-445f-a93c-bff302e85b06-kube-api-access-fhnph" (OuterVolumeSpecName: "kube-api-access-fhnph") pod "8a492ffb-344b-445f-a93c-bff302e85b06" (UID: "8a492ffb-344b-445f-a93c-bff302e85b06"). InnerVolumeSpecName "kube-api-access-fhnph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.576216 5033 generic.go:334] "Generic (PLEG): container finished" podID="8a492ffb-344b-445f-a93c-bff302e85b06" containerID="e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.576360 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj" event={"ID":"8a492ffb-344b-445f-a93c-bff302e85b06","Type":"ContainerDied","Data":"e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.576397 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj" event={"ID":"8a492ffb-344b-445f-a93c-bff302e85b06","Type":"ContainerDied","Data":"bd3def5d808e13012078d7e20e0c75fb0ec7250471d922654889b4f468274f6b"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.576416 5033 scope.go:117] "RemoveContainer" containerID="e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.576565 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-52gpj" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.577983 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-c4c558dd7-x5jl5" podUID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.172:8080/healthcheck\": dial tcp 10.217.0.172:8080: connect: connection refused" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.578199 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-c4c558dd7-x5jl5" podUID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.172:8080/healthcheck\": dial tcp 10.217.0.172:8080: connect: connection refused" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.604732 5033 generic.go:334] "Generic (PLEG): container finished" podID="998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" containerID="3a534aa8128f826e879da52f7e79951a6fad47306bfecf8692f8608dfd09639b" exitCode=143 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.604820 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3","Type":"ContainerDied","Data":"3a534aa8128f826e879da52f7e79951a6fad47306bfecf8692f8608dfd09639b"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.608878 5033 generic.go:334] "Generic (PLEG): container finished" podID="bf0b01b9-0085-4552-8a7d-1e7359dad79b" containerID="a46ddf9c39fa3f9620a8c8a0c8752e122ad6d05de939c29de0a166524be100e8" exitCode=143 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.608968 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bf0b01b9-0085-4552-8a7d-1e7359dad79b","Type":"ContainerDied","Data":"a46ddf9c39fa3f9620a8c8a0c8752e122ad6d05de939c29de0a166524be100e8"} Feb 26 16:06:59 crc kubenswrapper[5033]: 
I0226 16:06:59.618854 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_28081635-21dd-4dbf-ae6a-354bbad0dbdd/ovsdbserver-nb/0.log" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.619041 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.619821 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"28081635-21dd-4dbf-ae6a-354bbad0dbdd","Type":"ContainerDied","Data":"1f2c0cb3d9d45570f2440a5830cd21d3c1256ee1dd97d33d2ffba258534c32c5"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.626870 5033 generic.go:334] "Generic (PLEG): container finished" podID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" containerID="fb11d0b59395826f637bc346a58329527565a9575674426c6ee93c2cfdc8dac0" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.626872 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28081635-21dd-4dbf-ae6a-354bbad0dbdd" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.626965 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d9577fb9-j9mmc" event={"ID":"db7073ea-074d-41d7-b114-9d6b8e02c3a0","Type":"ContainerDied","Data":"fb11d0b59395826f637bc346a58329527565a9575674426c6ee93c2cfdc8dac0"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.636809 5033 generic.go:334] "Generic (PLEG): container finished" podID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerID="09653e26a9486bd66b87542be3842434fcfe613ab05437a4b825e241e39e81f0" exitCode=143 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.651859 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zczdr\" (UniqueName: \"kubernetes.io/projected/28081635-21dd-4dbf-ae6a-354bbad0dbdd-kube-api-access-zczdr\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.651892 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45a330af-c614-4321-8534-060b7e5fb93f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.651906 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhnph\" (UniqueName: \"kubernetes.io/projected/8a492ffb-344b-445f-a93c-bff302e85b06-kube-api-access-fhnph\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.651918 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.651929 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 
crc kubenswrapper[5033]: I0226 16:06:59.651955 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.651967 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28081635-21dd-4dbf-ae6a-354bbad0dbdd-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.651982 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.665486 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="005e3882-a184-4d2d-9c00-a3de51d47b6e" path="/var/lib/kubelet/pods/005e3882-a184-4d2d-9c00-a3de51d47b6e/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.666511 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0558b5db-1cf4-477a-8874-cf2931891f31" path="/var/lib/kubelet/pods/0558b5db-1cf4-477a-8874-cf2931891f31/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.667122 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329480df-036d-4b95-ad02-08a923116026" path="/var/lib/kubelet/pods/329480df-036d-4b95-ad02-08a923116026/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.668307 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="494e1101-248e-4991-82f0-03bbec4fb3c2" path="/var/lib/kubelet/pods/494e1101-248e-4991-82f0-03bbec4fb3c2/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.669361 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="644c07f3-9ac5-48e9-92e3-641647c2e147" 
path="/var/lib/kubelet/pods/644c07f3-9ac5-48e9-92e3-641647c2e147/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.670225 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="739b08e5-9e18-4d87-aded-724b5d2ab937" path="/var/lib/kubelet/pods/739b08e5-9e18-4d87-aded-724b5d2ab937/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.670936 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d77fb6-14aa-4c7d-9ad1-093e95473c0f" path="/var/lib/kubelet/pods/73d77fb6-14aa-4c7d-9ad1-093e95473c0f/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.672656 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="771e4d50-f290-4165-99c3-dec5b3582458" path="/var/lib/kubelet/pods/771e4d50-f290-4165-99c3-dec5b3582458/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.673955 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c4673c4-7bff-4708-8ace-4931c3091789" path="/var/lib/kubelet/pods/9c4673c4-7bff-4708-8ace-4931c3091789/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.679002 5033 scope.go:117] "RemoveContainer" containerID="fbd80b9c55c71002f8dc02d489c41145f1618aa92c65b9b62c3e9fe933c8071f" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.679063 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" path="/var/lib/kubelet/pods/a4dc26b2-5d4e-4d28-af34-67307512a4cd/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.682784 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f0a198-dc99-4384-8ab0-b978bf9275b5" path="/var/lib/kubelet/pods/c0f0a198-dc99-4384-8ab0-b978bf9275b5/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.684157 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7261968-dd9e-4a9b-bdd9-58ee979d63ff" 
path="/var/lib/kubelet/pods/c7261968-dd9e-4a9b-bdd9-58ee979d63ff/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.685626 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1" path="/var/lib/kubelet/pods/cc8c6cc9-6b8a-4503-b7fa-2a7852b393a1/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.687279 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6bddd46-8347-483a-b49d-470d41ee528c" path="/var/lib/kubelet/pods/e6bddd46-8347-483a-b49d-470d41ee528c/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.688863 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec28e69b-de68-42a0-bc60-c41a9d87c90e" path="/var/lib/kubelet/pods/ec28e69b-de68-42a0-bc60-c41a9d87c90e/volumes" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.700736 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8a492ffb-344b-445f-a93c-bff302e85b06" (UID: "8a492ffb-344b-445f-a93c-bff302e85b06"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.700845 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8a492ffb-344b-445f-a93c-bff302e85b06" (UID: "8a492ffb-344b-445f-a93c-bff302e85b06"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.706724 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86984d6b66-tg6tl" event={"ID":"9646add7-5da6-4590-9394-b4325e8ebd2d","Type":"ContainerDied","Data":"09653e26a9486bd66b87542be3842434fcfe613ab05437a4b825e241e39e81f0"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.713397 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.718904 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8a492ffb-344b-445f-a93c-bff302e85b06" (UID: "8a492ffb-344b-445f-a93c-bff302e85b06"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.721303 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.724563 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8a492ffb-344b-445f-a93c-bff302e85b06" (UID: "8a492ffb-344b-445f-a93c-bff302e85b06"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.740004 5033 generic.go:334] "Generic (PLEG): container finished" podID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerID="e957c4c3c0035ebbddff5e5a96c325ad85985b3bd7c2e72eaa4ed32e100f9146" exitCode=143 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.740247 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d648bbbdd-g6xz5" event={"ID":"5ef30f16-84a1-4809-8ec4-0c36e315c5f3","Type":"ContainerDied","Data":"e957c4c3c0035ebbddff5e5a96c325ad85985b3bd7c2e72eaa4ed32e100f9146"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.754539 5033 scope.go:117] "RemoveContainer" containerID="e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781" Feb 26 16:06:59 crc kubenswrapper[5033]: E0226 16:06:59.756373 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781\": container with ID starting with e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781 not found: ID does not exist" containerID="e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.756412 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781"} err="failed to get container status \"e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781\": rpc error: code = NotFound desc = could not find container \"e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781\": container with ID starting with e003007e26bfe503cb7bcee2049bdfc0010d749264c85ec181f495736404e781 not found: ID does not exist" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.756444 5033 scope.go:117] "RemoveContainer" 
containerID="fbd80b9c55c71002f8dc02d489c41145f1618aa92c65b9b62c3e9fe933c8071f" Feb 26 16:06:59 crc kubenswrapper[5033]: E0226 16:06:59.757359 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbd80b9c55c71002f8dc02d489c41145f1618aa92c65b9b62c3e9fe933c8071f\": container with ID starting with fbd80b9c55c71002f8dc02d489c41145f1618aa92c65b9b62c3e9fe933c8071f not found: ID does not exist" containerID="fbd80b9c55c71002f8dc02d489c41145f1618aa92c65b9b62c3e9fe933c8071f" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.757401 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbd80b9c55c71002f8dc02d489c41145f1618aa92c65b9b62c3e9fe933c8071f"} err="failed to get container status \"fbd80b9c55c71002f8dc02d489c41145f1618aa92c65b9b62c3e9fe933c8071f\": rpc error: code = NotFound desc = could not find container \"fbd80b9c55c71002f8dc02d489c41145f1618aa92c65b9b62c3e9fe933c8071f\": container with ID starting with fbd80b9c55c71002f8dc02d489c41145f1618aa92c65b9b62c3e9fe933c8071f not found: ID does not exist" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.757426 5033 scope.go:117] "RemoveContainer" containerID="806513972f7ac7f35e48dccd32d3d8681e9469c01f754dd9f30afc6dcbc44290" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.766722 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "28081635-21dd-4dbf-ae6a-354bbad0dbdd" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.780892 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-config" (OuterVolumeSpecName: "config") pod "8a492ffb-344b-445f-a93c-bff302e85b06" (UID: "8a492ffb-344b-445f-a93c-bff302e85b06"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.787346 5033 generic.go:334] "Generic (PLEG): container finished" podID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerID="bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520" exitCode=143 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.787523 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a","Type":"ContainerDied","Data":"bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.796669 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "28081635-21dd-4dbf-ae6a-354bbad0dbdd" (UID: "28081635-21dd-4dbf-ae6a-354bbad0dbdd"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.800731 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_de38b113-f8c9-4c5d-913e-f0efc7a7b5ef/ovsdbserver-sb/0.log" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.800829 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"de38b113-f8c9-4c5d-913e-f0efc7a7b5ef","Type":"ContainerDied","Data":"9613ae0291561f00f942b9c24946e7666ec483d52649f7ff145858771804f4bd"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.800999 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.822816 5033 scope.go:117] "RemoveContainer" containerID="014da60b53d6c5986f7401d79c154a31b90ca5bbae0faef6674704fd41997323" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.835217 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.835274 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.835300 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.835312 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 
16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.835326 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.835338 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.835352 5033 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a492ffb-344b-445f-a93c-bff302e85b06-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.835365 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28081635-21dd-4dbf-ae6a-354bbad0dbdd-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:59 crc kubenswrapper[5033]: E0226 16:06:59.835493 5033 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 26 16:06:59 crc kubenswrapper[5033]: E0226 16:06:59.835554 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data podName:443e562a-aefe-46f3-a86f-59205f5849c7 nodeName:}" failed. No retries permitted until 2026-02-26 16:07:03.835536249 +0000 UTC m=+1551.048134909 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data") pod "rabbitmq-server-0" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7") : configmap "rabbitmq-config-data" not found Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.856605 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.856678 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.856816 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.856907 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.856929 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.856968 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.856984 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.856994 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857005 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857016 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857060 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857074 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857095 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857104 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857135 5033 
generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857248 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857327 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857342 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857386 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857400 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857418 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90"} Feb 26 16:06:59 
crc kubenswrapper[5033]: I0226 16:06:59.857455 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857472 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857485 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857497 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857510 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857558 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.857574 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.890833 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-9ts2z"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.903906 5033 generic.go:334] "Generic (PLEG): container finished" podID="eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" containerID="21a665deef8b6a56264f1e3d28a3709293482a49500e908c4c8d0cd127477c4b" exitCode=0 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.903991 5033 generic.go:334] "Generic (PLEG): container finished" podID="eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" containerID="154a195b0bbcd9352b7c0c53282fb8b79ead74d9d7ef469042e66e9b52b0dc36" exitCode=143 Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.904035 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" event={"ID":"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8","Type":"ContainerDied","Data":"21a665deef8b6a56264f1e3d28a3709293482a49500e908c4c8d0cd127477c4b"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.904090 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" event={"ID":"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8","Type":"ContainerDied","Data":"154a195b0bbcd9352b7c0c53282fb8b79ead74d9d7ef469042e66e9b52b0dc36"} Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.909189 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-9ts2z"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.914631 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.939685 5033 scope.go:117] "RemoveContainer" containerID="f4a8a899d4825519a257a55cdd9c58e86c24aafa460378cd50a7461c75a3be78" Feb 26 16:06:59 crc 
kubenswrapper[5033]: I0226 16:06:59.948113 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 16:06:59 crc kubenswrapper[5033]: I0226 16:06:59.976497 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.031906 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5c49db967-q8hjf" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.052163 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-combined-ca-bundle\") pod \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.052653 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szz94\" (UniqueName: \"kubernetes.io/projected/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-kube-api-access-szz94\") pod \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.052742 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data-custom\") pod \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.052840 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data\") pod \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " Feb 26 16:07:00 crc 
kubenswrapper[5033]: I0226 16:07:00.052914 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-logs\") pod \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\" (UID: \"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8\") " Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.054625 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-logs" (OuterVolumeSpecName: "logs") pod "eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" (UID: "eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.062369 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-kube-api-access-szz94" (OuterVolumeSpecName: "kube-api-access-szz94") pod "eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" (UID: "eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8"). InnerVolumeSpecName "kube-api-access-szz94". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.070404 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" (UID: "eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.085421 5033 scope.go:117] "RemoveContainer" containerID="7d15896c86ef74e743d6d11de6e7eb42b79dd557583bbade3356774639583f07" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.087695 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-52gpj"] Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.092908 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" (UID: "eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.099605 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-52gpj"] Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.110445 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.148849 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.154671 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psxzm\" (UniqueName: \"kubernetes.io/projected/50dfde4d-180f-4679-b02b-62a9592eb237-kube-api-access-psxzm\") pod \"50dfde4d-180f-4679-b02b-62a9592eb237\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.154784 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data-custom\") pod \"50dfde4d-180f-4679-b02b-62a9592eb237\" (UID: 
\"50dfde4d-180f-4679-b02b-62a9592eb237\") " Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.154852 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50dfde4d-180f-4679-b02b-62a9592eb237-logs\") pod \"50dfde4d-180f-4679-b02b-62a9592eb237\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.154909 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-combined-ca-bundle\") pod \"50dfde4d-180f-4679-b02b-62a9592eb237\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.154936 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data\") pod \"50dfde4d-180f-4679-b02b-62a9592eb237\" (UID: \"50dfde4d-180f-4679-b02b-62a9592eb237\") " Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.155664 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50dfde4d-180f-4679-b02b-62a9592eb237-logs" (OuterVolumeSpecName: "logs") pod "50dfde4d-180f-4679-b02b-62a9592eb237" (UID: "50dfde4d-180f-4679-b02b-62a9592eb237"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.155687 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szz94\" (UniqueName: \"kubernetes.io/projected/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-kube-api-access-szz94\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.155705 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.155714 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.155722 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.162069 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "50dfde4d-180f-4679-b02b-62a9592eb237" (UID: "50dfde4d-180f-4679-b02b-62a9592eb237"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.162247 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50dfde4d-180f-4679-b02b-62a9592eb237-kube-api-access-psxzm" (OuterVolumeSpecName: "kube-api-access-psxzm") pod "50dfde4d-180f-4679-b02b-62a9592eb237" (UID: "50dfde4d-180f-4679-b02b-62a9592eb237"). InnerVolumeSpecName "kube-api-access-psxzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.211677 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data" (OuterVolumeSpecName: "config-data") pod "eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" (UID: "eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.212524 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.213000 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.213287 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.213321 5033 prober.go:104] 
"Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.233435 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50dfde4d-180f-4679-b02b-62a9592eb237" (UID: "50dfde4d-180f-4679-b02b-62a9592eb237"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.239485 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6a31-account-create-update-gbdhh"] Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.258236 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50dfde4d-180f-4679-b02b-62a9592eb237-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.258286 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.258295 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.258303 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psxzm\" (UniqueName: 
\"kubernetes.io/projected/50dfde4d-180f-4679-b02b-62a9592eb237-kube-api-access-psxzm\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.258312 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.261799 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.298153 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fdf9-account-create-update-lshpf"] Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.307934 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3699-account-create-update-9vvpr"] Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.326139 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 16:07:00 crc kubenswrapper[5033]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Feb 26 16:07:00 crc kubenswrapper[5033]: Feb 26 16:07:00 crc kubenswrapper[5033]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 26 16:07:00 crc kubenswrapper[5033]: Feb 26 16:07:00 crc kubenswrapper[5033]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 26 16:07:00 crc kubenswrapper[5033]: Feb 26 16:07:00 crc kubenswrapper[5033]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 26 16:07:00 crc kubenswrapper[5033]: Feb 26 16:07:00 crc 
kubenswrapper[5033]: if [ -n "neutron" ]; then
Feb 26 16:07:00 crc kubenswrapper[5033]: GRANT_DATABASE="neutron"
Feb 26 16:07:00 crc kubenswrapper[5033]: else
Feb 26 16:07:00 crc kubenswrapper[5033]: GRANT_DATABASE="*"
Feb 26 16:07:00 crc kubenswrapper[5033]: fi
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: # going for maximum compatibility here:
Feb 26 16:07:00 crc kubenswrapper[5033]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Feb 26 16:07:00 crc kubenswrapper[5033]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Feb 26 16:07:00 crc kubenswrapper[5033]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Feb 26 16:07:00 crc kubenswrapper[5033]: # support updates
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: $MYSQL_CMD < logger="UnhandledError"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.326461 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-czhv8"]
Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.329012 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-6a31-account-create-update-gbdhh" podUID="90168acb-f862-4f73-bb8a-dd5b308977e1"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.338696 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data" (OuterVolumeSpecName: "config-data") pod "50dfde4d-180f-4679-b02b-62a9592eb237" (UID: "50dfde4d-180f-4679-b02b-62a9592eb237"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.341790 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.344785 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.344852 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovs-vswitchd"
Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.346957 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 26 16:07:00 crc kubenswrapper[5033]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: MYSQL_CMD="mysql -h -u root -P 3306"
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: if [ -n "placement" ]; then
Feb 26 16:07:00 crc kubenswrapper[5033]: GRANT_DATABASE="placement"
Feb 26 16:07:00 crc kubenswrapper[5033]: else
Feb 26 16:07:00 crc kubenswrapper[5033]: GRANT_DATABASE="*"
Feb 26 16:07:00 crc kubenswrapper[5033]: fi
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: # going for maximum compatibility here:
Feb 26 16:07:00 crc kubenswrapper[5033]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Feb 26 16:07:00 crc kubenswrapper[5033]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Feb 26 16:07:00 crc kubenswrapper[5033]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Feb 26 16:07:00 crc kubenswrapper[5033]: # support updates
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: $MYSQL_CMD < logger="UnhandledError"
Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.355565 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-fdf9-account-create-update-lshpf" podUID="540a469d-81f3-40f5-9826-27384a03ce2d"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.359919 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dfde4d-180f-4679-b02b-62a9592eb237-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.361445 5033 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 26 16:07:00 crc kubenswrapper[5033]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: MYSQL_CMD="mysql -h -u root -P 3306"
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: if [ -n "glance" ]; then
Feb 26 16:07:00 crc kubenswrapper[5033]: GRANT_DATABASE="glance"
Feb 26 16:07:00 crc kubenswrapper[5033]: else
Feb 26 16:07:00 crc kubenswrapper[5033]: GRANT_DATABASE="*"
Feb 26 16:07:00 crc kubenswrapper[5033]: fi
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: # going for maximum compatibility here:
Feb 26 16:07:00 crc kubenswrapper[5033]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Feb 26 16:07:00 crc kubenswrapper[5033]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Feb 26 16:07:00 crc kubenswrapper[5033]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Feb 26 16:07:00 crc kubenswrapper[5033]: # support updates
Feb 26 16:07:00 crc kubenswrapper[5033]:
Feb 26 16:07:00 crc kubenswrapper[5033]: $MYSQL_CMD < logger="UnhandledError"
Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.362528 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-3699-account-create-update-9vvpr" podUID="c13ba380-e3f6-4322-a434-7b69da9e4e3f"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.513805 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9702-account-create-update-89h9d"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.523149 5033 scope.go:117] "RemoveContainer" containerID="6a0a05a28e34cceca86396fc370a3c3c2d613869b745809e63d4581ad8187860"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.633273 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.674682 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96dks\" (UniqueName: \"kubernetes.io/projected/907a1311-765c-4cd1-8e92-ed545e59c736-kube-api-access-96dks\") pod \"907a1311-765c-4cd1-8e92-ed545e59c736\" (UID: \"907a1311-765c-4cd1-8e92-ed545e59c736\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.674850 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/907a1311-765c-4cd1-8e92-ed545e59c736-operator-scripts\") pod \"907a1311-765c-4cd1-8e92-ed545e59c736\" (UID: \"907a1311-765c-4cd1-8e92-ed545e59c736\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.676159 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/907a1311-765c-4cd1-8e92-ed545e59c736-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "907a1311-765c-4cd1-8e92-ed545e59c736" (UID: "907a1311-765c-4cd1-8e92-ed545e59c736"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.713186 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907a1311-765c-4cd1-8e92-ed545e59c736-kube-api-access-96dks" (OuterVolumeSpecName: "kube-api-access-96dks") pod "907a1311-765c-4cd1-8e92-ed545e59c736" (UID: "907a1311-765c-4cd1-8e92-ed545e59c736"). InnerVolumeSpecName "kube-api-access-96dks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.722991 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.768988 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.777498 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2b25\" (UniqueName: \"kubernetes.io/projected/fcedbed9-f0d1-4c33-b601-50da4f37c971-kube-api-access-h2b25\") pod \"fcedbed9-f0d1-4c33-b601-50da4f37c971\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.777610 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-nova-novncproxy-tls-certs\") pod \"fcedbed9-f0d1-4c33-b601-50da4f37c971\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.777683 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-combined-ca-bundle\") pod \"fcedbed9-f0d1-4c33-b601-50da4f37c971\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.777746 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-vencrypt-tls-certs\") pod \"fcedbed9-f0d1-4c33-b601-50da4f37c971\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.777799 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-config-data\") pod \"fcedbed9-f0d1-4c33-b601-50da4f37c971\" (UID: \"fcedbed9-f0d1-4c33-b601-50da4f37c971\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.778183 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96dks\" (UniqueName: \"kubernetes.io/projected/907a1311-765c-4cd1-8e92-ed545e59c736-kube-api-access-96dks\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.778200 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/907a1311-765c-4cd1-8e92-ed545e59c736-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.791228 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcedbed9-f0d1-4c33-b601-50da4f37c971-kube-api-access-h2b25" (OuterVolumeSpecName: "kube-api-access-h2b25") pod "fcedbed9-f0d1-4c33-b601-50da4f37c971" (UID: "fcedbed9-f0d1-4c33-b601-50da4f37c971"). InnerVolumeSpecName "kube-api-access-h2b25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.832510 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcedbed9-f0d1-4c33-b601-50da4f37c971" (UID: "fcedbed9-f0d1-4c33-b601-50da4f37c971"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.863972 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-config-data" (OuterVolumeSpecName: "config-data") pod "fcedbed9-f0d1-4c33-b601-50da4f37c971" (UID: "fcedbed9-f0d1-4c33-b601-50da4f37c971"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.880687 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-internal-tls-certs\") pod \"eb635e2d-e92d-4f18-be4b-972e388e8c25\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.880800 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-run-httpd\") pod \"eb635e2d-e92d-4f18-be4b-972e388e8c25\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.880899 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-operator-scripts\") pod \"a1a35ad9-a937-4d01-961b-a3488303610e\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.880926 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-galera-tls-certs\") pod \"a1a35ad9-a937-4d01-961b-a3488303610e\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.880978 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-etc-swift\") pod \"eb635e2d-e92d-4f18-be4b-972e388e8c25\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.881013 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29chb\" (UniqueName: \"kubernetes.io/projected/a1a35ad9-a937-4d01-961b-a3488303610e-kube-api-access-29chb\") pod \"a1a35ad9-a937-4d01-961b-a3488303610e\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.881044 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-combined-ca-bundle\") pod \"eb635e2d-e92d-4f18-be4b-972e388e8c25\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.881248 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-public-tls-certs\") pod \"eb635e2d-e92d-4f18-be4b-972e388e8c25\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.881360 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"a1a35ad9-a937-4d01-961b-a3488303610e\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.881407 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-config-data\") pod \"eb635e2d-e92d-4f18-be4b-972e388e8c25\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.881428 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsml4\" (UniqueName: \"kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-kube-api-access-lsml4\") pod \"eb635e2d-e92d-4f18-be4b-972e388e8c25\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.881464 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-kolla-config\") pod \"a1a35ad9-a937-4d01-961b-a3488303610e\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.881504 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-generated\") pod \"a1a35ad9-a937-4d01-961b-a3488303610e\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.881556 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-combined-ca-bundle\") pod \"a1a35ad9-a937-4d01-961b-a3488303610e\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.882472 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-log-httpd\") pod \"eb635e2d-e92d-4f18-be4b-972e388e8c25\" (UID: \"eb635e2d-e92d-4f18-be4b-972e388e8c25\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.882616 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-default\") pod \"a1a35ad9-a937-4d01-961b-a3488303610e\" (UID: \"a1a35ad9-a937-4d01-961b-a3488303610e\") "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.883368 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.883475 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.883541 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2b25\" (UniqueName: \"kubernetes.io/projected/fcedbed9-f0d1-4c33-b601-50da4f37c971-kube-api-access-h2b25\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.884577 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eb635e2d-e92d-4f18-be4b-972e388e8c25" (UID: "eb635e2d-e92d-4f18-be4b-972e388e8c25"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.885105 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a1a35ad9-a937-4d01-961b-a3488303610e" (UID: "a1a35ad9-a937-4d01-961b-a3488303610e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.885481 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "a1a35ad9-a937-4d01-961b-a3488303610e" (UID: "a1a35ad9-a937-4d01-961b-a3488303610e"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.885863 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eb635e2d-e92d-4f18-be4b-972e388e8c25" (UID: "eb635e2d-e92d-4f18-be4b-972e388e8c25"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.886347 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "a1a35ad9-a937-4d01-961b-a3488303610e" (UID: "a1a35ad9-a937-4d01-961b-a3488303610e"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.886786 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1a35ad9-a937-4d01-961b-a3488303610e" (UID: "a1a35ad9-a937-4d01-961b-a3488303610e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.890890 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-kube-api-access-lsml4" (OuterVolumeSpecName: "kube-api-access-lsml4") pod "eb635e2d-e92d-4f18-be4b-972e388e8c25" (UID: "eb635e2d-e92d-4f18-be4b-972e388e8c25"). InnerVolumeSpecName "kube-api-access-lsml4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.910449 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a35ad9-a937-4d01-961b-a3488303610e-kube-api-access-29chb" (OuterVolumeSpecName: "kube-api-access-29chb") pod "a1a35ad9-a937-4d01-961b-a3488303610e" (UID: "a1a35ad9-a937-4d01-961b-a3488303610e"). InnerVolumeSpecName "kube-api-access-29chb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.923845 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "eb635e2d-e92d-4f18-be4b-972e388e8c25" (UID: "eb635e2d-e92d-4f18-be4b-972e388e8c25"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.933934 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "a1a35ad9-a937-4d01-961b-a3488303610e" (UID: "a1a35ad9-a937-4d01-961b-a3488303610e"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.944140 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3699-account-create-update-9vvpr" event={"ID":"c13ba380-e3f6-4322-a434-7b69da9e4e3f","Type":"ContainerStarted","Data":"4caa7dcecccb1554befa146ca298dee4948c88c2173e347aef8748f425492cf8"}
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.958510 5033 generic.go:334] "Generic (PLEG): container finished" podID="a1a35ad9-a937-4d01-961b-a3488303610e" containerID="364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9" exitCode=0
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.958636 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a1a35ad9-a937-4d01-961b-a3488303610e","Type":"ContainerDied","Data":"364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9"}
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.958674 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a1a35ad9-a937-4d01-961b-a3488303610e","Type":"ContainerDied","Data":"dcc888e2710bee66d73e2bd3cf8c8e91464ec13f8c0de19b6adfd024271004aa"}
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.958693 5033 scope.go:117] "RemoveContainer" containerID="364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.958850 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.970951 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "fcedbed9-f0d1-4c33-b601-50da4f37c971" (UID: "fcedbed9-f0d1-4c33-b601-50da4f37c971"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.972737 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h" event={"ID":"eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8","Type":"ContainerDied","Data":"e2992f9194b8ed7eeaf8b826192cb625bbf131c35506c9d3572d8183375270b0"}
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.972877 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-564dc86f78-j9p7h"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.981857 5033 generic.go:334] "Generic (PLEG): container finished" podID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerID="9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa" exitCode=0
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.981899 5033 generic.go:334] "Generic (PLEG): container finished" podID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerID="6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765" exitCode=0
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.982050 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-c4c558dd7-x5jl5"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.982514 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c4c558dd7-x5jl5" event={"ID":"eb635e2d-e92d-4f18-be4b-972e388e8c25","Type":"ContainerDied","Data":"9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa"}
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.982550 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c4c558dd7-x5jl5" event={"ID":"eb635e2d-e92d-4f18-be4b-972e388e8c25","Type":"ContainerDied","Data":"6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765"}
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.982561 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c4c558dd7-x5jl5" event={"ID":"eb635e2d-e92d-4f18-be4b-972e388e8c25","Type":"ContainerDied","Data":"b5d444dfc4ae3c7793f31a10c5f76fd337e67d0e948feee939f1da607736576e"}
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.983714 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1a35ad9-a937-4d01-961b-a3488303610e" (UID: "a1a35ad9-a937-4d01-961b-a3488303610e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985160 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985176 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsml4\" (UniqueName: \"kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-kube-api-access-lsml4\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985186 5033 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-kolla-config\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985196 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-generated\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985205 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985214 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985222 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-config-data-default\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985231 5033 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985239 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb635e2d-e92d-4f18-be4b-972e388e8c25-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.985234 5033 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 26 16:07:00 crc kubenswrapper[5033]: E0226 16:07:00.985334 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data podName:94562cbd-1b98-4b30-9a97-f26fb103bb72 nodeName:}" failed. No retries permitted until 2026-02-26 16:07:04.985313634 +0000 UTC m=+1552.197912284 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data") pod "rabbitmq-cell1-server-0" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72") : configmap "rabbitmq-cell1-config-data" not found
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985249 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1a35ad9-a937-4d01-961b-a3488303610e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985656 5033 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb635e2d-e92d-4f18-be4b-972e388e8c25-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.985674 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29chb\" (UniqueName: \"kubernetes.io/projected/a1a35ad9-a937-4d01-961b-a3488303610e-kube-api-access-29chb\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.993925 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9702-account-create-update-89h9d"
Feb 26 16:07:00 crc kubenswrapper[5033]: I0226 16:07:00.993997 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9702-account-create-update-89h9d" event={"ID":"907a1311-765c-4cd1-8e92-ed545e59c736","Type":"ContainerDied","Data":"750ef9747624e6b358021328914a35ef482b6362831d07b1e96bc487f1becf1b"}
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.015113 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-czhv8" event={"ID":"bd95e3d9-8fc5-4f14-a009-cd1749743679","Type":"ContainerStarted","Data":"24299c16c292f7a9965a7905be0a3de8da162ef546f6ff8af5a6a1872eed04c9"}
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.015728 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-czhv8" event={"ID":"bd95e3d9-8fc5-4f14-a009-cd1749743679","Type":"ContainerStarted","Data":"98a8ae6ca046024521c9f896d30f4398fee105467c89b057553b81c56c1d4d84"}
Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.018753 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5 is running failed: container process not found" containerID="8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.020433 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5 is running failed: container process not found" containerID="8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.023178 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5 is running failed: container process not found" containerID="8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.023243 5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d989fd27-f36c-4503-a400-e09bb4e70398" containerName="nova-scheduler-scheduler"
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.023480 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fdf9-account-create-update-lshpf" event={"ID":"540a469d-81f3-40f5-9826-27384a03ce2d","Type":"ContainerStarted","Data":"c3d562a204d3a5727636a6f222bf37664bf9a81038b19ba4c86ea6df4568383b"}
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.035675 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-564dc86f78-j9p7h"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.039437 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c49db967-q8hjf" event={"ID":"50dfde4d-180f-4679-b02b-62a9592eb237","Type":"ContainerDied","Data":"aae91e5df7617b6ed34e30efc4eb81b28d75b8e331eae519cfc2026fb556154d"}
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.039531 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5c49db967-q8hjf"
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.045776 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.050095 5033 generic.go:334] "Generic (PLEG): container finished" podID="fcedbed9-f0d1-4c33-b601-50da4f37c971" containerID="adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718" exitCode=0
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.050226 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fcedbed9-f0d1-4c33-b601-50da4f37c971","Type":"ContainerDied","Data":"adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718"}
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.050272 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fcedbed9-f0d1-4c33-b601-50da4f37c971","Type":"ContainerDied","Data":"ecd9f15b749cf18ee568635a8a33c7bc28bd0df9ed382523cdeef7116405b0be"}
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.055711 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.058941 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-564dc86f78-j9p7h"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.059093 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "fcedbed9-f0d1-4c33-b601-50da4f37c971" (UID: "fcedbed9-f0d1-4c33-b601-50da4f37c971"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.064430 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-czhv8" podStartSLOduration=6.064361853 podStartE2EDuration="6.064361853s" podCreationTimestamp="2026-02-26 16:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:01.027067579 +0000 UTC m=+1548.239666229" watchObservedRunningTime="2026-02-26 16:07:01.064361853 +0000 UTC m=+1548.276960503"
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.066157 5033 scope.go:117] "RemoveContainer" containerID="a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb"
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.066334 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6a31-account-create-update-gbdhh" event={"ID":"90168acb-f862-4f73-bb8a-dd5b308977e1","Type":"ContainerStarted","Data":"4b18d2ee9f3fe2dc97c4a16429a82d44a4d0ee14160c0d366d7c1db9ab4fca54"}
Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.089784 5033 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Feb 26 16:07:01 crc kubenswrapper[5033]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-26T16:06:59Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock)
Feb 26 16:07:01 crc kubenswrapper[5033]: /etc/init.d/functions: line 589: 442 Alarm clock "$@"
Feb 26 16:07:01 crc kubenswrapper[5033]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-fxrwp" message=<
Feb 26 16:07:01 crc kubenswrapper[5033]: Exiting ovn-controller (1) [FAILED]
Feb 26 16:07:01 crc kubenswrapper[5033]: Killing ovn-controller (1) [ OK ]
Feb 26 16:07:01 crc kubenswrapper[5033]:
2026-02-26T16:06:59Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 26 16:07:01 crc kubenswrapper[5033]: /etc/init.d/functions: line 589: 442 Alarm clock "$@" Feb 26 16:07:01 crc kubenswrapper[5033]: > Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.089830 5033 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 26 16:07:01 crc kubenswrapper[5033]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-26T16:06:59Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 26 16:07:01 crc kubenswrapper[5033]: /etc/init.d/functions: line 589: 442 Alarm clock "$@" Feb 26 16:07:01 crc kubenswrapper[5033]: > pod="openstack/ovn-controller-fxrwp" podUID="44171111-32bf-415d-97c9-f966209cd0a0" containerName="ovn-controller" containerID="cri-o://89deba044f841c417bb427e36c48c0c1ad7827ea068666457c484ac571cf5b5c" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.089869 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-fxrwp" podUID="44171111-32bf-415d-97c9-f966209cd0a0" containerName="ovn-controller" containerID="cri-o://89deba044f841c417bb427e36c48c0c1ad7827ea068666457c484ac571cf5b5c" gracePeriod=27 Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.090111 5033 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcedbed9-f0d1-4c33-b601-50da4f37c971-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.090147 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.093216 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fxrwp" podUID="44171111-32bf-415d-97c9-f966209cd0a0" containerName="ovn-controller" 
probeResult="failure" output="" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.106442 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eb635e2d-e92d-4f18-be4b-972e388e8c25" (UID: "eb635e2d-e92d-4f18-be4b-972e388e8c25"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.107839 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb635e2d-e92d-4f18-be4b-972e388e8c25" (UID: "eb635e2d-e92d-4f18-be4b-972e388e8c25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.138878 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-config-data" (OuterVolumeSpecName: "config-data") pod "eb635e2d-e92d-4f18-be4b-972e388e8c25" (UID: "eb635e2d-e92d-4f18-be4b-972e388e8c25"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.148552 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9702-account-create-update-89h9d"] Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.178041 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9702-account-create-update-89h9d"] Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.191693 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.191728 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.191742 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.192714 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "a1a35ad9-a937-4d01-961b-a3488303610e" (UID: "a1a35ad9-a937-4d01-961b-a3488303610e"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.197098 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eb635e2d-e92d-4f18-be4b-972e388e8c25" (UID: "eb635e2d-e92d-4f18-be4b-972e388e8c25"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.260165 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.275181 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="ceilometer-central-agent" containerID="cri-o://efe61e5ba101f92c1e8b4416d9d0570b31c5948ae215ef72723140bf034dbde5" gracePeriod=30 Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.275591 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="proxy-httpd" containerID="cri-o://fd787498335d14e2033ee5b470f24eb5998e703815bbba943d72e487ec251728" gracePeriod=30 Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.275643 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="sg-core" containerID="cri-o://274baa6f0fac074022c217f76e1ef56693c6f2d00d433a2f9b4042c158da447b" gracePeriod=30 Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.275678 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="ceilometer-notification-agent" 
containerID="cri-o://7c2836b72b99580d0853eb6d7ae0794bc4651ce530042e155abe1f4de07321f5" gracePeriod=30 Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.326838 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb635e2d-e92d-4f18-be4b-972e388e8c25-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.327755 5033 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1a35ad9-a937-4d01-961b-a3488303610e-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.341488 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-6d648bbbdd-g6xz5" podUID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.155:8778/\": dial tcp 10.217.0.155:8778: connect: connection refused" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.341829 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-6d648bbbdd-g6xz5" podUID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.155:8778/\": dial tcp 10.217.0.155:8778: connect: connection refused" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.374734 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.375073 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="783187df-a438-4fb2-97be-3565e08e88ac" containerName="kube-state-metrics" containerID="cri-o://33f8af65abd2984345edb8a627a3f3c55b944bdf81de100064126a81f6d0b224" gracePeriod=30 Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.479735 5033 scope.go:117] "RemoveContainer" 
containerID="364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.484009 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9\": container with ID starting with 364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9 not found: ID does not exist" containerID="364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.484052 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9"} err="failed to get container status \"364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9\": rpc error: code = NotFound desc = could not find container \"364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9\": container with ID starting with 364e6ae38fc592df77430d5666c9c34fab4b516d22d9e62cba60f9e44a6e33b9 not found: ID does not exist" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.484078 5033 scope.go:117] "RemoveContainer" containerID="a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.484607 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb\": container with ID starting with a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb not found: ID does not exist" containerID="a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.484633 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb"} err="failed to get container status \"a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb\": rpc error: code = NotFound desc = could not find container \"a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb\": container with ID starting with a7db8d0fda6064ce2ea4f2f0bffdfc979f83285bb234c5fb74261b6ab01742bb not found: ID does not exist" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.484650 5033 scope.go:117] "RemoveContainer" containerID="21a665deef8b6a56264f1e3d28a3709293482a49500e908c4c8d0cd127477c4b" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.509685 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.169:8776/healthcheck\": read tcp 10.217.0.2:47226->10.217.0.169:8776: read: connection reset by peer" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.604097 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-65a8-account-create-update-md5x4"] Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.840098 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" path="/var/lib/kubelet/pods/28081635-21dd-4dbf-ae6a-354bbad0dbdd/volumes" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.841001 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45a330af-c614-4321-8534-060b7e5fb93f" path="/var/lib/kubelet/pods/45a330af-c614-4321-8534-060b7e5fb93f/volumes" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.841617 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a492ffb-344b-445f-a93c-bff302e85b06" path="/var/lib/kubelet/pods/8a492ffb-344b-445f-a93c-bff302e85b06/volumes" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 
16:07:01.842743 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="907a1311-765c-4cd1-8e92-ed545e59c736" path="/var/lib/kubelet/pods/907a1311-765c-4cd1-8e92-ed545e59c736/volumes" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.843319 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" path="/var/lib/kubelet/pods/de38b113-f8c9-4c5d-913e-f0efc7a7b5ef/volumes" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.843941 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" path="/var/lib/kubelet/pods/eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8/volumes" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.845602 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-65a8-account-create-update-md5x4"] Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.845632 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.845649 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-65a8-account-create-update-h8g9k"] Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846094 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" containerName="extract-content" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846114 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" containerName="extract-content" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846130 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerName="proxy-server" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846137 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerName="proxy-server" Feb 26 
16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846151 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerName="proxy-httpd" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846156 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerName="proxy-httpd" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846164 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a330af-c614-4321-8534-060b7e5fb93f" containerName="openstack-network-exporter" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846170 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a330af-c614-4321-8534-060b7e5fb93f" containerName="openstack-network-exporter" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846179 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a35ad9-a937-4d01-961b-a3488303610e" containerName="galera" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846185 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a35ad9-a937-4d01-961b-a3488303610e" containerName="galera" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846195 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" containerName="openstack-network-exporter" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846202 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" containerName="openstack-network-exporter" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846214 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" containerName="registry-server" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846220 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" containerName="registry-server" Feb 
26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846228 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" containerName="barbican-keystone-listener-log" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846234 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" containerName="barbican-keystone-listener-log" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846244 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" containerName="barbican-keystone-listener" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846269 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" containerName="barbican-keystone-listener" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846286 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a492ffb-344b-445f-a93c-bff302e85b06" containerName="dnsmasq-dns" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846294 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a492ffb-344b-445f-a93c-bff302e85b06" containerName="dnsmasq-dns" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846318 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50dfde4d-180f-4679-b02b-62a9592eb237" containerName="barbican-worker-log" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846329 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="50dfde4d-180f-4679-b02b-62a9592eb237" containerName="barbican-worker-log" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846340 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a492ffb-344b-445f-a93c-bff302e85b06" containerName="init" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846347 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a492ffb-344b-445f-a93c-bff302e85b06" 
containerName="init" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846362 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50dfde4d-180f-4679-b02b-62a9592eb237" containerName="barbican-worker" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846370 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="50dfde4d-180f-4679-b02b-62a9592eb237" containerName="barbican-worker" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846384 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" containerName="ovsdbserver-sb" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846391 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" containerName="ovsdbserver-sb" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846404 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a35ad9-a937-4d01-961b-a3488303610e" containerName="mysql-bootstrap" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846412 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a35ad9-a937-4d01-961b-a3488303610e" containerName="mysql-bootstrap" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846428 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" containerName="extract-utilities" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846433 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" containerName="extract-utilities" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846441 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcedbed9-f0d1-4c33-b601-50da4f37c971" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846447 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcedbed9-f0d1-4c33-b601-50da4f37c971" 
containerName="nova-cell1-novncproxy-novncproxy" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846455 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" containerName="openstack-network-exporter" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846461 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" containerName="openstack-network-exporter" Feb 26 16:07:01 crc kubenswrapper[5033]: E0226 16:07:01.846472 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" containerName="ovsdbserver-nb" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846478 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" containerName="ovsdbserver-nb" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846650 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" containerName="openstack-network-exporter" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846668 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerName="proxy-server" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846685 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a492ffb-344b-445f-a93c-bff302e85b06" containerName="dnsmasq-dns" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846698 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="50dfde4d-180f-4679-b02b-62a9592eb237" containerName="barbican-worker" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846711 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcedbed9-f0d1-4c33-b601-50da4f37c971" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846719 5033 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="50dfde4d-180f-4679-b02b-62a9592eb237" containerName="barbican-worker-log" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846730 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" containerName="barbican-keystone-listener" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846775 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="45a330af-c614-4321-8534-060b7e5fb93f" containerName="openstack-network-exporter" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846795 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4dc26b2-5d4e-4d28-af34-67307512a4cd" containerName="registry-server" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846805 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="28081635-21dd-4dbf-ae6a-354bbad0dbdd" containerName="ovsdbserver-nb" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846858 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" containerName="ovsdbserver-sb" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846873 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="de38b113-f8c9-4c5d-913e-f0efc7a7b5ef" containerName="openstack-network-exporter" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846884 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb24a7f0-3a2d-4337-83aa-3c244c2bd3c8" containerName="barbican-keystone-listener-log" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846900 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb635e2d-e92d-4f18-be4b-972e388e8c25" containerName="proxy-httpd" Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.846908 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a35ad9-a937-4d01-961b-a3488303610e" containerName="galera" Feb 26 16:07:01 crc 
kubenswrapper[5033]: I0226 16:07:01.847608 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-65a8-account-create-update-h8g9k"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.847629 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mnxgk"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.847643 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mnxgk"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.847657 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-9xg9v"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.847671 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-9xg9v"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.847684 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7d658b969f-npt2l"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.848131 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.848166 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-65a8-account-create-update-h8g9k"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.848178 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-zwrrq"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.851887 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-zwrrq"]
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.852124 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7d658b969f-npt2l" podUID="8e80c8c9-0f28-488c-9b68-ff4d379f9059" containerName="keystone-api" containerID="cri-o://e93416b1954dbbe264e3883541d6c591523a47b3cbef0c1e3095a85863fc60b8" gracePeriod=30
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.855961 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65a8-account-create-update-h8g9k"
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.856076 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="be41eafa-8738-4d4a-9f9e-592c105bc766" containerName="memcached" containerID="cri-o://067a1b260955a76eecc32e5016cd673aa068e9dd6e33251557d30945591fa778" gracePeriod=30
Feb 26 16:07:01 crc kubenswrapper[5033]: I0226 16:07:01.886509 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-czhv8"]
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.034947 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-86984d6b66-tg6tl" podUID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": dial tcp 10.217.0.162:9311: connect: connection refused"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.035241 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-86984d6b66-tg6tl" podUID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": dial tcp 10.217.0.162:9311: connect: connection refused"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.086661 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": read tcp 10.217.0.2:47036->10.217.0.208:8775: read: connection reset by peer"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.087015 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": read tcp 10.217.0.2:47024->10.217.0.208:8775: read: connection reset by peer"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.092562 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fxrwp_44171111-32bf-415d-97c9-f966209cd0a0/ovn-controller/0.log"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.092605 5033 generic.go:334] "Generic (PLEG): container finished" podID="44171111-32bf-415d-97c9-f966209cd0a0" containerID="89deba044f841c417bb427e36c48c0c1ad7827ea068666457c484ac571cf5b5c" exitCode=143
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.092680 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fxrwp" event={"ID":"44171111-32bf-415d-97c9-f966209cd0a0","Type":"ContainerDied","Data":"89deba044f841c417bb427e36c48c0c1ad7827ea068666457c484ac571cf5b5c"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.100151 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3699-account-create-update-9vvpr" event={"ID":"c13ba380-e3f6-4322-a434-7b69da9e4e3f","Type":"ContainerDied","Data":"4caa7dcecccb1554befa146ca298dee4948c88c2173e347aef8748f425492cf8"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.100190 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4caa7dcecccb1554befa146ca298dee4948c88c2173e347aef8748f425492cf8"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.127110 5033 generic.go:334] "Generic (PLEG): container finished" podID="783187df-a438-4fb2-97be-3565e08e88ac" containerID="33f8af65abd2984345edb8a627a3f3c55b944bdf81de100064126a81f6d0b224" exitCode=2
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.127191 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"783187df-a438-4fb2-97be-3565e08e88ac","Type":"ContainerDied","Data":"33f8af65abd2984345edb8a627a3f3c55b944bdf81de100064126a81f6d0b224"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.155477 5033 generic.go:334] "Generic (PLEG): container finished" podID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" containerID="063b18146099b5100ca56d54d72414c86d85a242467f6339a74297d196991539" exitCode=0
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.155547 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ab4047a-07ab-4c15-88eb-7dbefe5bea78","Type":"ContainerDied","Data":"063b18146099b5100ca56d54d72414c86d85a242467f6339a74297d196991539"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.164880 5033 generic.go:334] "Generic (PLEG): container finished" podID="bd95e3d9-8fc5-4f14-a009-cd1749743679" containerID="24299c16c292f7a9965a7905be0a3de8da162ef546f6ff8af5a6a1872eed04c9" exitCode=1
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.164945 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-czhv8" event={"ID":"bd95e3d9-8fc5-4f14-a009-cd1749743679","Type":"ContainerDied","Data":"24299c16c292f7a9965a7905be0a3de8da162ef546f6ff8af5a6a1872eed04c9"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.165445 5033 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-czhv8" secret="" err="secret \"galera-openstack-dockercfg-q7jlj\" not found"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.165480 5033 scope.go:117] "RemoveContainer" containerID="24299c16c292f7a9965a7905be0a3de8da162ef546f6ff8af5a6a1872eed04c9"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.210994 5033 generic.go:334] "Generic (PLEG): container finished" podID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerID="31c65a36bdf508f77701ec37bf337b3a58e6c854126c11d769248ce82adecb01" exitCode=0
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.211103 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86984d6b66-tg6tl" event={"ID":"9646add7-5da6-4590-9394-b4325e8ebd2d","Type":"ContainerDied","Data":"31c65a36bdf508f77701ec37bf337b3a58e6c854126c11d769248ce82adecb01"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.231770 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6a31-account-create-update-gbdhh" event={"ID":"90168acb-f862-4f73-bb8a-dd5b308977e1","Type":"ContainerDied","Data":"4b18d2ee9f3fe2dc97c4a16429a82d44a4d0ee14160c0d366d7c1db9ab4fca54"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.231805 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b18d2ee9f3fe2dc97c4a16429a82d44a4d0ee14160c0d366d7c1db9ab4fca54"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.242581 5033 generic.go:334] "Generic (PLEG): container finished" podID="d989fd27-f36c-4503-a400-e09bb4e70398" containerID="8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5" exitCode=0
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.242686 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d989fd27-f36c-4503-a400-e09bb4e70398","Type":"ContainerDied","Data":"8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.242716 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d989fd27-f36c-4503-a400-e09bb4e70398","Type":"ContainerDied","Data":"0800dcfb803da44e95f2e664091861b1ee91295b19444a1da50a57c7fc5f4635"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.242729 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0800dcfb803da44e95f2e664091861b1ee91295b19444a1da50a57c7fc5f4635"
Feb 26 16:07:02 crc kubenswrapper[5033]: E0226 16:07:02.256975 5033 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Feb 26 16:07:02 crc kubenswrapper[5033]: E0226 16:07:02.257040 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts podName:bd95e3d9-8fc5-4f14-a009-cd1749743679 nodeName:}" failed. No retries permitted until 2026-02-26 16:07:02.757021441 +0000 UTC m=+1549.969620091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts") pod "root-account-create-update-czhv8" (UID: "bd95e3d9-8fc5-4f14-a009-cd1749743679") : configmap "openstack-scripts" not found
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.268403 5033 generic.go:334] "Generic (PLEG): container finished" podID="bf0b01b9-0085-4552-8a7d-1e7359dad79b" containerID="e00abe8f8586a8c6516438836f22e8cd867774bd04666cb5631a9962be629702" exitCode=0
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.268466 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bf0b01b9-0085-4552-8a7d-1e7359dad79b","Type":"ContainerDied","Data":"e00abe8f8586a8c6516438836f22e8cd867774bd04666cb5631a9962be629702"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.275748 5033 generic.go:334] "Generic (PLEG): container finished" podID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerID="37b6556400506b1236e7114115d1a4566edf6e5a76d32b1374c1c673791271cf" exitCode=0
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.275815 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d648bbbdd-g6xz5" event={"ID":"5ef30f16-84a1-4809-8ec4-0c36e315c5f3","Type":"ContainerDied","Data":"37b6556400506b1236e7114115d1a4566edf6e5a76d32b1374c1c673791271cf"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.282070 5033 generic.go:334] "Generic (PLEG): container finished" podID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerID="fd787498335d14e2033ee5b470f24eb5998e703815bbba943d72e487ec251728" exitCode=0
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.282098 5033 generic.go:334] "Generic (PLEG): container finished" podID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerID="274baa6f0fac074022c217f76e1ef56693c6f2d00d433a2f9b4042c158da447b" exitCode=2
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.282151 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd331de5-997f-4bb2-a032-d174a0bf059b","Type":"ContainerDied","Data":"fd787498335d14e2033ee5b470f24eb5998e703815bbba943d72e487ec251728"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.282178 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd331de5-997f-4bb2-a032-d174a0bf059b","Type":"ContainerDied","Data":"274baa6f0fac074022c217f76e1ef56693c6f2d00d433a2f9b4042c158da447b"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.287442 5033 generic.go:334] "Generic (PLEG): container finished" podID="998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" containerID="2f2bce733fa63476566ab5bf20d7a8fb1718f0f35fccb3db14100b8bf61575d0" exitCode=0
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.287507 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3","Type":"ContainerDied","Data":"2f2bce733fa63476566ab5bf20d7a8fb1718f0f35fccb3db14100b8bf61575d0"}
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.383713 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" containerName="galera" containerID="cri-o://5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396" gracePeriod=30
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.588760 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5c49db967-q8hjf"]
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.596988 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5c49db967-q8hjf"]
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.599340 5033 scope.go:117] "RemoveContainer" containerID="154a195b0bbcd9352b7c0c53282fb8b79ead74d9d7ef469042e66e9b52b0dc36"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.617405 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3699-account-create-update-9vvpr"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.635205 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.636884 5033 scope.go:117] "RemoveContainer" containerID="9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.658542 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.680586 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8b6t\" (UniqueName: \"kubernetes.io/projected/c13ba380-e3f6-4322-a434-7b69da9e4e3f-kube-api-access-t8b6t\") pod \"c13ba380-e3f6-4322-a434-7b69da9e4e3f\" (UID: \"c13ba380-e3f6-4322-a434-7b69da9e4e3f\") "
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.680716 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13ba380-e3f6-4322-a434-7b69da9e4e3f-operator-scripts\") pod \"c13ba380-e3f6-4322-a434-7b69da9e4e3f\" (UID: \"c13ba380-e3f6-4322-a434-7b69da9e4e3f\") "
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.690881 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c13ba380-e3f6-4322-a434-7b69da9e4e3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c13ba380-e3f6-4322-a434-7b69da9e4e3f" (UID: "c13ba380-e3f6-4322-a434-7b69da9e4e3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.700768 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13ba380-e3f6-4322-a434-7b69da9e4e3f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.708543 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13ba380-e3f6-4322-a434-7b69da9e4e3f-kube-api-access-t8b6t" (OuterVolumeSpecName: "kube-api-access-t8b6t") pod "c13ba380-e3f6-4322-a434-7b69da9e4e3f" (UID: "c13ba380-e3f6-4322-a434-7b69da9e4e3f"). InnerVolumeSpecName "kube-api-access-t8b6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.752035 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-c4c558dd7-x5jl5"]
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.774170 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-c4c558dd7-x5jl5"]
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.797335 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.804422 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8b6t\" (UniqueName: \"kubernetes.io/projected/c13ba380-e3f6-4322-a434-7b69da9e4e3f-kube-api-access-t8b6t\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:02 crc kubenswrapper[5033]: E0226 16:07:02.804569 5033 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Feb 26 16:07:02 crc kubenswrapper[5033]: E0226 16:07:02.804658 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts podName:bd95e3d9-8fc5-4f14-a009-cd1749743679 nodeName:}" failed. No retries permitted until 2026-02-26 16:07:03.804639178 +0000 UTC m=+1551.017237828 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts") pod "root-account-create-update-czhv8" (UID: "bd95e3d9-8fc5-4f14-a009-cd1749743679") : configmap "openstack-scripts" not found
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.807447 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.825851 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.836812 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65a8-account-create-update-h8g9k"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.860288 5033 scope.go:117] "RemoveContainer" containerID="6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.861868 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6a31-account-create-update-gbdhh"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.879295 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.905421 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpf4t\" (UniqueName: \"kubernetes.io/projected/d989fd27-f36c-4503-a400-e09bb4e70398-kube-api-access-vpf4t\") pod \"d989fd27-f36c-4503-a400-e09bb4e70398\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") "
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.905734 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-config-data\") pod \"d989fd27-f36c-4503-a400-e09bb4e70398\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") "
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.905818 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-combined-ca-bundle\") pod \"d989fd27-f36c-4503-a400-e09bb4e70398\" (UID: \"d989fd27-f36c-4503-a400-e09bb4e70398\") "
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.917878 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fdf9-account-create-update-lshpf"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.918727 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d648bbbdd-g6xz5"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.921048 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fxrwp_44171111-32bf-415d-97c9-f966209cd0a0/ovn-controller/0.log"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.921099 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fxrwp"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.925167 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d989fd27-f36c-4503-a400-e09bb4e70398-kube-api-access-vpf4t" (OuterVolumeSpecName: "kube-api-access-vpf4t") pod "d989fd27-f36c-4503-a400-e09bb4e70398" (UID: "d989fd27-f36c-4503-a400-e09bb4e70398"). InnerVolumeSpecName "kube-api-access-vpf4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.925927 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.958366 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-config-data" (OuterVolumeSpecName: "config-data") pod "d989fd27-f36c-4503-a400-e09bb4e70398" (UID: "d989fd27-f36c-4503-a400-e09bb4e70398"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:07:02 crc kubenswrapper[5033]: I0226 16:07:02.984163 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d989fd27-f36c-4503-a400-e09bb4e70398" (UID: "d989fd27-f36c-4503-a400-e09bb4e70398"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.010986 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data-custom\") pod \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.011037 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-log-ovn\") pod \"44171111-32bf-415d-97c9-f966209cd0a0\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.011054 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-config-data\") pod \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.011076 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/540a469d-81f3-40f5-9826-27384a03ce2d-operator-scripts\") pod \"540a469d-81f3-40f5-9826-27384a03ce2d\" (UID: \"540a469d-81f3-40f5-9826-27384a03ce2d\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.011096 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-combined-ca-bundle\") pod \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.011136 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-combined-ca-bundle\") pod \"783187df-a438-4fb2-97be-3565e08e88ac\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.011162 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5zjr\" (UniqueName: \"kubernetes.io/projected/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-kube-api-access-s5zjr\") pod \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.011192 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-scripts\") pod \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.011212 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-config\") pod \"783187df-a438-4fb2-97be-3565e08e88ac\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.011230 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndlpk\" (UniqueName: \"kubernetes.io/projected/783187df-a438-4fb2-97be-3565e08e88ac-kube-api-access-ndlpk\") pod \"783187df-a438-4fb2-97be-3565e08e88ac\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.012669 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn9n9\" (UniqueName: \"kubernetes.io/projected/44171111-32bf-415d-97c9-f966209cd0a0-kube-api-access-kn9n9\") pod \"44171111-32bf-415d-97c9-f966209cd0a0\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.012770 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run\") pod \"44171111-32bf-415d-97c9-f966209cd0a0\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.012852 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfn2j\" (UniqueName: \"kubernetes.io/projected/90168acb-f862-4f73-bb8a-dd5b308977e1-kube-api-access-jfn2j\") pod \"90168acb-f862-4f73-bb8a-dd5b308977e1\" (UID: \"90168acb-f862-4f73-bb8a-dd5b308977e1\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.012924 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44171111-32bf-415d-97c9-f966209cd0a0-scripts\") pod \"44171111-32bf-415d-97c9-f966209cd0a0\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.012995 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-logs\") pod \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.013062 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-public-tls-certs\") pod \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.013146 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-internal-tls-certs\") pod \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.013225 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run-ovn\") pod \"44171111-32bf-415d-97c9-f966209cd0a0\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.013332 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-combined-ca-bundle\") pod \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.015027 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "44171111-32bf-415d-97c9-f966209cd0a0" (UID: "44171111-32bf-415d-97c9-f966209cd0a0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.015654 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/540a469d-81f3-40f5-9826-27384a03ce2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "540a469d-81f3-40f5-9826-27384a03ce2d" (UID: "540a469d-81f3-40f5-9826-27384a03ce2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.015685 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run" (OuterVolumeSpecName: "var-run") pod "44171111-32bf-415d-97c9-f966209cd0a0" (UID: "44171111-32bf-415d-97c9-f966209cd0a0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.016095 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "44171111-32bf-415d-97c9-f966209cd0a0" (UID: "44171111-32bf-415d-97c9-f966209cd0a0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.018234 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-logs" (OuterVolumeSpecName: "logs") pod "5ef30f16-84a1-4809-8ec4-0c36e315c5f3" (UID: "5ef30f16-84a1-4809-8ec4-0c36e315c5f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.018582 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9ab4047a-07ab-4c15-88eb-7dbefe5bea78" (UID: "9ab4047a-07ab-4c15-88eb-7dbefe5bea78"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.019084 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-certs\") pod \"783187df-a438-4fb2-97be-3565e08e88ac\" (UID: \"783187df-a438-4fb2-97be-3565e08e88ac\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.022604 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44171111-32bf-415d-97c9-f966209cd0a0-scripts" (OuterVolumeSpecName: "scripts") pod "44171111-32bf-415d-97c9-f966209cd0a0" (UID: "44171111-32bf-415d-97c9-f966209cd0a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.026794 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90168acb-f862-4f73-bb8a-dd5b308977e1-operator-scripts\") pod \"90168acb-f862-4f73-bb8a-dd5b308977e1\" (UID: \"90168acb-f862-4f73-bb8a-dd5b308977e1\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.026838 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-combined-ca-bundle\") pod \"44171111-32bf-415d-97c9-f966209cd0a0\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.026883 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-logs\") pod \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.026914 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bchzp\" (UniqueName: \"kubernetes.io/projected/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-kube-api-access-bchzp\") pod \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.026952 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-scripts\") pod \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.026988 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs9bq\" (UniqueName: \"kubernetes.io/projected/540a469d-81f3-40f5-9826-27384a03ce2d-kube-api-access-bs9bq\") pod \"540a469d-81f3-40f5-9826-27384a03ce2d\" (UID: \"540a469d-81f3-40f5-9826-27384a03ce2d\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027005 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-internal-tls-certs\") pod \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027021 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-etc-machine-id\") pod \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027040 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-public-tls-certs\") pod \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\" (UID: \"5ef30f16-84a1-4809-8ec4-0c36e315c5f3\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027099 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data\") pod \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\" (UID: \"9ab4047a-07ab-4c15-88eb-7dbefe5bea78\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027131 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-ovn-controller-tls-certs\") pod \"44171111-32bf-415d-97c9-f966209cd0a0\" (UID: \"44171111-32bf-415d-97c9-f966209cd0a0\") "
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027686 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027703 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d989fd27-f36c-4503-a400-e09bb4e70398-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027713 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027722 5033 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027731 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/540a469d-81f3-40f5-9826-27384a03ce2d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027740 5033 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027748 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44171111-32bf-415d-97c9-f966209cd0a0-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027755 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-logs\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027763 5033 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/44171111-32bf-415d-97c9-f966209cd0a0-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.027771 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpf4t\" (UniqueName: \"kubernetes.io/projected/d989fd27-f36c-4503-a400-e09bb4e70398-kube-api-access-vpf4t\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.030738 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-logs" (OuterVolumeSpecName: "logs") pod "9ab4047a-07ab-4c15-88eb-7dbefe5bea78" (UID: "9ab4047a-07ab-4c15-88eb-7dbefe5bea78"). InnerVolumeSpecName "logs".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.030990 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9ab4047a-07ab-4c15-88eb-7dbefe5bea78" (UID: "9ab4047a-07ab-4c15-88eb-7dbefe5bea78"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.035811 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-scripts" (OuterVolumeSpecName: "scripts") pod "5ef30f16-84a1-4809-8ec4-0c36e315c5f3" (UID: "5ef30f16-84a1-4809-8ec4-0c36e315c5f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.035833 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-scripts" (OuterVolumeSpecName: "scripts") pod "9ab4047a-07ab-4c15-88eb-7dbefe5bea78" (UID: "9ab4047a-07ab-4c15-88eb-7dbefe5bea78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.035835 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90168acb-f862-4f73-bb8a-dd5b308977e1-kube-api-access-jfn2j" (OuterVolumeSpecName: "kube-api-access-jfn2j") pod "90168acb-f862-4f73-bb8a-dd5b308977e1" (UID: "90168acb-f862-4f73-bb8a-dd5b308977e1"). InnerVolumeSpecName "kube-api-access-jfn2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.038774 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90168acb-f862-4f73-bb8a-dd5b308977e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90168acb-f862-4f73-bb8a-dd5b308977e1" (UID: "90168acb-f862-4f73-bb8a-dd5b308977e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.049482 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44171111-32bf-415d-97c9-f966209cd0a0-kube-api-access-kn9n9" (OuterVolumeSpecName: "kube-api-access-kn9n9") pod "44171111-32bf-415d-97c9-f966209cd0a0" (UID: "44171111-32bf-415d-97c9-f966209cd0a0"). InnerVolumeSpecName "kube-api-access-kn9n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.062382 5033 scope.go:117] "RemoveContainer" containerID="9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.062893 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-kube-api-access-s5zjr" (OuterVolumeSpecName: "kube-api-access-s5zjr") pod "9ab4047a-07ab-4c15-88eb-7dbefe5bea78" (UID: "9ab4047a-07ab-4c15-88eb-7dbefe5bea78"). InnerVolumeSpecName "kube-api-access-s5zjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.063336 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa\": container with ID starting with 9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa not found: ID does not exist" containerID="9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.063445 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa"} err="failed to get container status \"9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa\": rpc error: code = NotFound desc = could not find container \"9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa\": container with ID starting with 9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa not found: ID does not exist" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.063694 5033 scope.go:117] "RemoveContainer" containerID="6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.063499 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/540a469d-81f3-40f5-9826-27384a03ce2d-kube-api-access-bs9bq" (OuterVolumeSpecName: "kube-api-access-bs9bq") pod "540a469d-81f3-40f5-9826-27384a03ce2d" (UID: "540a469d-81f3-40f5-9826-27384a03ce2d"). InnerVolumeSpecName "kube-api-access-bs9bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.064467 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783187df-a438-4fb2-97be-3565e08e88ac-kube-api-access-ndlpk" (OuterVolumeSpecName: "kube-api-access-ndlpk") pod "783187df-a438-4fb2-97be-3565e08e88ac" (UID: "783187df-a438-4fb2-97be-3565e08e88ac"). InnerVolumeSpecName "kube-api-access-ndlpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.064679 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-kube-api-access-bchzp" (OuterVolumeSpecName: "kube-api-access-bchzp") pod "5ef30f16-84a1-4809-8ec4-0c36e315c5f3" (UID: "5ef30f16-84a1-4809-8ec4-0c36e315c5f3"). InnerVolumeSpecName "kube-api-access-bchzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.068485 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765\": container with ID starting with 6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765 not found: ID does not exist" containerID="6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.068712 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765"} err="failed to get container status \"6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765\": rpc error: code = NotFound desc = could not find container \"6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765\": container with ID starting with 
6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765 not found: ID does not exist" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.068809 5033 scope.go:117] "RemoveContainer" containerID="9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.070608 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa"} err="failed to get container status \"9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa\": rpc error: code = NotFound desc = could not find container \"9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa\": container with ID starting with 9c5a8ae1162f38de182e1f8f3afe665a13de1848543dbe8dd016db6f366a1afa not found: ID does not exist" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.070743 5033 scope.go:117] "RemoveContainer" containerID="6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.070922 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.073677 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765"} err="failed to get container status \"6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765\": rpc error: code = NotFound desc = could not find container \"6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765\": container with ID starting with 6f5ac929854a245eee36cb0b33662eed5e7571cb232114f2c6eb531e36319765 not found: ID does not exist" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.073722 5033 scope.go:117] "RemoveContainer" containerID="e792002563a7c6234a552b17da91049d9907ca9b6088ef5dc670bcb69af229ae" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.084308 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.095695 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.130385 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5zjr\" (UniqueName: \"kubernetes.io/projected/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-kube-api-access-s5zjr\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.130409 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.130420 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndlpk\" (UniqueName: \"kubernetes.io/projected/783187df-a438-4fb2-97be-3565e08e88ac-kube-api-access-ndlpk\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.130429 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn9n9\" (UniqueName: \"kubernetes.io/projected/44171111-32bf-415d-97c9-f966209cd0a0-kube-api-access-kn9n9\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.130438 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfn2j\" (UniqueName: \"kubernetes.io/projected/90168acb-f862-4f73-bb8a-dd5b308977e1-kube-api-access-jfn2j\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.130447 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90168acb-f862-4f73-bb8a-dd5b308977e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.130455 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc 
kubenswrapper[5033]: I0226 16:07:03.130462 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bchzp\" (UniqueName: \"kubernetes.io/projected/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-kube-api-access-bchzp\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.130470 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.130479 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs9bq\" (UniqueName: \"kubernetes.io/projected/540a469d-81f3-40f5-9826-27384a03ce2d-kube-api-access-bs9bq\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.130487 5033 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.153100 5033 scope.go:117] "RemoveContainer" containerID="f144e91d04c79983ad9d2426e8a65c28484d0f2b51c249d12afc1f6ee1a0c14a" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.160506 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "783187df-a438-4fb2-97be-3565e08e88ac" (UID: "783187df-a438-4fb2-97be-3565e08e88ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.160552 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4 is running failed: container process not found" containerID="ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.160845 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4 is running failed: container process not found" containerID="ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.161185 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4 is running failed: container process not found" containerID="ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.161241 5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="17f57fe7-555b-4119-b03e-fafb151494a6" containerName="nova-cell1-conductor-conductor" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.169104 5033 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ab4047a-07ab-4c15-88eb-7dbefe5bea78" (UID: "9ab4047a-07ab-4c15-88eb-7dbefe5bea78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231093 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-internal-tls-certs\") pod \"9646add7-5da6-4590-9394-b4325e8ebd2d\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231147 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-public-tls-certs\") pod \"9646add7-5da6-4590-9394-b4325e8ebd2d\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231169 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-nova-metadata-tls-certs\") pod \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231198 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frc4t\" (UniqueName: \"kubernetes.io/projected/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-kube-api-access-frc4t\") pod \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231283 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-internal-tls-certs\") pod \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231301 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q9gx\" (UniqueName: \"kubernetes.io/projected/9646add7-5da6-4590-9394-b4325e8ebd2d-kube-api-access-4q9gx\") pod \"9646add7-5da6-4590-9394-b4325e8ebd2d\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231328 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-config-data\") pod \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231356 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231387 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9646add7-5da6-4590-9394-b4325e8ebd2d-logs\") pod \"9646add7-5da6-4590-9394-b4325e8ebd2d\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231403 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data\") pod \"9646add7-5da6-4590-9394-b4325e8ebd2d\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231423 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-logs\") pod \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231487 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-httpd-run\") pod \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231511 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgfhn\" (UniqueName: \"kubernetes.io/projected/bf0b01b9-0085-4552-8a7d-1e7359dad79b-kube-api-access-vgfhn\") pod \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231540 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-config-data\") pod \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231556 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-logs\") pod \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231583 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-combined-ca-bundle\") pod \"9646add7-5da6-4590-9394-b4325e8ebd2d\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " Feb 26 
16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231620 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-combined-ca-bundle\") pod \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231658 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data-custom\") pod \"9646add7-5da6-4590-9394-b4325e8ebd2d\" (UID: \"9646add7-5da6-4590-9394-b4325e8ebd2d\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231700 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-scripts\") pod \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\" (UID: \"bf0b01b9-0085-4552-8a7d-1e7359dad79b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.231729 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-combined-ca-bundle\") pod \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\" (UID: \"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.232053 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.232069 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 
16:07:03.234341 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-logs" (OuterVolumeSpecName: "logs") pod "bf0b01b9-0085-4552-8a7d-1e7359dad79b" (UID: "bf0b01b9-0085-4552-8a7d-1e7359dad79b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.234418 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9646add7-5da6-4590-9394-b4325e8ebd2d-logs" (OuterVolumeSpecName: "logs") pod "9646add7-5da6-4590-9394-b4325e8ebd2d" (UID: "9646add7-5da6-4590-9394-b4325e8ebd2d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.234849 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bf0b01b9-0085-4552-8a7d-1e7359dad79b" (UID: "bf0b01b9-0085-4552-8a7d-1e7359dad79b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.240577 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-logs" (OuterVolumeSpecName: "logs") pod "ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" (UID: "ea4c6e17-4b51-4387-98c8-d071c1fc7c5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.245606 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "783187df-a438-4fb2-97be-3565e08e88ac" (UID: "783187df-a438-4fb2-97be-3565e08e88ac"). 
InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.270986 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf0b01b9-0085-4552-8a7d-1e7359dad79b-kube-api-access-vgfhn" (OuterVolumeSpecName: "kube-api-access-vgfhn") pod "bf0b01b9-0085-4552-8a7d-1e7359dad79b" (UID: "bf0b01b9-0085-4552-8a7d-1e7359dad79b"). InnerVolumeSpecName "kube-api-access-vgfhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.275657 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "bf0b01b9-0085-4552-8a7d-1e7359dad79b" (UID: "bf0b01b9-0085-4552-8a7d-1e7359dad79b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.298819 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9646add7-5da6-4590-9394-b4325e8ebd2d" (UID: "9646add7-5da6-4590-9394-b4325e8ebd2d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.319014 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-kube-api-access-frc4t" (OuterVolumeSpecName: "kube-api-access-frc4t") pod "ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" (UID: "ea4c6e17-4b51-4387-98c8-d071c1fc7c5a"). InnerVolumeSpecName "kube-api-access-frc4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.319104 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-scripts" (OuterVolumeSpecName: "scripts") pod "bf0b01b9-0085-4552-8a7d-1e7359dad79b" (UID: "bf0b01b9-0085-4552-8a7d-1e7359dad79b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.327654 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9646add7-5da6-4590-9394-b4325e8ebd2d-kube-api-access-4q9gx" (OuterVolumeSpecName: "kube-api-access-4q9gx") pod "9646add7-5da6-4590-9394-b4325e8ebd2d" (UID: "9646add7-5da6-4590-9394-b4325e8ebd2d"). InnerVolumeSpecName "kube-api-access-4q9gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334013 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q9gx\" (UniqueName: \"kubernetes.io/projected/9646add7-5da6-4590-9394-b4325e8ebd2d-kube-api-access-4q9gx\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334073 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334088 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9646add7-5da6-4590-9394-b4325e8ebd2d-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334100 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc 
kubenswrapper[5033]: I0226 16:07:03.334112 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf0b01b9-0085-4552-8a7d-1e7359dad79b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334124 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgfhn\" (UniqueName: \"kubernetes.io/projected/bf0b01b9-0085-4552-8a7d-1e7359dad79b-kube-api-access-vgfhn\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334134 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334144 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334153 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334164 5033 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334178 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frc4t\" (UniqueName: \"kubernetes.io/projected/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-kube-api-access-frc4t\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334753 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="bd95e3d9-8fc5-4f14-a009-cd1749743679" containerID="3436199015eba2036bfcbc2a3bbd95bd89f8131adcc1124651fd97c7916dfecf" exitCode=1 Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.334837 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-czhv8" event={"ID":"bd95e3d9-8fc5-4f14-a009-cd1749743679","Type":"ContainerDied","Data":"3436199015eba2036bfcbc2a3bbd95bd89f8131adcc1124651fd97c7916dfecf"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.341645 5033 generic.go:334] "Generic (PLEG): container finished" podID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerID="7c2836b72b99580d0853eb6d7ae0794bc4651ce530042e155abe1f4de07321f5" exitCode=0 Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.341685 5033 generic.go:334] "Generic (PLEG): container finished" podID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerID="efe61e5ba101f92c1e8b4416d9d0570b31c5948ae215ef72723140bf034dbde5" exitCode=0 Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.341736 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd331de5-997f-4bb2-a032-d174a0bf059b","Type":"ContainerDied","Data":"7c2836b72b99580d0853eb6d7ae0794bc4651ce530042e155abe1f4de07321f5"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.341768 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd331de5-997f-4bb2-a032-d174a0bf059b","Type":"ContainerDied","Data":"efe61e5ba101f92c1e8b4416d9d0570b31c5948ae215ef72723140bf034dbde5"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.341783 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fd331de5-997f-4bb2-a032-d174a0bf059b","Type":"ContainerDied","Data":"e16de7507f7e8a40917c116a17c4fb708d68c18d495c54e4b52aec253773b77b"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.341794 5033 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e16de7507f7e8a40917c116a17c4fb708d68c18d495c54e4b52aec253773b77b" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.349478 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb2a5173-ce4c-4d4a-a136-70356adabbb9","Type":"ContainerDied","Data":"b6eace0b2103a031f250c85a90a9746babaf3bcd0afd8cd89a42175fe2fb3877"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.348333 5033 generic.go:334] "Generic (PLEG): container finished" podID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerID="b6eace0b2103a031f250c85a90a9746babaf3bcd0afd8cd89a42175fe2fb3877" exitCode=0 Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.351564 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb2a5173-ce4c-4d4a-a136-70356adabbb9","Type":"ContainerDied","Data":"8d76e3a5afef1c114c747943f3f2a6468ea7e8c016af89ad72742973d77f7844"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.351613 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d76e3a5afef1c114c747943f3f2a6468ea7e8c016af89ad72742973d77f7844" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.359452 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d648bbbdd-g6xz5" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.359690 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d648bbbdd-g6xz5" event={"ID":"5ef30f16-84a1-4809-8ec4-0c36e315c5f3","Type":"ContainerDied","Data":"22427c3ebd79a2624636bc4e51acacd7a681350f748159da033bf50d2e3b1787"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.380732 5033 generic.go:334] "Generic (PLEG): container finished" podID="801b1613-3ed1-423d-85cf-6b87f6711034" containerID="594bc92b21cdaf12c49af3fc339f3186f938691726d83e69d1538a692373df92" exitCode=0 Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.380814 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"801b1613-3ed1-423d-85cf-6b87f6711034","Type":"ContainerDied","Data":"594bc92b21cdaf12c49af3fc339f3186f938691726d83e69d1538a692373df92"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.393763 5033 generic.go:334] "Generic (PLEG): container finished" podID="17f57fe7-555b-4119-b03e-fafb151494a6" containerID="ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4" exitCode=0 Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.393935 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"17f57fe7-555b-4119-b03e-fafb151494a6","Type":"ContainerDied","Data":"ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.400123 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.421494 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"bf0b01b9-0085-4552-8a7d-1e7359dad79b","Type":"ContainerDied","Data":"56ec88a7fa9446b4543b463ffdb1a7c796f944337d736fcb3994797aeedb19eb"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.421729 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.429802 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" (UID: "ea4c6e17-4b51-4387-98c8-d071c1fc7c5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.429801 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-config-data" (OuterVolumeSpecName: "config-data") pod "ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" (UID: "ea4c6e17-4b51-4387-98c8-d071c1fc7c5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.430550 5033 generic.go:334] "Generic (PLEG): container finished" podID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerID="4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166" exitCode=0 Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.430608 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a","Type":"ContainerDied","Data":"4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.430636 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ea4c6e17-4b51-4387-98c8-d071c1fc7c5a","Type":"ContainerDied","Data":"908ad1a560cc30affda4c6aead0dd22bd4e7d2a13900651cbf3603b49754f852"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.430690 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.433780 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"783187df-a438-4fb2-97be-3565e08e88ac","Type":"ContainerDied","Data":"6cf62275935f253d6f1c1359b42290db1153e8c9f13019cbba8b8a9eee589f3f"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.433862 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.435392 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.435411 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.435421 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.436689 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86984d6b66-tg6tl" event={"ID":"9646add7-5da6-4590-9394-b4325e8ebd2d","Type":"ContainerDied","Data":"056cff826fc8e45b52030e3b8c00067d7e0a0355ae304f9d8c32c8242e5947f2"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.436770 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86984d6b66-tg6tl" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.437922 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data" (OuterVolumeSpecName: "config-data") pod "9ab4047a-07ab-4c15-88eb-7dbefe5bea78" (UID: "9ab4047a-07ab-4c15-88eb-7dbefe5bea78"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.446877 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "783187df-a438-4fb2-97be-3565e08e88ac" (UID: "783187df-a438-4fb2-97be-3565e08e88ac"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.447357 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fxrwp_44171111-32bf-415d-97c9-f966209cd0a0/ovn-controller/0.log" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.447470 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fxrwp" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.447527 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fxrwp" event={"ID":"44171111-32bf-415d-97c9-f966209cd0a0","Type":"ContainerDied","Data":"325feb8e037e2632b533481e82a50ef6be3a245a2ff6fe4331faac72b59aded5"} Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.448232 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea is running failed: container process not found" containerID="b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.448570 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea is running failed: 
container process not found" containerID="b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.449310 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea is running failed: container process not found" containerID="b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.449390 5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="0dcb38b4-0144-434f-adc1-63a2b63565ee" containerName="nova-cell0-conductor-conductor" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.450040 5033 generic.go:334] "Generic (PLEG): container finished" podID="0dcb38b4-0144-434f-adc1-63a2b63565ee" containerID="b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea" exitCode=0 Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.450091 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0dcb38b4-0144-434f-adc1-63a2b63565ee","Type":"ContainerDied","Data":"b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.451031 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9ab4047a-07ab-4c15-88eb-7dbefe5bea78" (UID: "9ab4047a-07ab-4c15-88eb-7dbefe5bea78"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.454489 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ab4047a-07ab-4c15-88eb-7dbefe5bea78","Type":"ContainerDied","Data":"0be02a6fc7a422b8cd78e3b3ae0f7f1d02a6ae81055223b5327c59826ff6849f"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.454650 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.464634 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fdf9-account-create-update-lshpf" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.464645 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fdf9-account-create-update-lshpf" event={"ID":"540a469d-81f3-40f5-9826-27384a03ce2d","Type":"ContainerDied","Data":"c3d562a204d3a5727636a6f222bf37664bf9a81038b19ba4c86ea6df4568383b"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.467023 5033 generic.go:334] "Generic (PLEG): container finished" podID="be41eafa-8738-4d4a-9f9e-592c105bc766" containerID="067a1b260955a76eecc32e5016cd673aa068e9dd6e33251557d30945591fa778" exitCode=0 Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.467076 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"be41eafa-8738-4d4a-9f9e-592c105bc766","Type":"ContainerDied","Data":"067a1b260955a76eecc32e5016cd673aa068e9dd6e33251557d30945591fa778"} Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.469663 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6a31-account-create-update-gbdhh" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.469718 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-65a8-account-create-update-h8g9k" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.469716 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3699-account-create-update-9vvpr" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.469899 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.492520 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ef30f16-84a1-4809-8ec4-0c36e315c5f3" (UID: "5ef30f16-84a1-4809-8ec4-0c36e315c5f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.504727 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="65abe18a8fd80fc9112bf0561b66f917755c231568bc655d18e082d960532c5a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.507896 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="65abe18a8fd80fc9112bf0561b66f917755c231568bc655d18e082d960532c5a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.511299 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="65abe18a8fd80fc9112bf0561b66f917755c231568bc655d18e082d960532c5a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.511444 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="00a5a420-8682-4780-8349-3da44c8d7f1a" containerName="ovn-northd" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.512081 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bf0b01b9-0085-4552-8a7d-1e7359dad79b" (UID: "bf0b01b9-0085-4552-8a7d-1e7359dad79b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.518909 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5ef30f16-84a1-4809-8ec4-0c36e315c5f3" (UID: "5ef30f16-84a1-4809-8ec4-0c36e315c5f3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.521537 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "44171111-32bf-415d-97c9-f966209cd0a0" (UID: "44171111-32bf-415d-97c9-f966209cd0a0"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.523112 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9646add7-5da6-4590-9394-b4325e8ebd2d" (UID: "9646add7-5da6-4590-9394-b4325e8ebd2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.537437 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.537475 5033 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/783187df-a438-4fb2-97be-3565e08e88ac-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.537490 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.537504 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.537514 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.537525 5033 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.537535 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.537545 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.539742 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-config-data" (OuterVolumeSpecName: "config-data") pod "bf0b01b9-0085-4552-8a7d-1e7359dad79b" (UID: "bf0b01b9-0085-4552-8a7d-1e7359dad79b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.550751 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf0b01b9-0085-4552-8a7d-1e7359dad79b" (UID: "bf0b01b9-0085-4552-8a7d-1e7359dad79b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.564680 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9646add7-5da6-4590-9394-b4325e8ebd2d" (UID: "9646add7-5da6-4590-9394-b4325e8ebd2d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.573521 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44171111-32bf-415d-97c9-f966209cd0a0" (UID: "44171111-32bf-415d-97c9-f966209cd0a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.577092 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data" (OuterVolumeSpecName: "config-data") pod "9646add7-5da6-4590-9394-b4325e8ebd2d" (UID: "9646add7-5da6-4590-9394-b4325e8ebd2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.583788 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9646add7-5da6-4590-9394-b4325e8ebd2d" (UID: "9646add7-5da6-4590-9394-b4325e8ebd2d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.589873 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-config-data" (OuterVolumeSpecName: "config-data") pod "5ef30f16-84a1-4809-8ec4-0c36e315c5f3" (UID: "5ef30f16-84a1-4809-8ec4-0c36e315c5f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.600733 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" (UID: "ea4c6e17-4b51-4387-98c8-d071c1fc7c5a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.603889 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9ab4047a-07ab-4c15-88eb-7dbefe5bea78" (UID: "9ab4047a-07ab-4c15-88eb-7dbefe5bea78"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.607636 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5ef30f16-84a1-4809-8ec4-0c36e315c5f3" (UID: "5ef30f16-84a1-4809-8ec4-0c36e315c5f3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.640349 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.640393 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.640405 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.640415 5033 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.640705 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef30f16-84a1-4809-8ec4-0c36e315c5f3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.640729 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44171111-32bf-415d-97c9-f966209cd0a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.640740 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 
16:07:03.640752 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9646add7-5da6-4590-9394-b4325e8ebd2d-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.640764 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab4047a-07ab-4c15-88eb-7dbefe5bea78-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.640774 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf0b01b9-0085-4552-8a7d-1e7359dad79b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.645644 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41652b04-a199-4ceb-a87d-5734867a31b5" path="/var/lib/kubelet/pods/41652b04-a199-4ceb-a87d-5734867a31b5/volumes" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.649198 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50dfde4d-180f-4679-b02b-62a9592eb237" path="/var/lib/kubelet/pods/50dfde4d-180f-4679-b02b-62a9592eb237/volumes" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.650240 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ceab3ba-6b5a-4c0d-a81d-a17a9273837f" path="/var/lib/kubelet/pods/8ceab3ba-6b5a-4c0d-a81d-a17a9273837f/volumes" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.650901 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a35ad9-a937-4d01-961b-a3488303610e" path="/var/lib/kubelet/pods/a1a35ad9-a937-4d01-961b-a3488303610e/volumes" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.652179 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9de3c3b-afe5-4a88-99bd-28da4fe00092" 
path="/var/lib/kubelet/pods/c9de3c3b-afe5-4a88-99bd-28da4fe00092/volumes" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.652823 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8fd420c-3146-49a5-a9c5-69e3f01e8848" path="/var/lib/kubelet/pods/e8fd420c-3146-49a5-a9c5-69e3f01e8848/volumes" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.653397 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb635e2d-e92d-4f18-be4b-972e388e8c25" path="/var/lib/kubelet/pods/eb635e2d-e92d-4f18-be4b-972e388e8c25/volumes" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.654448 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcedbed9-f0d1-4c33-b601-50da4f37c971" path="/var/lib/kubelet/pods/fcedbed9-f0d1-4c33-b601-50da4f37c971/volumes" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.664538 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.670298 5033 scope.go:117] "RemoveContainer" containerID="adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.712582 5033 scope.go:117] "RemoveContainer" containerID="adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718" Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.712994 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718\": container with ID starting with adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718 not found: ID does not exist" containerID="adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.713038 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718"} err="failed to get container status \"adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718\": rpc error: code = NotFound desc = could not find container \"adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718\": container with ID starting with adb44919b1ccc6a88e5442d2c088d86a9f62b586fe5143b0139857b26519b718 not found: ID does not exist" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.713067 5033 scope.go:117] "RemoveContainer" containerID="24299c16c292f7a9965a7905be0a3de8da162ef546f6ff8af5a6a1872eed04c9" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.749745 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-scripts\") pod \"fd331de5-997f-4bb2-a032-d174a0bf059b\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.750112 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-sg-core-conf-yaml\") pod \"fd331de5-997f-4bb2-a032-d174a0bf059b\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.750146 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-log-httpd\") pod \"fd331de5-997f-4bb2-a032-d174a0bf059b\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.751466 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fd331de5-997f-4bb2-a032-d174a0bf059b" (UID: 
"fd331de5-997f-4bb2-a032-d174a0bf059b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.754445 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkp8w\" (UniqueName: \"kubernetes.io/projected/fd331de5-997f-4bb2-a032-d174a0bf059b-kube-api-access-dkp8w\") pod \"fd331de5-997f-4bb2-a032-d174a0bf059b\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.754502 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-combined-ca-bundle\") pod \"fd331de5-997f-4bb2-a032-d174a0bf059b\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.754522 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-ceilometer-tls-certs\") pod \"fd331de5-997f-4bb2-a032-d174a0bf059b\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.754608 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-config-data\") pod \"fd331de5-997f-4bb2-a032-d174a0bf059b\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.754629 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-run-httpd\") pod \"fd331de5-997f-4bb2-a032-d174a0bf059b\" (UID: \"fd331de5-997f-4bb2-a032-d174a0bf059b\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.754653 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-scripts" (OuterVolumeSpecName: "scripts") pod "fd331de5-997f-4bb2-a032-d174a0bf059b" (UID: "fd331de5-997f-4bb2-a032-d174a0bf059b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.755357 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.755375 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.755801 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fd331de5-997f-4bb2-a032-d174a0bf059b" (UID: "fd331de5-997f-4bb2-a032-d174a0bf059b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.758653 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd331de5-997f-4bb2-a032-d174a0bf059b-kube-api-access-dkp8w" (OuterVolumeSpecName: "kube-api-access-dkp8w") pod "fd331de5-997f-4bb2-a032-d174a0bf059b" (UID: "fd331de5-997f-4bb2-a032-d174a0bf059b"). InnerVolumeSpecName "kube-api-access-dkp8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.768666 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.775153 5033 scope.go:117] "RemoveContainer" containerID="37b6556400506b1236e7114115d1a4566edf6e5a76d32b1374c1c673791271cf" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.798300 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fd331de5-997f-4bb2-a032-d174a0bf059b" (UID: "fd331de5-997f-4bb2-a032-d174a0bf059b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.819022 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.821136 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.844312 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.869797 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fd331de5-997f-4bb2-a032-d174a0bf059b" (UID: "fd331de5-997f-4bb2-a032-d174a0bf059b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.881471 5033 scope.go:117] "RemoveContainer" containerID="e957c4c3c0035ebbddff5e5a96c325ad85985b3bd7c2e72eaa4ed32e100f9146" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.890047 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-kolla-config\") pod \"be41eafa-8738-4d4a-9f9e-592c105bc766\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.890115 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-scripts\") pod \"801b1613-3ed1-423d-85cf-6b87f6711034\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.890184 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-combined-ca-bundle\") pod \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.890253 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-httpd-run\") pod \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.890298 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb2a5173-ce4c-4d4a-a136-70356adabbb9-logs\") pod \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " Feb 26 16:07:03 crc kubenswrapper[5033]: 
I0226 16:07:03.892511 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801b1613-3ed1-423d-85cf-6b87f6711034-etc-machine-id\") pod \"801b1613-3ed1-423d-85cf-6b87f6711034\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.892602 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c26g5\" (UniqueName: \"kubernetes.io/projected/801b1613-3ed1-423d-85cf-6b87f6711034-kube-api-access-c26g5\") pod \"801b1613-3ed1-423d-85cf-6b87f6711034\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.892636 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-combined-ca-bundle\") pod \"be41eafa-8738-4d4a-9f9e-592c105bc766\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.892708 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-config-data\") pod \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.892753 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-combined-ca-bundle\") pod \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.892788 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data\") pod 
\"801b1613-3ed1-423d-85cf-6b87f6711034\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.892847 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-public-tls-certs\") pod \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.892896 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9jd7\" (UniqueName: \"kubernetes.io/projected/fb2a5173-ce4c-4d4a-a136-70356adabbb9-kube-api-access-z9jd7\") pod \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.892944 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-config-data\") pod \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.892998 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-logs\") pod \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.893027 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh6v9\" (UniqueName: \"kubernetes.io/projected/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-kube-api-access-sh6v9\") pod \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.893065 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-combined-ca-bundle\") pod \"801b1613-3ed1-423d-85cf-6b87f6711034\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.893095 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.893122 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data-custom\") pod \"801b1613-3ed1-423d-85cf-6b87f6711034\" (UID: \"801b1613-3ed1-423d-85cf-6b87f6711034\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.893160 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-config-data\") pod \"be41eafa-8738-4d4a-9f9e-592c105bc766\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.893199 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-internal-tls-certs\") pod \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\" (UID: \"fb2a5173-ce4c-4d4a-a136-70356adabbb9\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.893230 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-scripts\") pod \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 
16:07:03.893282 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-memcached-tls-certs\") pod \"be41eafa-8738-4d4a-9f9e-592c105bc766\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.893338 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-public-tls-certs\") pod \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\" (UID: \"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.893386 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9cz8\" (UniqueName: \"kubernetes.io/projected/be41eafa-8738-4d4a-9f9e-592c105bc766-kube-api-access-k9cz8\") pod \"be41eafa-8738-4d4a-9f9e-592c105bc766\" (UID: \"be41eafa-8738-4d4a-9f9e-592c105bc766\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.894356 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" (UID: "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.895189 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "be41eafa-8738-4d4a-9f9e-592c105bc766" (UID: "be41eafa-8738-4d4a-9f9e-592c105bc766"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.895472 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-scripts" (OuterVolumeSpecName: "scripts") pod "801b1613-3ed1-423d-85cf-6b87f6711034" (UID: "801b1613-3ed1-423d-85cf-6b87f6711034"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.896409 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkp8w\" (UniqueName: \"kubernetes.io/projected/fd331de5-997f-4bb2-a032-d174a0bf059b-kube-api-access-dkp8w\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.896445 5033 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.896466 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.896480 5033 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.896492 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.896510 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fd331de5-997f-4bb2-a032-d174a0bf059b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.896527 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.896621 5033 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.896750 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data podName:443e562a-aefe-46f3-a86f-59205f5849c7 nodeName:}" failed. No retries permitted until 2026-02-26 16:07:11.896718413 +0000 UTC m=+1559.109317073 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data") pod "rabbitmq-server-0" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7") : configmap "rabbitmq-config-data" not found Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.897543 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-logs" (OuterVolumeSpecName: "logs") pod "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" (UID: "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.899248 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801b1613-3ed1-423d-85cf-6b87f6711034-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "801b1613-3ed1-423d-85cf-6b87f6711034" (UID: "801b1613-3ed1-423d-85cf-6b87f6711034"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.900017 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb2a5173-ce4c-4d4a-a136-70356adabbb9-logs" (OuterVolumeSpecName: "logs") pod "fb2a5173-ce4c-4d4a-a136-70356adabbb9" (UID: "fb2a5173-ce4c-4d4a-a136-70356adabbb9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.915086 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3699-account-create-update-9vvpr"] Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.926371 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-config-data" (OuterVolumeSpecName: "config-data") pod "be41eafa-8738-4d4a-9f9e-592c105bc766" (UID: "be41eafa-8738-4d4a-9f9e-592c105bc766"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.926685 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.927996 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "801b1613-3ed1-423d-85cf-6b87f6711034" (UID: "801b1613-3ed1-423d-85cf-6b87f6711034"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.928914 5033 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 26 16:07:03 crc kubenswrapper[5033]: E0226 16:07:03.929005 5033 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts podName:bd95e3d9-8fc5-4f14-a009-cd1749743679 nodeName:}" failed. No retries permitted until 2026-02-26 16:07:05.928982773 +0000 UTC m=+1553.141581423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts") pod "root-account-create-update-czhv8" (UID: "bd95e3d9-8fc5-4f14-a009-cd1749743679") : configmap "openstack-scripts" not found Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.930923 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801b1613-3ed1-423d-85cf-6b87f6711034-kube-api-access-c26g5" (OuterVolumeSpecName: "kube-api-access-c26g5") pod "801b1613-3ed1-423d-85cf-6b87f6711034" (UID: "801b1613-3ed1-423d-85cf-6b87f6711034"). InnerVolumeSpecName "kube-api-access-c26g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.931621 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-kube-api-access-sh6v9" (OuterVolumeSpecName: "kube-api-access-sh6v9") pod "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" (UID: "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3"). InnerVolumeSpecName "kube-api-access-sh6v9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.931473 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3699-account-create-update-9vvpr"] Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.935675 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2a5173-ce4c-4d4a-a136-70356adabbb9-kube-api-access-z9jd7" (OuterVolumeSpecName: "kube-api-access-z9jd7") pod "fb2a5173-ce4c-4d4a-a136-70356adabbb9" (UID: "fb2a5173-ce4c-4d4a-a136-70356adabbb9"). InnerVolumeSpecName "kube-api-access-z9jd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.940774 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-scripts" (OuterVolumeSpecName: "scripts") pod "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" (UID: "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.941717 5033 scope.go:117] "RemoveContainer" containerID="e00abe8f8586a8c6516438836f22e8cd867774bd04666cb5631a9962be629702" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.943225 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" (UID: "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.944183 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be41eafa-8738-4d4a-9f9e-592c105bc766-kube-api-access-k9cz8" (OuterVolumeSpecName: "kube-api-access-k9cz8") pod "be41eafa-8738-4d4a-9f9e-592c105bc766" (UID: "be41eafa-8738-4d4a-9f9e-592c105bc766"). InnerVolumeSpecName "kube-api-access-k9cz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.944538 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb2a5173-ce4c-4d4a-a136-70356adabbb9" (UID: "fb2a5173-ce4c-4d4a-a136-70356adabbb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.951484 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-65a8-account-create-update-h8g9k"] Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.963071 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-65a8-account-create-update-h8g9k"] Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.972054 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.985092 5033 scope.go:117] "RemoveContainer" containerID="a46ddf9c39fa3f9620a8c8a0c8752e122ad6d05de939c29de0a166524be100e8" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.997596 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prxml\" (UniqueName: \"kubernetes.io/projected/0dcb38b4-0144-434f-adc1-63a2b63565ee-kube-api-access-prxml\") pod \"0dcb38b4-0144-434f-adc1-63a2b63565ee\" (UID: 
\"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.997934 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-config-data\") pod \"0dcb38b4-0144-434f-adc1-63a2b63565ee\" (UID: \"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.998056 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-combined-ca-bundle\") pod \"0dcb38b4-0144-434f-adc1-63a2b63565ee\" (UID: \"0dcb38b4-0144-434f-adc1-63a2b63565ee\") " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.998898 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9jd7\" (UniqueName: \"kubernetes.io/projected/fb2a5173-ce4c-4d4a-a136-70356adabbb9-kube-api-access-z9jd7\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.998970 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.999024 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh6v9\" (UniqueName: \"kubernetes.io/projected/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-kube-api-access-sh6v9\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.999084 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.999136 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.999201 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be41eafa-8738-4d4a-9f9e-592c105bc766-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:03 crc kubenswrapper[5033]: I0226 16:07:03.999268 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.005458 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9cz8\" (UniqueName: \"kubernetes.io/projected/be41eafa-8738-4d4a-9f9e-592c105bc766-kube-api-access-k9cz8\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.005630 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.005712 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb2a5173-ce4c-4d4a-a136-70356adabbb9-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.005766 5033 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801b1613-3ed1-423d-85cf-6b87f6711034-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.005838 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c26g5\" (UniqueName: \"kubernetes.io/projected/801b1613-3ed1-423d-85cf-6b87f6711034-kube-api-access-c26g5\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 
crc kubenswrapper[5033]: I0226 16:07:04.021960 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.031205 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dcb38b4-0144-434f-adc1-63a2b63565ee-kube-api-access-prxml" (OuterVolumeSpecName: "kube-api-access-prxml") pod "0dcb38b4-0144-434f-adc1-63a2b63565ee" (UID: "0dcb38b4-0144-434f-adc1-63a2b63565ee"). InnerVolumeSpecName "kube-api-access-prxml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.041574 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-czhv8" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.074063 5033 scope.go:117] "RemoveContainer" containerID="4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.074289 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.082056 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-config-data" (OuterVolumeSpecName: "config-data") pod "fb2a5173-ce4c-4d4a-a136-70356adabbb9" (UID: "fb2a5173-ce4c-4d4a-a136-70356adabbb9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.087949 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-config-data" (OuterVolumeSpecName: "config-data") pod "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" (UID: "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.095555 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dcb38b4-0144-434f-adc1-63a2b63565ee" (UID: "0dcb38b4-0144-434f-adc1-63a2b63565ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.107275 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-combined-ca-bundle\") pod \"17f57fe7-555b-4119-b03e-fafb151494a6\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.107367 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-config-data\") pod \"17f57fe7-555b-4119-b03e-fafb151494a6\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.107489 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w8q8\" (UniqueName: \"kubernetes.io/projected/17f57fe7-555b-4119-b03e-fafb151494a6-kube-api-access-2w8q8\") pod \"17f57fe7-555b-4119-b03e-fafb151494a6\" (UID: \"17f57fe7-555b-4119-b03e-fafb151494a6\") " Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.107531 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpcrc\" (UniqueName: \"kubernetes.io/projected/bd95e3d9-8fc5-4f14-a009-cd1749743679-kube-api-access-bpcrc\") pod \"bd95e3d9-8fc5-4f14-a009-cd1749743679\" (UID: \"bd95e3d9-8fc5-4f14-a009-cd1749743679\") " Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 
16:07:04.107691 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts\") pod \"bd95e3d9-8fc5-4f14-a009-cd1749743679\" (UID: \"bd95e3d9-8fc5-4f14-a009-cd1749743679\") " Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.108301 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.108323 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.108332 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prxml\" (UniqueName: \"kubernetes.io/projected/0dcb38b4-0144-434f-adc1-63a2b63565ee-kube-api-access-prxml\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.108376 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.108879 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd95e3d9-8fc5-4f14-a009-cd1749743679" (UID: "bd95e3d9-8fc5-4f14-a009-cd1749743679"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.123348 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fdf9-account-create-update-lshpf"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.123405 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fdf9-account-create-update-lshpf"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.123489 5033 scope.go:117] "RemoveContainer" containerID="bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.127595 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd95e3d9-8fc5-4f14-a009-cd1749743679-kube-api-access-bpcrc" (OuterVolumeSpecName: "kube-api-access-bpcrc") pod "bd95e3d9-8fc5-4f14-a009-cd1749743679" (UID: "bd95e3d9-8fc5-4f14-a009-cd1749743679"). InnerVolumeSpecName "kube-api-access-bpcrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.134549 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17f57fe7-555b-4119-b03e-fafb151494a6-kube-api-access-2w8q8" (OuterVolumeSpecName: "kube-api-access-2w8q8") pod "17f57fe7-555b-4119-b03e-fafb151494a6" (UID: "17f57fe7-555b-4119-b03e-fafb151494a6"). InnerVolumeSpecName "kube-api-access-2w8q8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.190610 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6a31-account-create-update-gbdhh"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.198882 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" (UID: "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.204304 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6a31-account-create-update-gbdhh"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.211782 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.212018 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w8q8\" (UniqueName: \"kubernetes.io/projected/17f57fe7-555b-4119-b03e-fafb151494a6-kube-api-access-2w8q8\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.212098 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpcrc\" (UniqueName: \"kubernetes.io/projected/bd95e3d9-8fc5-4f14-a009-cd1749743679-kube-api-access-bpcrc\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.212204 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd95e3d9-8fc5-4f14-a009-cd1749743679-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: 
I0226 16:07:04.235373 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be41eafa-8738-4d4a-9f9e-592c105bc766" (UID: "be41eafa-8738-4d4a-9f9e-592c105bc766"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.239749 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86984d6b66-tg6tl"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.246565 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd331de5-997f-4bb2-a032-d174a0bf059b" (UID: "fd331de5-997f-4bb2-a032-d174a0bf059b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.246636 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-86984d6b66-tg6tl"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.266676 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.268388 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17f57fe7-555b-4119-b03e-fafb151494a6" (UID: "17f57fe7-555b-4119-b03e-fafb151494a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.272367 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fb2a5173-ce4c-4d4a-a136-70356adabbb9" (UID: "fb2a5173-ce4c-4d4a-a136-70356adabbb9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.274803 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.279424 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-config-data" (OuterVolumeSpecName: "config-data") pod "0dcb38b4-0144-434f-adc1-63a2b63565ee" (UID: "0dcb38b4-0144-434f-adc1-63a2b63565ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.281794 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-config-data" (OuterVolumeSpecName: "config-data") pod "17f57fe7-555b-4119-b03e-fafb151494a6" (UID: "17f57fe7-555b-4119-b03e-fafb151494a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.283666 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" (UID: "998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.286596 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-config-data" (OuterVolumeSpecName: "config-data") pod "fd331de5-997f-4bb2-a032-d174a0bf059b" (UID: "fd331de5-997f-4bb2-a032-d174a0bf059b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.288722 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "be41eafa-8738-4d4a-9f9e-592c105bc766" (UID: "be41eafa-8738-4d4a-9f9e-592c105bc766"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.292821 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.303150 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.306796 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.315311 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.315450 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.315466 5033 
reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.315480 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dcb38b4-0144-434f-adc1-63a2b63565ee-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.315491 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.315504 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be41eafa-8738-4d4a-9f9e-592c105bc766-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.315515 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd331de5-997f-4bb2-a032-d174a0bf059b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.315527 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.315539 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.315551 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/17f57fe7-555b-4119-b03e-fafb151494a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.315709 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fxrwp"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.332024 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fxrwp"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.342979 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data" (OuterVolumeSpecName: "config-data") pod "801b1613-3ed1-423d-85cf-6b87f6711034" (UID: "801b1613-3ed1-423d-85cf-6b87f6711034"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.344453 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "801b1613-3ed1-423d-85cf-6b87f6711034" (UID: "801b1613-3ed1-423d-85cf-6b87f6711034"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.349093 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.356538 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.358731 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fb2a5173-ce4c-4d4a-a136-70356adabbb9" (UID: "fb2a5173-ce4c-4d4a-a136-70356adabbb9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.369866 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.377031 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.383987 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6d648bbbdd-g6xz5"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.390602 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6d648bbbdd-g6xz5"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.419463 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.419518 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801b1613-3ed1-423d-85cf-6b87f6711034-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.419532 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb2a5173-ce4c-4d4a-a136-70356adabbb9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.497398 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"801b1613-3ed1-423d-85cf-6b87f6711034","Type":"ContainerDied","Data":"3ff873b87303a9670e15bafd9f8f11b63f65948e80aa866f81029ce7d6cdc4af"} Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.497557 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.510576 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"17f57fe7-555b-4119-b03e-fafb151494a6","Type":"ContainerDied","Data":"4b2513f16ab794f9288ec4293871e5005e244c973f84af173163d7308165c27c"} Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.510671 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.519206 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3","Type":"ContainerDied","Data":"100335d3bfe986ed47539164838c321a85292a0e9696af813e0ac893ab0a9b0b"} Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.519472 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.536143 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00a5a420-8682-4780-8349-3da44c8d7f1a/ovn-northd/0.log" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.536272 5033 generic.go:334] "Generic (PLEG): container finished" podID="00a5a420-8682-4780-8349-3da44c8d7f1a" containerID="65abe18a8fd80fc9112bf0561b66f917755c231568bc655d18e082d960532c5a" exitCode=139 Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.536422 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00a5a420-8682-4780-8349-3da44c8d7f1a","Type":"ContainerDied","Data":"65abe18a8fd80fc9112bf0561b66f917755c231568bc655d18e082d960532c5a"} Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.542102 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-czhv8" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.542089 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-czhv8" event={"ID":"bd95e3d9-8fc5-4f14-a009-cd1749743679","Type":"ContainerDied","Data":"98a8ae6ca046024521c9f896d30f4398fee105467c89b057553b81c56c1d4d84"} Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.549722 5033 scope.go:117] "RemoveContainer" containerID="4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166" Feb 26 16:07:04 crc kubenswrapper[5033]: E0226 16:07:04.550119 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166\": container with ID starting with 4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166 not found: ID does not exist" containerID="4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.550157 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166"} err="failed to get container status \"4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166\": rpc error: code = NotFound desc = could not find container \"4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166\": container with ID starting with 4a5002048a7196f9df630ad0089f06b9e015f9f0f7d74825f6398e5fd9773166 not found: ID does not exist" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.550181 5033 scope.go:117] "RemoveContainer" containerID="bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520" Feb 26 16:07:04 crc kubenswrapper[5033]: E0226 16:07:04.550491 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520\": container with ID starting with bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520 not found: ID does not exist" containerID="bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.550531 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520"} err="failed to get container status \"bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520\": rpc error: code = NotFound desc = could not find container \"bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520\": container with ID starting with bfdebebaf5c7bc52f25080efe6a90e2f815e68af675ef8899937b56058672520 not found: ID does not exist" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.550550 5033 scope.go:117] "RemoveContainer" containerID="33f8af65abd2984345edb8a627a3f3c55b944bdf81de100064126a81f6d0b224" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.552651 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.553759 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"be41eafa-8738-4d4a-9f9e-592c105bc766","Type":"ContainerDied","Data":"1a1e6ee458bfccb1e1b96789762f6d308e1b639646ef6f499e8d7bf9205eaf36"} Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.553803 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.563266 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.571465 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.576427 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.576446 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.576467 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.576417 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0dcb38b4-0144-434f-adc1-63a2b63565ee","Type":"ContainerDied","Data":"7c17ccb8e9454aacd4f5e6d035fe1db72971529c243f838647ea2f392d7ce24c"} Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.580179 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.584830 5033 scope.go:117] "RemoveContainer" containerID="31c65a36bdf508f77701ec37bf337b3a58e6c854126c11d769248ce82adecb01" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.598195 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.609614 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.617704 5033 scope.go:117] "RemoveContainer" 
containerID="09653e26a9486bd66b87542be3842434fcfe613ab05437a4b825e241e39e81f0" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.636678 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-czhv8"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.650652 5033 scope.go:117] "RemoveContainer" containerID="89deba044f841c417bb427e36c48c0c1ad7827ea068666457c484ac571cf5b5c" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.661590 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-czhv8"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.669664 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.684281 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.684349 5033 scope.go:117] "RemoveContainer" containerID="063b18146099b5100ca56d54d72414c86d85a242467f6339a74297d196991539" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.697914 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.703934 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.709551 5033 scope.go:117] "RemoveContainer" containerID="aa1ee9867435cb326cccdc9dafbd435a111a0e341d91a8b67afb8b92f8d1ca85" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.709950 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.717180 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.725227 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.726314 5033 scope.go:117] "RemoveContainer" containerID="81b00d0b1c13dc1cda83736d47b6fac79d1887078858c3ef5eaedcb3b16e0883" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.736373 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.744689 5033 scope.go:117] "RemoveContainer" containerID="594bc92b21cdaf12c49af3fc339f3186f938691726d83e69d1538a692373df92" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.763072 5033 scope.go:117] "RemoveContainer" containerID="ee215c10d8fb919211b6ab044db9abd81244fbf29ba7fba886152bba9c7de8e4" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.783932 5033 scope.go:117] "RemoveContainer" containerID="2f2bce733fa63476566ab5bf20d7a8fb1718f0f35fccb3db14100b8bf61575d0" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.808155 5033 scope.go:117] "RemoveContainer" containerID="3a534aa8128f826e879da52f7e79951a6fad47306bfecf8692f8608dfd09639b" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.913428 5033 scope.go:117] "RemoveContainer" containerID="3436199015eba2036bfcbc2a3bbd95bd89f8131adcc1124651fd97c7916dfecf" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.936657 5033 scope.go:117] "RemoveContainer" containerID="067a1b260955a76eecc32e5016cd673aa068e9dd6e33251557d30945591fa778" Feb 26 16:07:04 crc kubenswrapper[5033]: I0226 16:07:04.978586 5033 scope.go:117] "RemoveContainer" containerID="b70d024ada1fde6fed1afca1935ff8d8f22708d55459bb8e44e5ec6748c593ea" Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 16:07:05.030183 5033 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 16:07:05.030601 5033 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data podName:94562cbd-1b98-4b30-9a97-f26fb103bb72 nodeName:}" failed. No retries permitted until 2026-02-26 16:07:13.030582055 +0000 UTC m=+1560.243180705 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data") pod "rabbitmq-cell1-server-0" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72") : configmap "rabbitmq-cell1-config-data" not found Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 16:07:05.213002 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 16:07:05.216780 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 16:07:05.217560 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.220798 5033 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="94562cbd-1b98-4b30-9a97-f26fb103bb72" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 16:07:05.224567 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 16:07:05.225009 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 16:07:05.225046 5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server" Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 16:07:05.230753 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 
16:07:05.230827 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovs-vswitchd" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.447168 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.453391 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00a5a420-8682-4780-8349-3da44c8d7f1a/ovn-northd/0.log" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.453484 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.536538 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-rundir\") pod \"00a5a420-8682-4780-8349-3da44c8d7f1a\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.537058 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "00a5a420-8682-4780-8349-3da44c8d7f1a" (UID: "00a5a420-8682-4780-8349-3da44c8d7f1a"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.537669 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.537725 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqdsw\" (UniqueName: \"kubernetes.io/projected/00a5a420-8682-4780-8349-3da44c8d7f1a-kube-api-access-pqdsw\") pod \"00a5a420-8682-4780-8349-3da44c8d7f1a\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.537763 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-combined-ca-bundle\") pod \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.537811 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-config\") pod \"00a5a420-8682-4780-8349-3da44c8d7f1a\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.537855 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-metrics-certs-tls-certs\") pod \"00a5a420-8682-4780-8349-3da44c8d7f1a\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.537917 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-scripts\") pod \"00a5a420-8682-4780-8349-3da44c8d7f1a\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.537948 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-northd-tls-certs\") pod \"00a5a420-8682-4780-8349-3da44c8d7f1a\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.537996 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-operator-scripts\") pod \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.538031 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-galera-tls-certs\") pod \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.538064 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-generated\") pod \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.538088 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-combined-ca-bundle\") pod \"00a5a420-8682-4780-8349-3da44c8d7f1a\" (UID: \"00a5a420-8682-4780-8349-3da44c8d7f1a\") " Feb 26 16:07:05 crc 
kubenswrapper[5033]: I0226 16:07:05.538141 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb89j\" (UniqueName: \"kubernetes.io/projected/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kube-api-access-vb89j\") pod \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.538179 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-default\") pod \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.538213 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kolla-config\") pod \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\" (UID: \"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.539080 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" (UID: "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.539707 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-scripts" (OuterVolumeSpecName: "scripts") pod "00a5a420-8682-4780-8349-3da44c8d7f1a" (UID: "00a5a420-8682-4780-8349-3da44c8d7f1a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.540336 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.540363 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.540375 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.541190 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-config" (OuterVolumeSpecName: "config") pod "00a5a420-8682-4780-8349-3da44c8d7f1a" (UID: "00a5a420-8682-4780-8349-3da44c8d7f1a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.541359 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" (UID: "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.541882 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" (UID: "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.545723 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" (UID: "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.567342 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kube-api-access-vb89j" (OuterVolumeSpecName: "kube-api-access-vb89j") pod "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" (UID: "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec"). InnerVolumeSpecName "kube-api-access-vb89j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.567460 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a5a420-8682-4780-8349-3da44c8d7f1a-kube-api-access-pqdsw" (OuterVolumeSpecName: "kube-api-access-pqdsw") pod "00a5a420-8682-4780-8349-3da44c8d7f1a" (UID: "00a5a420-8682-4780-8349-3da44c8d7f1a"). InnerVolumeSpecName "kube-api-access-pqdsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.572372 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" (UID: "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.581804 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" (UID: "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.589975 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00a5a420-8682-4780-8349-3da44c8d7f1a" (UID: "00a5a420-8682-4780-8349-3da44c8d7f1a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.591817 5033 generic.go:334] "Generic (PLEG): container finished" podID="94562cbd-1b98-4b30-9a97-f26fb103bb72" containerID="4ad1039e51c54abd4ca6cc3d8acfd102de20d99c0d488b4055beccf49a6c1a6d" exitCode=0 Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.591874 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"94562cbd-1b98-4b30-9a97-f26fb103bb72","Type":"ContainerDied","Data":"4ad1039e51c54abd4ca6cc3d8acfd102de20d99c0d488b4055beccf49a6c1a6d"} Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.601873 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" (UID: "6aa45e26-ccbe-4827-8ddd-ebb5308b98ec"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.602847 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00a5a420-8682-4780-8349-3da44c8d7f1a/ovn-northd/0.log" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.602961 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.602991 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00a5a420-8682-4780-8349-3da44c8d7f1a","Type":"ContainerDied","Data":"c376bea5f8d4b23397e8136ee9deab9dfa20345d0b61b68727a170705832a4f9"} Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.603021 5033 scope.go:117] "RemoveContainer" containerID="0781d5f97d7bdfedb1b4ca988b8219b988b2353f19930fed09bd026d7a0f175c" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.621456 5033 generic.go:334] "Generic (PLEG): container finished" podID="8e80c8c9-0f28-488c-9b68-ff4d379f9059" containerID="e93416b1954dbbe264e3883541d6c591523a47b3cbef0c1e3095a85863fc60b8" exitCode=0 Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.621516 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d658b969f-npt2l" event={"ID":"8e80c8c9-0f28-488c-9b68-ff4d379f9059","Type":"ContainerDied","Data":"e93416b1954dbbe264e3883541d6c591523a47b3cbef0c1e3095a85863fc60b8"} Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.642175 5033 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.642208 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.642222 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.642235 5033 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb89j\" (UniqueName: \"kubernetes.io/projected/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kube-api-access-vb89j\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.642246 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.642276 5033 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.642309 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.642322 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqdsw\" (UniqueName: \"kubernetes.io/projected/00a5a420-8682-4780-8349-3da44c8d7f1a-kube-api-access-pqdsw\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.642334 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.642345 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a5a420-8682-4780-8349-3da44c8d7f1a-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.648480 5033 generic.go:334] "Generic (PLEG): container finished" podID="443e562a-aefe-46f3-a86f-59205f5849c7" 
containerID="118f64c4209e08f3dc281ec9601aaee1f783225640fc7b6f23d85a36174417c3" exitCode=0 Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.652976 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "00a5a420-8682-4780-8349-3da44c8d7f1a" (UID: "00a5a420-8682-4780-8349-3da44c8d7f1a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.657469 5033 generic.go:334] "Generic (PLEG): container finished" podID="6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" containerID="5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396" exitCode=0 Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.657705 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.681809 5033 scope.go:117] "RemoveContainer" containerID="65abe18a8fd80fc9112bf0561b66f917755c231568bc655d18e082d960532c5a" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.684578 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "00a5a420-8682-4780-8349-3da44c8d7f1a" (UID: "00a5a420-8682-4780-8349-3da44c8d7f1a"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.685021 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.686708 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dcb38b4-0144-434f-adc1-63a2b63565ee" path="/var/lib/kubelet/pods/0dcb38b4-0144-434f-adc1-63a2b63565ee/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.687414 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17f57fe7-555b-4119-b03e-fafb151494a6" path="/var/lib/kubelet/pods/17f57fe7-555b-4119-b03e-fafb151494a6/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.688402 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44171111-32bf-415d-97c9-f966209cd0a0" path="/var/lib/kubelet/pods/44171111-32bf-415d-97c9-f966209cd0a0/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.689245 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="540a469d-81f3-40f5-9826-27384a03ce2d" path="/var/lib/kubelet/pods/540a469d-81f3-40f5-9826-27384a03ce2d/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.689697 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" path="/var/lib/kubelet/pods/5ef30f16-84a1-4809-8ec4-0c36e315c5f3/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.690738 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783187df-a438-4fb2-97be-3565e08e88ac" path="/var/lib/kubelet/pods/783187df-a438-4fb2-97be-3565e08e88ac/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.691393 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801b1613-3ed1-423d-85cf-6b87f6711034" 
path="/var/lib/kubelet/pods/801b1613-3ed1-423d-85cf-6b87f6711034/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.692120 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90168acb-f862-4f73-bb8a-dd5b308977e1" path="/var/lib/kubelet/pods/90168acb-f862-4f73-bb8a-dd5b308977e1/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.692584 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9646add7-5da6-4590-9394-b4325e8ebd2d" path="/var/lib/kubelet/pods/9646add7-5da6-4590-9394-b4325e8ebd2d/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.693733 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" path="/var/lib/kubelet/pods/998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.694571 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" path="/var/lib/kubelet/pods/9ab4047a-07ab-4c15-88eb-7dbefe5bea78/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.695459 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd95e3d9-8fc5-4f14-a009-cd1749743679" path="/var/lib/kubelet/pods/bd95e3d9-8fc5-4f14-a009-cd1749743679/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.708598 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be41eafa-8738-4d4a-9f9e-592c105bc766" path="/var/lib/kubelet/pods/be41eafa-8738-4d4a-9f9e-592c105bc766/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.709401 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf0b01b9-0085-4552-8a7d-1e7359dad79b" path="/var/lib/kubelet/pods/bf0b01b9-0085-4552-8a7d-1e7359dad79b/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.710407 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13ba380-e3f6-4322-a434-7b69da9e4e3f" 
path="/var/lib/kubelet/pods/c13ba380-e3f6-4322-a434-7b69da9e4e3f/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.711387 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d989fd27-f36c-4503-a400-e09bb4e70398" path="/var/lib/kubelet/pods/d989fd27-f36c-4503-a400-e09bb4e70398/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.712102 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" path="/var/lib/kubelet/pods/ea4c6e17-4b51-4387-98c8-d071c1fc7c5a/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.712566 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.713501 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" path="/var/lib/kubelet/pods/fb2a5173-ce4c-4d4a-a136-70356adabbb9/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.714979 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" path="/var/lib/kubelet/pods/fd331de5-997f-4bb2-a032-d174a0bf059b/volumes" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.716098 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.716126 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"443e562a-aefe-46f3-a86f-59205f5849c7","Type":"ContainerDied","Data":"118f64c4209e08f3dc281ec9601aaee1f783225640fc7b6f23d85a36174417c3"} Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.716150 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec","Type":"ContainerDied","Data":"5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396"} Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.716161 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6aa45e26-ccbe-4827-8ddd-ebb5308b98ec","Type":"ContainerDied","Data":"0410a91a0562d41a2afd473190889986bea4bcd86f7f1ad44ab5222a3df28c5a"} Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.716182 5033 scope.go:117] "RemoveContainer" containerID="118f64c4209e08f3dc281ec9601aaee1f783225640fc7b6f23d85a36174417c3" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.728359 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.745607 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747030 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"443e562a-aefe-46f3-a86f-59205f5849c7\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747075 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data\") pod \"443e562a-aefe-46f3-a86f-59205f5849c7\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747097 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-tls\") pod \"443e562a-aefe-46f3-a86f-59205f5849c7\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " Feb 26 
16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747134 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/443e562a-aefe-46f3-a86f-59205f5849c7-pod-info\") pod \"443e562a-aefe-46f3-a86f-59205f5849c7\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747179 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-plugins\") pod \"443e562a-aefe-46f3-a86f-59205f5849c7\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747201 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd6sl\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-kube-api-access-nd6sl\") pod \"443e562a-aefe-46f3-a86f-59205f5849c7\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747391 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-plugins-conf\") pod \"443e562a-aefe-46f3-a86f-59205f5849c7\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747442 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/443e562a-aefe-46f3-a86f-59205f5849c7-erlang-cookie-secret\") pod \"443e562a-aefe-46f3-a86f-59205f5849c7\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747477 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-erlang-cookie\") pod \"443e562a-aefe-46f3-a86f-59205f5849c7\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747505 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-confd\") pod \"443e562a-aefe-46f3-a86f-59205f5849c7\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747528 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-server-conf\") pod \"443e562a-aefe-46f3-a86f-59205f5849c7\" (UID: \"443e562a-aefe-46f3-a86f-59205f5849c7\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747924 5033 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747936 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a5a420-8682-4780-8349-3da44c8d7f1a-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.747946 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.750337 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "443e562a-aefe-46f3-a86f-59205f5849c7" 
(UID: "443e562a-aefe-46f3-a86f-59205f5849c7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.751144 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "443e562a-aefe-46f3-a86f-59205f5849c7" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.751447 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "443e562a-aefe-46f3-a86f-59205f5849c7" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.751628 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.751768 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "443e562a-aefe-46f3-a86f-59205f5849c7" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.756390 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "443e562a-aefe-46f3-a86f-59205f5849c7" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7"). 
InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.757408 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/443e562a-aefe-46f3-a86f-59205f5849c7-pod-info" (OuterVolumeSpecName: "pod-info") pod "443e562a-aefe-46f3-a86f-59205f5849c7" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.771923 5033 scope.go:117] "RemoveContainer" containerID="247b57a885cc380f0649077b96f76c4e9d410ef7835d763d6902fafb762cc064" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.787723 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-kube-api-access-nd6sl" (OuterVolumeSpecName: "kube-api-access-nd6sl") pod "443e562a-aefe-46f3-a86f-59205f5849c7" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7"). InnerVolumeSpecName "kube-api-access-nd6sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.805403 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e562a-aefe-46f3-a86f-59205f5849c7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "443e562a-aefe-46f3-a86f-59205f5849c7" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.821867 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data" (OuterVolumeSpecName: "config-data") pod "443e562a-aefe-46f3-a86f-59205f5849c7" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.825098 5033 scope.go:117] "RemoveContainer" containerID="5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849024 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-server-conf\") pod \"94562cbd-1b98-4b30-9a97-f26fb103bb72\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849085 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-credential-keys\") pod \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849409 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94562cbd-1b98-4b30-9a97-f26fb103bb72-erlang-cookie-secret\") pod \"94562cbd-1b98-4b30-9a97-f26fb103bb72\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849431 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-erlang-cookie\") pod \"94562cbd-1b98-4b30-9a97-f26fb103bb72\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849483 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"94562cbd-1b98-4b30-9a97-f26fb103bb72\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " Feb 26 
16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849501 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-scripts\") pod \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849566 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-tls\") pod \"94562cbd-1b98-4b30-9a97-f26fb103bb72\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849598 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-confd\") pod \"94562cbd-1b98-4b30-9a97-f26fb103bb72\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849685 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-internal-tls-certs\") pod \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849722 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-fernet-keys\") pod \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849739 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p92fh\" (UniqueName: 
\"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-kube-api-access-p92fh\") pod \"94562cbd-1b98-4b30-9a97-f26fb103bb72\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849768 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-plugins-conf\") pod \"94562cbd-1b98-4b30-9a97-f26fb103bb72\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849793 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh72x\" (UniqueName: \"kubernetes.io/projected/8e80c8c9-0f28-488c-9b68-ff4d379f9059-kube-api-access-xh72x\") pod \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849819 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-config-data\") pod \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849840 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-plugins\") pod \"94562cbd-1b98-4b30-9a97-f26fb103bb72\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849859 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-public-tls-certs\") pod \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " Feb 26 16:07:05 crc kubenswrapper[5033]: 
I0226 16:07:05.849893 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-combined-ca-bundle\") pod \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\" (UID: \"8e80c8c9-0f28-488c-9b68-ff4d379f9059\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849921 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94562cbd-1b98-4b30-9a97-f26fb103bb72-pod-info\") pod \"94562cbd-1b98-4b30-9a97-f26fb103bb72\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.849944 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data\") pod \"94562cbd-1b98-4b30-9a97-f26fb103bb72\" (UID: \"94562cbd-1b98-4b30-9a97-f26fb103bb72\") " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.850219 5033 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/443e562a-aefe-46f3-a86f-59205f5849c7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.850238 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.850276 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.850287 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.850297 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.850306 5033 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/443e562a-aefe-46f3-a86f-59205f5849c7-pod-info\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.850318 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.850328 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd6sl\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-kube-api-access-nd6sl\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.850337 5033 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.851990 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-server-conf" (OuterVolumeSpecName: "server-conf") pod "443e562a-aefe-46f3-a86f-59205f5849c7" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.852439 5033 scope.go:117] "RemoveContainer" containerID="1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.856097 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "94562cbd-1b98-4b30-9a97-f26fb103bb72" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.858344 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "94562cbd-1b98-4b30-9a97-f26fb103bb72" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.858487 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "94562cbd-1b98-4b30-9a97-f26fb103bb72" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.859052 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-scripts" (OuterVolumeSpecName: "scripts") pod "8e80c8c9-0f28-488c-9b68-ff4d379f9059" (UID: "8e80c8c9-0f28-488c-9b68-ff4d379f9059"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.859390 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "94562cbd-1b98-4b30-9a97-f26fb103bb72" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.859474 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8e80c8c9-0f28-488c-9b68-ff4d379f9059" (UID: "8e80c8c9-0f28-488c-9b68-ff4d379f9059"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.859515 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-kube-api-access-p92fh" (OuterVolumeSpecName: "kube-api-access-p92fh") pod "94562cbd-1b98-4b30-9a97-f26fb103bb72" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72"). InnerVolumeSpecName "kube-api-access-p92fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.862002 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94562cbd-1b98-4b30-9a97-f26fb103bb72-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "94562cbd-1b98-4b30-9a97-f26fb103bb72" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.863093 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/94562cbd-1b98-4b30-9a97-f26fb103bb72-pod-info" (OuterVolumeSpecName: "pod-info") pod "94562cbd-1b98-4b30-9a97-f26fb103bb72" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.866355 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8e80c8c9-0f28-488c-9b68-ff4d379f9059" (UID: "8e80c8c9-0f28-488c-9b68-ff4d379f9059"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.867687 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e80c8c9-0f28-488c-9b68-ff4d379f9059-kube-api-access-xh72x" (OuterVolumeSpecName: "kube-api-access-xh72x") pod "8e80c8c9-0f28-488c-9b68-ff4d379f9059" (UID: "8e80c8c9-0f28-488c-9b68-ff4d379f9059"). InnerVolumeSpecName "kube-api-access-xh72x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.868721 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "94562cbd-1b98-4b30-9a97-f26fb103bb72" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.883468 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-config-data" (OuterVolumeSpecName: "config-data") pod "8e80c8c9-0f28-488c-9b68-ff4d379f9059" (UID: "8e80c8c9-0f28-488c-9b68-ff4d379f9059"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.885465 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e80c8c9-0f28-488c-9b68-ff4d379f9059" (UID: "8e80c8c9-0f28-488c-9b68-ff4d379f9059"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.892027 5033 scope.go:117] "RemoveContainer" containerID="5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396" Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 16:07:05.893028 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396\": container with ID starting with 5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396 not found: ID does not exist" containerID="5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.893062 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396"} err="failed to get container status \"5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396\": rpc error: code = NotFound desc = could not find container 
\"5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396\": container with ID starting with 5972e152ffb35a029b18c376859b1248456d8464016ef3b1aa707742e367e396 not found: ID does not exist" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.893090 5033 scope.go:117] "RemoveContainer" containerID="1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e" Feb 26 16:07:05 crc kubenswrapper[5033]: E0226 16:07:05.893318 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e\": container with ID starting with 1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e not found: ID does not exist" containerID="1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.893346 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e"} err="failed to get container status \"1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e\": rpc error: code = NotFound desc = could not find container \"1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e\": container with ID starting with 1b2c911a4b597dd565f5ccd9a34a20159cee1b20b6151032d777c9ae8d8dce8e not found: ID does not exist" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.897169 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.900856 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-server-conf" (OuterVolumeSpecName: "server-conf") pod "94562cbd-1b98-4b30-9a97-f26fb103bb72" (UID: 
"94562cbd-1b98-4b30-9a97-f26fb103bb72"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.902180 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "443e562a-aefe-46f3-a86f-59205f5849c7" (UID: "443e562a-aefe-46f3-a86f-59205f5849c7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.916592 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8e80c8c9-0f28-488c-9b68-ff4d379f9059" (UID: "8e80c8c9-0f28-488c-9b68-ff4d379f9059"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.916756 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data" (OuterVolumeSpecName: "config-data") pod "94562cbd-1b98-4b30-9a97-f26fb103bb72" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.927005 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8e80c8c9-0f28-488c-9b68-ff4d379f9059" (UID: "8e80c8c9-0f28-488c-9b68-ff4d379f9059"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.935450 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.941845 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951654 5033 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951688 5033 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951702 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951712 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951721 5033 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94562cbd-1b98-4b30-9a97-f26fb103bb72-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951744 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951753 
5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951761 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951769 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951777 5033 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951785 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p92fh\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-kube-api-access-p92fh\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951793 5033 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951801 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh72x\" (UniqueName: \"kubernetes.io/projected/8e80c8c9-0f28-488c-9b68-ff4d379f9059-kube-api-access-xh72x\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951809 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951818 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951827 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951835 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e80c8c9-0f28-488c-9b68-ff4d379f9059-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951843 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/443e562a-aefe-46f3-a86f-59205f5849c7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951853 5033 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94562cbd-1b98-4b30-9a97-f26fb103bb72-pod-info\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951861 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94562cbd-1b98-4b30-9a97-f26fb103bb72-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.951868 5033 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/443e562a-aefe-46f3-a86f-59205f5849c7-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:05 crc 
kubenswrapper[5033]: I0226 16:07:05.965230 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "94562cbd-1b98-4b30-9a97-f26fb103bb72" (UID: "94562cbd-1b98-4b30-9a97-f26fb103bb72"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:05 crc kubenswrapper[5033]: I0226 16:07:05.965596 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.052956 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.052989 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94562cbd-1b98-4b30-9a97-f26fb103bb72-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.701993 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"443e562a-aefe-46f3-a86f-59205f5849c7","Type":"ContainerDied","Data":"f33c995e86b21de7c50df0b29830b963605c0332ac69d65d4e22240e45227b12"} Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.702077 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.705870 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.705870 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"94562cbd-1b98-4b30-9a97-f26fb103bb72","Type":"ContainerDied","Data":"41047549b664909658ddcc8d2cf54718889d14921b52c80c34e2493e2f012d6e"} Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.706026 5033 scope.go:117] "RemoveContainer" containerID="4ad1039e51c54abd4ca6cc3d8acfd102de20d99c0d488b4055beccf49a6c1a6d" Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.709474 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d658b969f-npt2l" event={"ID":"8e80c8c9-0f28-488c-9b68-ff4d379f9059","Type":"ContainerDied","Data":"a94ee81af9cd8002a44ed4bec2c8a1ef53d9b54b037314b582dc30ec28af9619"} Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.709569 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d658b969f-npt2l" Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.745496 5033 scope.go:117] "RemoveContainer" containerID="24ed5542d86de71c32fadfea0c9e8d2f9fc6c503f22d6a3bd14b9feec584230c" Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.770622 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.792519 5033 scope.go:117] "RemoveContainer" containerID="e93416b1954dbbe264e3883541d6c591523a47b3cbef0c1e3095a85863fc60b8" Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.795204 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.802161 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.818637 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.835626 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7d658b969f-npt2l"] Feb 26 16:07:06 crc kubenswrapper[5033]: I0226 16:07:06.850285 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7d658b969f-npt2l"] Feb 26 16:07:07 crc kubenswrapper[5033]: I0226 16:07:07.641391 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a5a420-8682-4780-8349-3da44c8d7f1a" path="/var/lib/kubelet/pods/00a5a420-8682-4780-8349-3da44c8d7f1a/volumes" Feb 26 16:07:07 crc kubenswrapper[5033]: I0226 16:07:07.643654 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="443e562a-aefe-46f3-a86f-59205f5849c7" path="/var/lib/kubelet/pods/443e562a-aefe-46f3-a86f-59205f5849c7/volumes" Feb 26 16:07:07 crc kubenswrapper[5033]: I0226 16:07:07.645341 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" path="/var/lib/kubelet/pods/6aa45e26-ccbe-4827-8ddd-ebb5308b98ec/volumes" Feb 26 16:07:07 crc kubenswrapper[5033]: I0226 16:07:07.646114 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e80c8c9-0f28-488c-9b68-ff4d379f9059" path="/var/lib/kubelet/pods/8e80c8c9-0f28-488c-9b68-ff4d379f9059/volumes" Feb 26 16:07:07 crc kubenswrapper[5033]: I0226 16:07:07.647619 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94562cbd-1b98-4b30-9a97-f26fb103bb72" path="/var/lib/kubelet/pods/94562cbd-1b98-4b30-9a97-f26fb103bb72/volumes" Feb 26 16:07:08 crc kubenswrapper[5033]: I0226 16:07:08.728068 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="be41eafa-8738-4d4a-9f9e-592c105bc766" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.108:11211: i/o timeout" Feb 26 16:07:09 crc kubenswrapper[5033]: E0226 16:07:09.402450 5033 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93066a1a_6c0c_4b1e_a257_052a4db51d70.slice\": RecentStats: unable to find data in memory cache]" Feb 26 16:07:10 crc kubenswrapper[5033]: E0226 16:07:10.206673 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:10 crc kubenswrapper[5033]: E0226 16:07:10.209020 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:10 crc kubenswrapper[5033]: E0226 16:07:10.209053 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:10 crc kubenswrapper[5033]: E0226 16:07:10.209486 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" 
containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:10 crc kubenswrapper[5033]: E0226 16:07:10.210020 5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server" Feb 26 16:07:10 crc kubenswrapper[5033]: E0226 16:07:10.214418 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:10 crc kubenswrapper[5033]: E0226 16:07:10.216622 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:10 crc kubenswrapper[5033]: E0226 16:07:10.216664 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovs-vswitchd" Feb 26 16:07:10 crc kubenswrapper[5033]: I0226 16:07:10.558898 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="443e562a-aefe-46f3-a86f-59205f5849c7" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.105:5671: i/o timeout" Feb 26 16:07:14 crc kubenswrapper[5033]: I0226 16:07:14.796378 5033 generic.go:334] "Generic (PLEG): container finished" podID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" containerID="6f4850a9a8f812df409a9a6fb3e745a63952c033253853ae16812e130cbe9138" exitCode=0 Feb 26 16:07:14 crc kubenswrapper[5033]: I0226 16:07:14.796528 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d9577fb9-j9mmc" event={"ID":"db7073ea-074d-41d7-b114-9d6b8e02c3a0","Type":"ContainerDied","Data":"6f4850a9a8f812df409a9a6fb3e745a63952c033253853ae16812e130cbe9138"} Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.197740 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:07:15 crc kubenswrapper[5033]: E0226 16:07:15.214914 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:15 crc kubenswrapper[5033]: E0226 16:07:15.215103 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:15 crc kubenswrapper[5033]: E0226 16:07:15.215692 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: 
container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:15 crc kubenswrapper[5033]: E0226 16:07:15.215929 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:15 crc kubenswrapper[5033]: E0226 16:07:15.215960 5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server" Feb 26 16:07:15 crc kubenswrapper[5033]: E0226 16:07:15.216299 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:15 crc kubenswrapper[5033]: E0226 16:07:15.223794 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:15 crc kubenswrapper[5033]: E0226 16:07:15.223889 5033 prober.go:104] "Probe 
errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovs-vswitchd" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.333944 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-internal-tls-certs\") pod \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.334048 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-httpd-config\") pod \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.334114 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-ovndb-tls-certs\") pod \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.334150 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2vrn\" (UniqueName: \"kubernetes.io/projected/db7073ea-074d-41d7-b114-9d6b8e02c3a0-kube-api-access-m2vrn\") pod \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.334200 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-combined-ca-bundle\") pod 
\"db7073ea-074d-41d7-b114-9d6b8e02c3a0\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.334236 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-public-tls-certs\") pod \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.334342 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-config\") pod \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\" (UID: \"db7073ea-074d-41d7-b114-9d6b8e02c3a0\") " Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.343476 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "db7073ea-074d-41d7-b114-9d6b8e02c3a0" (UID: "db7073ea-074d-41d7-b114-9d6b8e02c3a0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.350368 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db7073ea-074d-41d7-b114-9d6b8e02c3a0-kube-api-access-m2vrn" (OuterVolumeSpecName: "kube-api-access-m2vrn") pod "db7073ea-074d-41d7-b114-9d6b8e02c3a0" (UID: "db7073ea-074d-41d7-b114-9d6b8e02c3a0"). InnerVolumeSpecName "kube-api-access-m2vrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.371206 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db7073ea-074d-41d7-b114-9d6b8e02c3a0" (UID: "db7073ea-074d-41d7-b114-9d6b8e02c3a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.377060 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "db7073ea-074d-41d7-b114-9d6b8e02c3a0" (UID: "db7073ea-074d-41d7-b114-9d6b8e02c3a0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.379468 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "db7073ea-074d-41d7-b114-9d6b8e02c3a0" (UID: "db7073ea-074d-41d7-b114-9d6b8e02c3a0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.380959 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-config" (OuterVolumeSpecName: "config") pod "db7073ea-074d-41d7-b114-9d6b8e02c3a0" (UID: "db7073ea-074d-41d7-b114-9d6b8e02c3a0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.396128 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "db7073ea-074d-41d7-b114-9d6b8e02c3a0" (UID: "db7073ea-074d-41d7-b114-9d6b8e02c3a0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.435979 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.436173 5033 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.436237 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.436314 5033 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.436374 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2vrn\" (UniqueName: \"kubernetes.io/projected/db7073ea-074d-41d7-b114-9d6b8e02c3a0-kube-api-access-m2vrn\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.436458 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.436529 5033 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db7073ea-074d-41d7-b114-9d6b8e02c3a0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.809068 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d9577fb9-j9mmc" event={"ID":"db7073ea-074d-41d7-b114-9d6b8e02c3a0","Type":"ContainerDied","Data":"50dbdb11bbdf4960d6f832423d69852e515d76f298f32272a7e3daba2e425277"} Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.809133 5033 scope.go:117] "RemoveContainer" containerID="fb11d0b59395826f637bc346a58329527565a9575674426c6ee93c2cfdc8dac0" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.810442 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d9577fb9-j9mmc" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.834784 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6d9577fb9-j9mmc"] Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.834923 5033 scope.go:117] "RemoveContainer" containerID="6f4850a9a8f812df409a9a6fb3e745a63952c033253853ae16812e130cbe9138" Feb 26 16:07:15 crc kubenswrapper[5033]: I0226 16:07:15.838093 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6d9577fb9-j9mmc"] Feb 26 16:07:17 crc kubenswrapper[5033]: I0226 16:07:17.643342 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" path="/var/lib/kubelet/pods/db7073ea-074d-41d7-b114-9d6b8e02c3a0/volumes" Feb 26 16:07:20 crc kubenswrapper[5033]: I0226 16:07:20.189466 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:07:20 crc kubenswrapper[5033]: I0226 16:07:20.189818 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:07:20 crc kubenswrapper[5033]: E0226 16:07:20.206197 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:20 crc kubenswrapper[5033]: E0226 16:07:20.207973 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:20 crc kubenswrapper[5033]: E0226 16:07:20.208595 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:20 crc kubenswrapper[5033]: E0226 
16:07:20.209304 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:20 crc kubenswrapper[5033]: E0226 16:07:20.209470 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:20 crc kubenswrapper[5033]: E0226 16:07:20.209468 5033 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server" Feb 26 16:07:20 crc kubenswrapper[5033]: E0226 16:07:20.210904 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:20 crc kubenswrapper[5033]: E0226 16:07:20.210942 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" 
podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovs-vswitchd" Feb 26 16:07:25 crc kubenswrapper[5033]: E0226 16:07:25.207187 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:25 crc kubenswrapper[5033]: E0226 16:07:25.207860 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:25 crc kubenswrapper[5033]: E0226 16:07:25.208277 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:25 crc kubenswrapper[5033]: E0226 16:07:25.208777 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 26 16:07:25 crc kubenswrapper[5033]: E0226 16:07:25.208805 5033 prober.go:104] "Probe 
errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server" Feb 26 16:07:25 crc kubenswrapper[5033]: E0226 16:07:25.209723 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:25 crc kubenswrapper[5033]: E0226 16:07:25.211193 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 26 16:07:25 crc kubenswrapper[5033]: E0226 16:07:25.211241 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-84k4b" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovs-vswitchd" Feb 26 16:07:27 crc kubenswrapper[5033]: I0226 16:07:27.941588 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-84k4b_0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6/ovs-vswitchd/0.log" Feb 26 16:07:27 crc kubenswrapper[5033]: I0226 16:07:27.945215 5033 generic.go:334] "Generic (PLEG): container finished" podID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" 
containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" exitCode=137 Feb 26 16:07:27 crc kubenswrapper[5033]: I0226 16:07:27.945332 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-84k4b" event={"ID":"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6","Type":"ContainerDied","Data":"10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92"} Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.317578 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-84k4b_0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6/ovs-vswitchd/0.log" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.318501 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.439041 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-log\") pod \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.439124 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-etc-ovs\") pod \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.439164 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-lib\") pod \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.439157 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-log" (OuterVolumeSpecName: "var-log") pod "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" (UID: "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.439232 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" (UID: "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.439334 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-lib" (OuterVolumeSpecName: "var-lib") pod "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" (UID: "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.439397 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-scripts\") pod \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.440823 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-scripts" (OuterVolumeSpecName: "scripts") pod "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" (UID: "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.440896 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxflh\" (UniqueName: \"kubernetes.io/projected/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-kube-api-access-cxflh\") pod \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.441838 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-run\") pod \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\" (UID: \"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.441956 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-run" (OuterVolumeSpecName: "var-run") pod "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" (UID: "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.442926 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.442950 5033 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-run\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.442962 5033 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-log\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.442975 5033 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.442990 5033 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-var-lib\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.449494 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-kube-api-access-cxflh" (OuterVolumeSpecName: "kube-api-access-cxflh") pod "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" (UID: "0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6"). InnerVolumeSpecName "kube-api-access-cxflh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.544902 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxflh\" (UniqueName: \"kubernetes.io/projected/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6-kube-api-access-cxflh\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.654468 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.748504 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qh4j\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-kube-api-access-2qh4j\") pod \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.748586 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.748665 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-lock\") pod \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.748808 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-cache\") pod \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.748854 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-combined-ca-bundle\") pod \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.748981 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift\") pod \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\" (UID: \"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9\") " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.749730 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-lock" (OuterVolumeSpecName: "lock") pod "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" (UID: "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.750052 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-cache" (OuterVolumeSpecName: "cache") pod "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" (UID: "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.752244 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-kube-api-access-2qh4j" (OuterVolumeSpecName: "kube-api-access-2qh4j") pod "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" (UID: "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9"). InnerVolumeSpecName "kube-api-access-2qh4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.752557 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" (UID: "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.752853 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" (UID: "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.851199 5033 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.851239 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qh4j\" (UniqueName: \"kubernetes.io/projected/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-kube-api-access-2qh4j\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.851334 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.851353 5033 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-lock\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.851364 
5033 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-cache\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.872395 5033 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.952223 5033 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.955456 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-84k4b_0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6/ovs-vswitchd/0.log" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.956395 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-84k4b" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.956388 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-84k4b" event={"ID":"0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6","Type":"ContainerDied","Data":"53a98cf35457b1fd63fada97e8eb965cfa36a0560679a213466b605b6ade4f74"} Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.956562 5033 scope.go:117] "RemoveContainer" containerID="10d99707fda2d62c83bcd8b41e0764101a9d80f29bd9f3b5e96a5e6959fd7b92" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.964307 5033 generic.go:334] "Generic (PLEG): container finished" podID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerID="9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2" exitCode=137 Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.964358 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2"} Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.964386 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9","Type":"ContainerDied","Data":"b5d6fff8928b120f2104c3bf5317222e4e43e790d6abca6de70f7fcae6502398"} Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.964415 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 26 16:07:28 crc kubenswrapper[5033]: I0226 16:07:28.987762 5033 scope.go:117] "RemoveContainer" containerID="c5d04d7bd0dc047f20282340b205e025ab82e7e8dcddb538cf14af9942648572" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.009717 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-84k4b"] Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.014482 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-84k4b"] Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.020444 5033 scope.go:117] "RemoveContainer" containerID="f55ea55b5a30bde5eb0724a1694acfd92897c14f2d93ceaa5d07c8635c8cc6ea" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.037200 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" (UID: "8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.045038 5033 scope.go:117] "RemoveContainer" containerID="9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.054039 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.068723 5033 scope.go:117] "RemoveContainer" containerID="0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.097895 5033 scope.go:117] "RemoveContainer" containerID="273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.114346 5033 scope.go:117] "RemoveContainer" containerID="e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.128918 5033 scope.go:117] "RemoveContainer" containerID="cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.150333 5033 scope.go:117] "RemoveContainer" containerID="63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.172159 5033 scope.go:117] "RemoveContainer" containerID="16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.191947 5033 scope.go:117] "RemoveContainer" containerID="876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.214189 5033 scope.go:117] "RemoveContainer" containerID="d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.234391 
5033 scope.go:117] "RemoveContainer" containerID="994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.271540 5033 scope.go:117] "RemoveContainer" containerID="f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.296517 5033 scope.go:117] "RemoveContainer" containerID="ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.306558 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.312502 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.322218 5033 scope.go:117] "RemoveContainer" containerID="fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.372026 5033 scope.go:117] "RemoveContainer" containerID="66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.386888 5033 scope.go:117] "RemoveContainer" containerID="7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.411919 5033 scope.go:117] "RemoveContainer" containerID="9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2" Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.412280 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2\": container with ID starting with 9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2 not found: ID does not exist" containerID="9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2" Feb 26 16:07:29 crc kubenswrapper[5033]: 
I0226 16:07:29.412320 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2"} err="failed to get container status \"9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2\": rpc error: code = NotFound desc = could not find container \"9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2\": container with ID starting with 9e46a4bdcb510a3a4b393f5c540661c4d9653c7ab0682e8af333c3cd12a267b2 not found: ID does not exist" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.412347 5033 scope.go:117] "RemoveContainer" containerID="0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0" Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.412626 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0\": container with ID starting with 0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0 not found: ID does not exist" containerID="0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.412655 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0"} err="failed to get container status \"0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0\": rpc error: code = NotFound desc = could not find container \"0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0\": container with ID starting with 0a521eaee4d14e1b46ed17d703c97da911c0a1d4609afbf775e8bea74af25cb0 not found: ID does not exist" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.412674 5033 scope.go:117] "RemoveContainer" containerID="273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085" Feb 26 16:07:29 crc 
kubenswrapper[5033]: E0226 16:07:29.413106 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085\": container with ID starting with 273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085 not found: ID does not exist" containerID="273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.413133 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085"} err="failed to get container status \"273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085\": rpc error: code = NotFound desc = could not find container \"273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085\": container with ID starting with 273177fafb15b50ad5e3a665f2763dbe74eb4fe0620ca71d0450672cab99f085 not found: ID does not exist" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.413149 5033 scope.go:117] "RemoveContainer" containerID="e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1" Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.413651 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1\": container with ID starting with e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1 not found: ID does not exist" containerID="e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.413713 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1"} err="failed to get container status 
\"e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1\": rpc error: code = NotFound desc = could not find container \"e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1\": container with ID starting with e59a40316977697bbe58927cbce193c5f2e762d56841877f04380a68417bb8f1 not found: ID does not exist" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.413799 5033 scope.go:117] "RemoveContainer" containerID="cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa" Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.414177 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa\": container with ID starting with cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa not found: ID does not exist" containerID="cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.414208 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa"} err="failed to get container status \"cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa\": rpc error: code = NotFound desc = could not find container \"cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa\": container with ID starting with cb5ea81682fe6493d69a20d3798421944ab44fb79502dc4db6d907c9739f56aa not found: ID does not exist" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.414227 5033 scope.go:117] "RemoveContainer" containerID="63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023" Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.414456 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023\": container with ID starting with 63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023 not found: ID does not exist" containerID="63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.414485 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023"} err="failed to get container status \"63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023\": rpc error: code = NotFound desc = could not find container \"63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023\": container with ID starting with 63557131d93248a84dbf44c7655885eeb3e637ee28f0a283deab3bc94fd1a023 not found: ID does not exist" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.414503 5033 scope.go:117] "RemoveContainer" containerID="16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379" Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.414938 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379\": container with ID starting with 16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379 not found: ID does not exist" containerID="16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.414966 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379"} err="failed to get container status \"16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379\": rpc error: code = NotFound desc = could not find container \"16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379\": container with ID 
starting with 16d6d78da9b8fa74d108ea4c818e2d926f4d9a2476dccd937aab43051b6b5379 not found: ID does not exist" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.414984 5033 scope.go:117] "RemoveContainer" containerID="876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90" Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.415224 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90\": container with ID starting with 876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90 not found: ID does not exist" containerID="876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.415247 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90"} err="failed to get container status \"876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90\": rpc error: code = NotFound desc = could not find container \"876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90\": container with ID starting with 876590cd91614c08eb85e9550a866b2f7227f737b6c0cf9038982d89c96afd90 not found: ID does not exist" Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.415278 5033 scope.go:117] "RemoveContainer" containerID="d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f" Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.415515 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f\": container with ID starting with d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f not found: ID does not exist" containerID="d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f" Feb 26 
16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.415540 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f"} err="failed to get container status \"d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f\": rpc error: code = NotFound desc = could not find container \"d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f\": container with ID starting with d094a591121e963f86d34bb5f21a522c73e4404c29318b38c679edec77be971f not found: ID does not exist"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.415555 5033 scope.go:117] "RemoveContainer" containerID="994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c"
Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.415791 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c\": container with ID starting with 994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c not found: ID does not exist" containerID="994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.415817 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c"} err="failed to get container status \"994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c\": rpc error: code = NotFound desc = could not find container \"994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c\": container with ID starting with 994a8f9d468389118568e099e35da049dbde57fc2f7b113a7ebeb6522e34f68c not found: ID does not exist"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.415835 5033 scope.go:117] "RemoveContainer" containerID="f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37"
Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.416102 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37\": container with ID starting with f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37 not found: ID does not exist" containerID="f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.416121 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37"} err="failed to get container status \"f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37\": rpc error: code = NotFound desc = could not find container \"f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37\": container with ID starting with f6de9101571b9c99425130fab5119535f6bd14abd8d68df8092db72a11abaf37 not found: ID does not exist"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.416135 5033 scope.go:117] "RemoveContainer" containerID="ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe"
Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.416371 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe\": container with ID starting with ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe not found: ID does not exist" containerID="ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.416396 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe"} err="failed to get container status \"ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe\": rpc error: code = NotFound desc = could not find container \"ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe\": container with ID starting with ec6ee5082ade5cf7960ee5e616585a2a8f1b973f3fc5a9094edf80ac7cc5a7fe not found: ID does not exist"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.416411 5033 scope.go:117] "RemoveContainer" containerID="fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec"
Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.416631 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec\": container with ID starting with fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec not found: ID does not exist" containerID="fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.416658 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec"} err="failed to get container status \"fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec\": rpc error: code = NotFound desc = could not find container \"fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec\": container with ID starting with fd8c48579f2c5f65259d345b48281996b882b042925b9b1bc443beeb19414fec not found: ID does not exist"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.416675 5033 scope.go:117] "RemoveContainer" containerID="66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6"
Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.416896 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6\": container with ID starting with 66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6 not found: ID does not exist" containerID="66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.416919 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6"} err="failed to get container status \"66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6\": rpc error: code = NotFound desc = could not find container \"66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6\": container with ID starting with 66094e44f1089def929d52f93352f3fd49b2aadd4abfade09e4ce622f46578c6 not found: ID does not exist"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.416938 5033 scope.go:117] "RemoveContainer" containerID="7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2"
Feb 26 16:07:29 crc kubenswrapper[5033]: E0226 16:07:29.417186 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2\": container with ID starting with 7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2 not found: ID does not exist" containerID="7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.417219 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2"} err="failed to get container status \"7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2\": rpc error: code = NotFound desc = could not find container \"7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2\": container with ID starting with 7b8cd5e3adff2a630421593f53c6615b3d6f08fe960b869cd4dd34b38972fbb2 not found: ID does not exist"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.638485 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" path="/var/lib/kubelet/pods/0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6/volumes"
Feb 26 16:07:29 crc kubenswrapper[5033]: I0226 16:07:29.639423 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" path="/var/lib/kubelet/pods/8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9/volumes"
Feb 26 16:07:50 crc kubenswrapper[5033]: I0226 16:07:50.188970 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:07:50 crc kubenswrapper[5033]: I0226 16:07:50.189687 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:07:50 crc kubenswrapper[5033]: I0226 16:07:50.189754 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5"
Feb 26 16:07:50 crc kubenswrapper[5033]: I0226 16:07:50.190680 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 16:07:50 crc kubenswrapper[5033]: I0226 16:07:50.190771 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" gracePeriod=600
Feb 26 16:07:50 crc kubenswrapper[5033]: E0226 16:07:50.320877 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:07:51 crc kubenswrapper[5033]: I0226 16:07:51.162361 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" exitCode=0
Feb 26 16:07:51 crc kubenswrapper[5033]: I0226 16:07:51.162455 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465"}
Feb 26 16:07:51 crc kubenswrapper[5033]: I0226 16:07:51.162815 5033 scope.go:117] "RemoveContainer" containerID="7ce282c1f9d55fd8756822a733df8c8b2d5c1a2cdad655e8d68dfed7a9540071"
Feb 26 16:07:51 crc kubenswrapper[5033]: I0226 16:07:51.163682 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465"
Feb 26 16:07:51 crc kubenswrapper[5033]: E0226 16:07:51.163975 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.142372 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535368-tklxx"]
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143291 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" containerName="galera"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143308 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" containerName="galera"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143324 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-server"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143332 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-server"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143351 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783187df-a438-4fb2-97be-3565e08e88ac" containerName="kube-state-metrics"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143358 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="783187df-a438-4fb2-97be-3565e08e88ac" containerName="kube-state-metrics"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143372 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" containerName="glance-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143379 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" containerName="glance-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143393 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801b1613-3ed1-423d-85cf-6b87f6711034" containerName="cinder-scheduler"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143400 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="801b1613-3ed1-423d-85cf-6b87f6711034" containerName="cinder-scheduler"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143410 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerName="placement-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143422 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerName="placement-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143433 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcb38b4-0144-434f-adc1-63a2b63565ee" containerName="nova-cell0-conductor-conductor"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143442 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcb38b4-0144-434f-adc1-63a2b63565ee" containerName="nova-cell0-conductor-conductor"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143450 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-metadata"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143460 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-metadata"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143474 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-expirer"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143482 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-expirer"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143494 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerName="barbican-api-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143501 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerName="barbican-api-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143514 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="443e562a-aefe-46f3-a86f-59205f5849c7" containerName="setup-container"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143522 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="443e562a-aefe-46f3-a86f-59205f5849c7" containerName="setup-container"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143532 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94562cbd-1b98-4b30-9a97-f26fb103bb72" containerName="rabbitmq"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143540 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="94562cbd-1b98-4b30-9a97-f26fb103bb72" containerName="rabbitmq"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143555 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be41eafa-8738-4d4a-9f9e-592c105bc766" containerName="memcached"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143563 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="be41eafa-8738-4d4a-9f9e-592c105bc766" containerName="memcached"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143577 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d989fd27-f36c-4503-a400-e09bb4e70398" containerName="nova-scheduler-scheduler"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143584 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d989fd27-f36c-4503-a400-e09bb4e70398" containerName="nova-scheduler-scheduler"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143596 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0b01b9-0085-4552-8a7d-1e7359dad79b" containerName="glance-httpd"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143603 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0b01b9-0085-4552-8a7d-1e7359dad79b" containerName="glance-httpd"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143610 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="ceilometer-central-agent"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143618 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="ceilometer-central-agent"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143630 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-auditor"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143637 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-auditor"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143650 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e80c8c9-0f28-488c-9b68-ff4d379f9059" containerName="keystone-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143657 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e80c8c9-0f28-488c-9b68-ff4d379f9059" containerName="keystone-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143666 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerName="placement-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143672 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerName="placement-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143680 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="443e562a-aefe-46f3-a86f-59205f5849c7" containerName="rabbitmq"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143688 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="443e562a-aefe-46f3-a86f-59205f5849c7" containerName="rabbitmq"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143698 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-replicator"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143705 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-replicator"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143714 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94562cbd-1b98-4b30-9a97-f26fb103bb72" containerName="setup-container"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143720 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="94562cbd-1b98-4b30-9a97-f26fb103bb72" containerName="setup-container"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143731 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" containerName="glance-httpd"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143738 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" containerName="glance-httpd"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143749 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerName="nova-api-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143755 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerName="nova-api-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143769 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovs-vswitchd"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143776 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovs-vswitchd"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143787 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a5a420-8682-4780-8349-3da44c8d7f1a" containerName="openstack-network-exporter"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143794 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a5a420-8682-4780-8349-3da44c8d7f1a" containerName="openstack-network-exporter"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143804 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="ceilometer-notification-agent"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143812 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="ceilometer-notification-agent"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143822 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-server"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143828 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-server"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143836 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-replicator"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143844 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-replicator"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143854 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-replicator"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143861 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-replicator"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143870 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd95e3d9-8fc5-4f14-a009-cd1749743679" containerName="mariadb-account-create-update"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143877 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd95e3d9-8fc5-4f14-a009-cd1749743679" containerName="mariadb-account-create-update"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143885 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801b1613-3ed1-423d-85cf-6b87f6711034" containerName="probe"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143891 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="801b1613-3ed1-423d-85cf-6b87f6711034" containerName="probe"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143900 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" containerName="cinder-api-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143907 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" containerName="cinder-api-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143919 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a5a420-8682-4780-8349-3da44c8d7f1a" containerName="ovn-northd"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143926 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a5a420-8682-4780-8349-3da44c8d7f1a" containerName="ovn-northd"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143934 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-auditor"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143940 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-auditor"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143948 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerName="nova-api-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143954 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerName="nova-api-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143962 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="swift-recon-cron"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143968 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="swift-recon-cron"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143979 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44171111-32bf-415d-97c9-f966209cd0a0" containerName="ovn-controller"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.143986 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="44171111-32bf-415d-97c9-f966209cd0a0" containerName="ovn-controller"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.143999 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerName="barbican-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144006 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerName="barbican-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144015 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f57fe7-555b-4119-b03e-fafb151494a6" containerName="nova-cell1-conductor-conductor"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144021 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f57fe7-555b-4119-b03e-fafb151494a6" containerName="nova-cell1-conductor-conductor"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144029 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-updater"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144036 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-updater"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144045 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-reaper"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144053 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-reaper"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144063 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-updater"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144070 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-updater"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144084 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144091 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144106 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144113 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144125 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-auditor"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144133 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-auditor"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144144 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" containerName="neutron-httpd"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144152 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" containerName="neutron-httpd"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144165 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" containerName="cinder-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144173 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" containerName="cinder-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144181 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" containerName="mysql-bootstrap"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144189 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" containerName="mysql-bootstrap"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144200 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="sg-core"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144207 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="sg-core"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144214 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0b01b9-0085-4552-8a7d-1e7359dad79b" containerName="glance-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144222 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0b01b9-0085-4552-8a7d-1e7359dad79b" containerName="glance-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144231 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-server"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144238 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-server"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144248 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd95e3d9-8fc5-4f14-a009-cd1749743679" containerName="mariadb-account-create-update"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144278 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd95e3d9-8fc5-4f14-a009-cd1749743679" containerName="mariadb-account-create-update"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144316 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" containerName="neutron-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144324 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" containerName="neutron-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144336 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="proxy-httpd"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144344 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="proxy-httpd"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144356 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server-init"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144363 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server-init"
Feb 26 16:08:00 crc kubenswrapper[5033]: E0226 16:08:00.144371 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="rsync"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144377 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="rsync"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144529 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-replicator"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144550 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" containerName="neutron-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144562 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="sg-core"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144572 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-replicator"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144585 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="ceilometer-notification-agent"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144594 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-server"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144604 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-metadata"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144619 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-server"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144630 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" containerName="cinder-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144640 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="801b1613-3ed1-423d-85cf-6b87f6711034" containerName="cinder-scheduler"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144648 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerName="placement-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144663 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="44171111-32bf-415d-97c9-f966209cd0a0" containerName="ovn-controller"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144676 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerName="nova-api-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144689 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e80c8c9-0f28-488c-9b68-ff4d379f9059" containerName="keystone-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144698 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd95e3d9-8fc5-4f14-a009-cd1749743679" containerName="mariadb-account-create-update"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144706 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef30f16-84a1-4809-8ec4-0c36e315c5f3" containerName="placement-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144718 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-updater"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144726 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="account-reaper"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144737 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dcb38b4-0144-434f-adc1-63a2b63565ee" containerName="nova-cell0-conductor-conductor"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144750 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a5a420-8682-4780-8349-3da44c8d7f1a" containerName="ovn-northd"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144763 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="swift-recon-cron"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144777 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d989fd27-f36c-4503-a400-e09bb4e70398" containerName="nova-scheduler-scheduler"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144786 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="443e562a-aefe-46f3-a86f-59205f5849c7" containerName="rabbitmq"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144797 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab4047a-07ab-4c15-88eb-7dbefe5bea78" containerName="cinder-api-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144809 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-auditor"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144819 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-auditor"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144827 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" containerName="glance-log"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144838 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="801b1613-3ed1-423d-85cf-6b87f6711034" containerName="probe"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144847 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-expirer"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144857 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerName="barbican-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144866 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2a5173-ce4c-4d4a-a136-70356adabbb9" containerName="nova-api-api"
Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144875 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-server" Feb 
26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144884 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa45e26-ccbe-4827-8ddd-ebb5308b98ec" containerName="galera" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144892 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovsdb-server" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144903 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf0b01b9-0085-4552-8a7d-1e7359dad79b" containerName="glance-log" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144914 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="container-replicator" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144922 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="998d4b12-ebbc-4f93-8ef4-dcd5898b5cc3" containerName="glance-httpd" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144933 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="94562cbd-1b98-4b30-9a97-f26fb103bb72" containerName="rabbitmq" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144940 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9646add7-5da6-4590-9394-b4325e8ebd2d" containerName="barbican-api-log" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144950 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4c6e17-4b51-4387-98c8-d071c1fc7c5a" containerName="nova-metadata-log" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144958 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="be41eafa-8738-4d4a-9f9e-592c105bc766" containerName="memcached" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144967 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" 
containerName="account-auditor" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144977 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="rsync" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144988 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="db7073ea-074d-41d7-b114-9d6b8e02c3a0" containerName="neutron-httpd" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.144996 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a5a420-8682-4780-8349-3da44c8d7f1a" containerName="openstack-network-exporter" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.145008 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="ceilometer-central-agent" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.145017 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd331de5-997f-4bb2-a032-d174a0bf059b" containerName="proxy-httpd" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.145028 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd63f58-35b3-4eeb-a6af-6d1226ee2fb9" containerName="object-updater" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.145038 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="17f57fe7-555b-4119-b03e-fafb151494a6" containerName="nova-cell1-conductor-conductor" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.145049 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="783187df-a438-4fb2-97be-3565e08e88ac" containerName="kube-state-metrics" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.145058 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d8f31df-b55c-44c6-88e4-3b0d0fb6c1e6" containerName="ovs-vswitchd" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.145070 5033 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="bf0b01b9-0085-4552-8a7d-1e7359dad79b" containerName="glance-httpd" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.145619 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535368-tklxx" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.150593 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.150676 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.150955 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.153026 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535368-tklxx"] Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.252105 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfq9l\" (UniqueName: \"kubernetes.io/projected/2fb768e0-695f-4724-95e2-a6b7489caf29-kube-api-access-tfq9l\") pod \"auto-csr-approver-29535368-tklxx\" (UID: \"2fb768e0-695f-4724-95e2-a6b7489caf29\") " pod="openshift-infra/auto-csr-approver-29535368-tklxx" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.353481 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfq9l\" (UniqueName: \"kubernetes.io/projected/2fb768e0-695f-4724-95e2-a6b7489caf29-kube-api-access-tfq9l\") pod \"auto-csr-approver-29535368-tklxx\" (UID: \"2fb768e0-695f-4724-95e2-a6b7489caf29\") " pod="openshift-infra/auto-csr-approver-29535368-tklxx" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.374893 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfq9l\" 
(UniqueName: \"kubernetes.io/projected/2fb768e0-695f-4724-95e2-a6b7489caf29-kube-api-access-tfq9l\") pod \"auto-csr-approver-29535368-tklxx\" (UID: \"2fb768e0-695f-4724-95e2-a6b7489caf29\") " pod="openshift-infra/auto-csr-approver-29535368-tklxx" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.501298 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535368-tklxx" Feb 26 16:08:00 crc kubenswrapper[5033]: I0226 16:08:00.926356 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535368-tklxx"] Feb 26 16:08:01 crc kubenswrapper[5033]: I0226 16:08:01.269152 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535368-tklxx" event={"ID":"2fb768e0-695f-4724-95e2-a6b7489caf29","Type":"ContainerStarted","Data":"703c6ef0c09c2cbab5c28bbc8d7f8f038e8f05e2b6356f51bd3c1a968b1719ba"} Feb 26 16:08:01 crc kubenswrapper[5033]: I0226 16:08:01.629382 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:08:01 crc kubenswrapper[5033]: E0226 16:08:01.629613 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:08:02 crc kubenswrapper[5033]: I0226 16:08:02.278186 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535368-tklxx" event={"ID":"2fb768e0-695f-4724-95e2-a6b7489caf29","Type":"ContainerStarted","Data":"b0f512828eb2b21c3bc6f935e6daa0e77c3de9add7b130c5dbee2e2be3eef863"} Feb 26 16:08:02 crc kubenswrapper[5033]: I0226 16:08:02.292720 
5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535368-tklxx" podStartSLOduration=1.282884794 podStartE2EDuration="2.292701532s" podCreationTimestamp="2026-02-26 16:08:00 +0000 UTC" firstStartedPulling="2026-02-26 16:08:00.932639557 +0000 UTC m=+1608.145238207" lastFinishedPulling="2026-02-26 16:08:01.942456275 +0000 UTC m=+1609.155054945" observedRunningTime="2026-02-26 16:08:02.291561723 +0000 UTC m=+1609.504160373" watchObservedRunningTime="2026-02-26 16:08:02.292701532 +0000 UTC m=+1609.505300182" Feb 26 16:08:03 crc kubenswrapper[5033]: I0226 16:08:03.289560 5033 generic.go:334] "Generic (PLEG): container finished" podID="2fb768e0-695f-4724-95e2-a6b7489caf29" containerID="b0f512828eb2b21c3bc6f935e6daa0e77c3de9add7b130c5dbee2e2be3eef863" exitCode=0 Feb 26 16:08:03 crc kubenswrapper[5033]: I0226 16:08:03.289642 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535368-tklxx" event={"ID":"2fb768e0-695f-4724-95e2-a6b7489caf29","Type":"ContainerDied","Data":"b0f512828eb2b21c3bc6f935e6daa0e77c3de9add7b130c5dbee2e2be3eef863"} Feb 26 16:08:04 crc kubenswrapper[5033]: I0226 16:08:04.545972 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535368-tklxx" Feb 26 16:08:04 crc kubenswrapper[5033]: I0226 16:08:04.613778 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfq9l\" (UniqueName: \"kubernetes.io/projected/2fb768e0-695f-4724-95e2-a6b7489caf29-kube-api-access-tfq9l\") pod \"2fb768e0-695f-4724-95e2-a6b7489caf29\" (UID: \"2fb768e0-695f-4724-95e2-a6b7489caf29\") " Feb 26 16:08:04 crc kubenswrapper[5033]: I0226 16:08:04.620554 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fb768e0-695f-4724-95e2-a6b7489caf29-kube-api-access-tfq9l" (OuterVolumeSpecName: "kube-api-access-tfq9l") pod "2fb768e0-695f-4724-95e2-a6b7489caf29" (UID: "2fb768e0-695f-4724-95e2-a6b7489caf29"). InnerVolumeSpecName "kube-api-access-tfq9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:04 crc kubenswrapper[5033]: I0226 16:08:04.715761 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfq9l\" (UniqueName: \"kubernetes.io/projected/2fb768e0-695f-4724-95e2-a6b7489caf29-kube-api-access-tfq9l\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:05 crc kubenswrapper[5033]: I0226 16:08:05.310336 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535368-tklxx" event={"ID":"2fb768e0-695f-4724-95e2-a6b7489caf29","Type":"ContainerDied","Data":"703c6ef0c09c2cbab5c28bbc8d7f8f038e8f05e2b6356f51bd3c1a968b1719ba"} Feb 26 16:08:05 crc kubenswrapper[5033]: I0226 16:08:05.310423 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="703c6ef0c09c2cbab5c28bbc8d7f8f038e8f05e2b6356f51bd3c1a968b1719ba" Feb 26 16:08:05 crc kubenswrapper[5033]: I0226 16:08:05.310429 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535368-tklxx" Feb 26 16:08:05 crc kubenswrapper[5033]: I0226 16:08:05.380855 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535362-5fl8x"] Feb 26 16:08:05 crc kubenswrapper[5033]: I0226 16:08:05.387515 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535362-5fl8x"] Feb 26 16:08:05 crc kubenswrapper[5033]: I0226 16:08:05.637348 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66182ef7-3ffe-4652-bd43-e0d8be6ea7fa" path="/var/lib/kubelet/pods/66182ef7-3ffe-4652-bd43-e0d8be6ea7fa/volumes" Feb 26 16:08:14 crc kubenswrapper[5033]: I0226 16:08:14.628682 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:08:14 crc kubenswrapper[5033]: E0226 16:08:14.629858 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:08:27 crc kubenswrapper[5033]: I0226 16:08:27.893673 5033 scope.go:117] "RemoveContainer" containerID="ff33f6599e5ddd894eee122447342aa510b435cb73d1291495aac3158217c977" Feb 26 16:08:27 crc kubenswrapper[5033]: I0226 16:08:27.939637 5033 scope.go:117] "RemoveContainer" containerID="49468fafa60ce3443569ed80de49ce1efc39535d4d172f5cb2cbc367b94cfb46" Feb 26 16:08:27 crc kubenswrapper[5033]: I0226 16:08:27.977936 5033 scope.go:117] "RemoveContainer" containerID="67e9048d9f60698463f280ec768071214417ebd7eeb7838d27122da3960e1adf" Feb 26 16:08:27 crc kubenswrapper[5033]: I0226 16:08:27.996627 5033 scope.go:117] "RemoveContainer" 
containerID="6a7d7dbe02afd322b3e2d289d323bed25dba88e717f698d8dc49df491eab87d0" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.024134 5033 scope.go:117] "RemoveContainer" containerID="172da1142b99165336a234020b93a79cc90ba02e0907932c242923af1b6dd5cf" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.043973 5033 scope.go:117] "RemoveContainer" containerID="98708fa1df8c223588f0d564d0bda8f30f7950dcb996257462125b57ce97c4e1" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.074587 5033 scope.go:117] "RemoveContainer" containerID="d4df572ff20d78acf262ce21a0a068f8731d9a8a8e278d67a66bcc4ef90cd096" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.100187 5033 scope.go:117] "RemoveContainer" containerID="ba8083c3d660d1cba09283dbdbdc8e9d22c0f6c25c7dc930e73006ee1f1e2a16" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.124436 5033 scope.go:117] "RemoveContainer" containerID="842f851e2c6783742a1f73d15c38ae7e35440285fe1efdf4007c5f41253c5d16" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.163595 5033 scope.go:117] "RemoveContainer" containerID="fd7383dc4bc45e9bd455588f2952d88a9cce86734afce891b6ca79bc4eaa00c8" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.197652 5033 scope.go:117] "RemoveContainer" containerID="1c9c4fd07b0b2d14f63fa13680cdf51683f0305829e18896e63fa3f4e39ded84" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.219117 5033 scope.go:117] "RemoveContainer" containerID="b14df90635c8f9f683dd06ebec2c2537674122a3b91f3dd889cb1e109a2874d9" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.241145 5033 scope.go:117] "RemoveContainer" containerID="d0ff186c2752535cbe0156aa95c8b137562b367feee80bda35bbc94a62258970" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.262218 5033 scope.go:117] "RemoveContainer" containerID="8aa3061c2c3213ed0df4860c6654d27a0c25ada76528ffc497cb2e23fc08edc9" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.287089 5033 scope.go:117] "RemoveContainer" 
containerID="a305a66c7d239ac14f07cf5803873ef4155ad10317dad1eda014aca0549ea266" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.307676 5033 scope.go:117] "RemoveContainer" containerID="ca9dacffdf3c3bcb9175574af945576f2cf13f888c591849b03f35e84cfd9fa4" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.324434 5033 scope.go:117] "RemoveContainer" containerID="6046c4769194f5929905a2674f35beebb80e8dba1a305d0e97be6b50d6ce657f" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.343610 5033 scope.go:117] "RemoveContainer" containerID="7afaf1da9f235b64e0b8813df9a957f0a8325406367387aae716204d0ae91726" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.363964 5033 scope.go:117] "RemoveContainer" containerID="d9d43a57c427770c384b76badb61f86c68f296b829ada01856cbd642d294f82c" Feb 26 16:08:28 crc kubenswrapper[5033]: I0226 16:08:28.387252 5033 scope.go:117] "RemoveContainer" containerID="20b248e0fb1d9960be2a68c40ac5b87b789f7097644b42650194a80f3d646412" Feb 26 16:08:29 crc kubenswrapper[5033]: I0226 16:08:29.628714 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:08:29 crc kubenswrapper[5033]: E0226 16:08:29.629359 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:08:40 crc kubenswrapper[5033]: I0226 16:08:40.628538 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:08:40 crc kubenswrapper[5033]: E0226 16:08:40.629376 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:08:51 crc kubenswrapper[5033]: I0226 16:08:51.628533 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:08:51 crc kubenswrapper[5033]: E0226 16:08:51.629528 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:09:04 crc kubenswrapper[5033]: I0226 16:09:04.628533 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:09:04 crc kubenswrapper[5033]: E0226 16:09:04.629291 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:09:18 crc kubenswrapper[5033]: I0226 16:09:18.628449 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:09:18 crc kubenswrapper[5033]: E0226 16:09:18.629177 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:09:28 crc kubenswrapper[5033]: I0226 16:09:28.669403 5033 scope.go:117] "RemoveContainer" containerID="821f636cc982e0875bd9d953abc8ae92ee1bf3e1a4ab78c45eca149715e8ab3b" Feb 26 16:09:28 crc kubenswrapper[5033]: I0226 16:09:28.718219 5033 scope.go:117] "RemoveContainer" containerID="27390461cfcd9e9921cea3efef5e382089d375c8f6c002d59c240bf21f61222b" Feb 26 16:09:28 crc kubenswrapper[5033]: I0226 16:09:28.749722 5033 scope.go:117] "RemoveContainer" containerID="88846ecdfcf0ca846f3ca53a24d1fe2b4ad8f940ade0f867c6bb6c964a133abd" Feb 26 16:09:28 crc kubenswrapper[5033]: I0226 16:09:28.800165 5033 scope.go:117] "RemoveContainer" containerID="4fc856374b9e4577d7bc3935a637339263321a668147cce6238a1cda79ff279b" Feb 26 16:09:28 crc kubenswrapper[5033]: I0226 16:09:28.846047 5033 scope.go:117] "RemoveContainer" containerID="d97ab34cf5faf76006f9859b79ccdbff315105cdef47777cbd0d6eac5ee0600b" Feb 26 16:09:28 crc kubenswrapper[5033]: I0226 16:09:28.873754 5033 scope.go:117] "RemoveContainer" containerID="444638d70c14e26c0e7807701010c49966be1b71768de02d5f5fcc407e2851b7" Feb 26 16:09:28 crc kubenswrapper[5033]: I0226 16:09:28.892031 5033 scope.go:117] "RemoveContainer" containerID="8cffddd5fbb4d925155c313ad8810b6f0ee25c013102a3cec9d3a4efd08a23af" Feb 26 16:09:29 crc kubenswrapper[5033]: I0226 16:09:29.628306 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:09:29 crc kubenswrapper[5033]: E0226 16:09:29.628546 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:09:41 crc kubenswrapper[5033]: I0226 16:09:41.628115 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:09:41 crc kubenswrapper[5033]: E0226 16:09:41.628877 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:09:54 crc kubenswrapper[5033]: I0226 16:09:54.628246 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:09:54 crc kubenswrapper[5033]: E0226 16:09:54.629053 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.139441 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535370-zdqzm"] Feb 26 16:10:00 crc kubenswrapper[5033]: E0226 16:10:00.140064 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb768e0-695f-4724-95e2-a6b7489caf29" containerName="oc" Feb 
26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.140076 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb768e0-695f-4724-95e2-a6b7489caf29" containerName="oc" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.140211 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd95e3d9-8fc5-4f14-a009-cd1749743679" containerName="mariadb-account-create-update" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.140221 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fb768e0-695f-4724-95e2-a6b7489caf29" containerName="oc" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.140760 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535370-zdqzm" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.142791 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.143417 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.143724 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.152413 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535370-zdqzm"] Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.234830 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5rmc\" (UniqueName: \"kubernetes.io/projected/6b59b936-69de-47eb-b624-06122f76e5fd-kube-api-access-s5rmc\") pod \"auto-csr-approver-29535370-zdqzm\" (UID: \"6b59b936-69de-47eb-b624-06122f76e5fd\") " pod="openshift-infra/auto-csr-approver-29535370-zdqzm" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.335802 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5rmc\" (UniqueName: \"kubernetes.io/projected/6b59b936-69de-47eb-b624-06122f76e5fd-kube-api-access-s5rmc\") pod \"auto-csr-approver-29535370-zdqzm\" (UID: \"6b59b936-69de-47eb-b624-06122f76e5fd\") " pod="openshift-infra/auto-csr-approver-29535370-zdqzm" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.357741 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5rmc\" (UniqueName: \"kubernetes.io/projected/6b59b936-69de-47eb-b624-06122f76e5fd-kube-api-access-s5rmc\") pod \"auto-csr-approver-29535370-zdqzm\" (UID: \"6b59b936-69de-47eb-b624-06122f76e5fd\") " pod="openshift-infra/auto-csr-approver-29535370-zdqzm" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.458967 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535370-zdqzm" Feb 26 16:10:00 crc kubenswrapper[5033]: I0226 16:10:00.905698 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535370-zdqzm"] Feb 26 16:10:01 crc kubenswrapper[5033]: I0226 16:10:01.259536 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535370-zdqzm" event={"ID":"6b59b936-69de-47eb-b624-06122f76e5fd","Type":"ContainerStarted","Data":"e705a5312949cfdde37080c09974c2b5442f398a06b357c6c01e894e0153a754"} Feb 26 16:10:03 crc kubenswrapper[5033]: I0226 16:10:03.275358 5033 generic.go:334] "Generic (PLEG): container finished" podID="6b59b936-69de-47eb-b624-06122f76e5fd" containerID="722ea520796dc321577bb30d9984cccb411f46bdfc3ead546cfba659036b1456" exitCode=0 Feb 26 16:10:03 crc kubenswrapper[5033]: I0226 16:10:03.275673 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535370-zdqzm" 
event={"ID":"6b59b936-69de-47eb-b624-06122f76e5fd","Type":"ContainerDied","Data":"722ea520796dc321577bb30d9984cccb411f46bdfc3ead546cfba659036b1456"} Feb 26 16:10:04 crc kubenswrapper[5033]: I0226 16:10:04.577101 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535370-zdqzm" Feb 26 16:10:04 crc kubenswrapper[5033]: I0226 16:10:04.697414 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5rmc\" (UniqueName: \"kubernetes.io/projected/6b59b936-69de-47eb-b624-06122f76e5fd-kube-api-access-s5rmc\") pod \"6b59b936-69de-47eb-b624-06122f76e5fd\" (UID: \"6b59b936-69de-47eb-b624-06122f76e5fd\") " Feb 26 16:10:04 crc kubenswrapper[5033]: I0226 16:10:04.702546 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b59b936-69de-47eb-b624-06122f76e5fd-kube-api-access-s5rmc" (OuterVolumeSpecName: "kube-api-access-s5rmc") pod "6b59b936-69de-47eb-b624-06122f76e5fd" (UID: "6b59b936-69de-47eb-b624-06122f76e5fd"). InnerVolumeSpecName "kube-api-access-s5rmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:10:04 crc kubenswrapper[5033]: I0226 16:10:04.798680 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5rmc\" (UniqueName: \"kubernetes.io/projected/6b59b936-69de-47eb-b624-06122f76e5fd-kube-api-access-s5rmc\") on node \"crc\" DevicePath \"\"" Feb 26 16:10:05 crc kubenswrapper[5033]: I0226 16:10:05.296088 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535370-zdqzm" event={"ID":"6b59b936-69de-47eb-b624-06122f76e5fd","Type":"ContainerDied","Data":"e705a5312949cfdde37080c09974c2b5442f398a06b357c6c01e894e0153a754"} Feb 26 16:10:05 crc kubenswrapper[5033]: I0226 16:10:05.296743 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e705a5312949cfdde37080c09974c2b5442f398a06b357c6c01e894e0153a754" Feb 26 16:10:05 crc kubenswrapper[5033]: I0226 16:10:05.296171 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535370-zdqzm" Feb 26 16:10:05 crc kubenswrapper[5033]: I0226 16:10:05.650013 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535364-6vq2m"] Feb 26 16:10:05 crc kubenswrapper[5033]: I0226 16:10:05.655286 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535364-6vq2m"] Feb 26 16:10:06 crc kubenswrapper[5033]: I0226 16:10:06.628419 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:10:06 crc kubenswrapper[5033]: E0226 16:10:06.628695 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:10:07 crc kubenswrapper[5033]: I0226 16:10:07.637347 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f5bf1f-dc74-40b4-beb7-4fbe927804c0" path="/var/lib/kubelet/pods/46f5bf1f-dc74-40b4-beb7-4fbe927804c0/volumes" Feb 26 16:10:21 crc kubenswrapper[5033]: I0226 16:10:21.628846 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:10:21 crc kubenswrapper[5033]: E0226 16:10:21.629753 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:10:29 crc kubenswrapper[5033]: I0226 16:10:29.001556 5033 scope.go:117] "RemoveContainer" containerID="7290302f1aa5e4c3e4db1afa9200af7fbef86671c0765f651efc74f6caefccdc" Feb 26 16:10:29 crc kubenswrapper[5033]: I0226 16:10:29.024317 5033 scope.go:117] "RemoveContainer" containerID="1e8f8f1fb9302e352c7430cffae0a9881509b338bd958c62de8c5b00c66ea405" Feb 26 16:10:29 crc kubenswrapper[5033]: I0226 16:10:29.049773 5033 scope.go:117] "RemoveContainer" containerID="a4dfe992eb2b9293a5a29973827b5efe4e44f2fdd0cbef2c70caa804c30efe24" Feb 26 16:10:29 crc kubenswrapper[5033]: I0226 16:10:29.072176 5033 scope.go:117] "RemoveContainer" containerID="4afe6aff4b824c652ac3295bec5f930e6e6d034934db5c1edaa5c7cba85591a9" Feb 26 16:10:29 crc kubenswrapper[5033]: I0226 16:10:29.121034 5033 scope.go:117] "RemoveContainer" containerID="5b1ffc7d5c02c07b9555b87eb62633cee02ea5a2f677dd7257675723ea705b37" Feb 26 16:10:29 crc kubenswrapper[5033]: I0226 16:10:29.144928 5033 
scope.go:117] "RemoveContainer" containerID="3be7d1a82e288ba046a1c6b4bbc24dae70e39ba857a65876854c21c435f5b1de" Feb 26 16:10:29 crc kubenswrapper[5033]: I0226 16:10:29.173362 5033 scope.go:117] "RemoveContainer" containerID="601ec1341d1e2b8c27d9c9b09bc6aa50cda535ffeee2dea5a0f7b8a4a08ee53d" Feb 26 16:10:35 crc kubenswrapper[5033]: I0226 16:10:35.628408 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:10:35 crc kubenswrapper[5033]: E0226 16:10:35.629433 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:10:50 crc kubenswrapper[5033]: I0226 16:10:50.628582 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:10:50 crc kubenswrapper[5033]: E0226 16:10:50.629390 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:11:01 crc kubenswrapper[5033]: I0226 16:11:01.628341 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:11:01 crc kubenswrapper[5033]: E0226 16:11:01.629278 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:11:16 crc kubenswrapper[5033]: I0226 16:11:16.628504 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:11:16 crc kubenswrapper[5033]: E0226 16:11:16.629556 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:11:27 crc kubenswrapper[5033]: I0226 16:11:27.628153 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:11:27 crc kubenswrapper[5033]: E0226 16:11:27.628833 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:11:29 crc kubenswrapper[5033]: I0226 16:11:29.272672 5033 scope.go:117] "RemoveContainer" containerID="1a1d164d9741ffa6240e670361f34560b7319eec8e30291e5bd6ef38daf1fe8d" Feb 26 16:11:29 crc kubenswrapper[5033]: I0226 16:11:29.317815 5033 scope.go:117] "RemoveContainer" 
containerID="a3eb7e03f8fefb3acf5770d5fe9e48de5bd74bdc5f586a4270cbdd4bdb99df39" Feb 26 16:11:29 crc kubenswrapper[5033]: I0226 16:11:29.360040 5033 scope.go:117] "RemoveContainer" containerID="0faa776ae6b63848c0d05a66870162df03f5366c75da8574c2561da35ab31883" Feb 26 16:11:41 crc kubenswrapper[5033]: I0226 16:11:41.627768 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:11:41 crc kubenswrapper[5033]: E0226 16:11:41.628374 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:11:56 crc kubenswrapper[5033]: I0226 16:11:56.627782 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:11:56 crc kubenswrapper[5033]: E0226 16:11:56.630057 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.155434 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535372-kdpjv"] Feb 26 16:12:00 crc kubenswrapper[5033]: E0226 16:12:00.156714 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b59b936-69de-47eb-b624-06122f76e5fd" containerName="oc" Feb 26 16:12:00 crc 
kubenswrapper[5033]: I0226 16:12:00.156759 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b59b936-69de-47eb-b624-06122f76e5fd" containerName="oc" Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.156924 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b59b936-69de-47eb-b624-06122f76e5fd" containerName="oc" Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.157502 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535372-kdpjv" Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.160066 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.160492 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.162844 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535372-kdpjv"] Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.163856 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.294009 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szlq6\" (UniqueName: \"kubernetes.io/projected/5402975e-da0a-4772-b4c8-23d1b6e6b0c3-kube-api-access-szlq6\") pod \"auto-csr-approver-29535372-kdpjv\" (UID: \"5402975e-da0a-4772-b4c8-23d1b6e6b0c3\") " pod="openshift-infra/auto-csr-approver-29535372-kdpjv" Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.395071 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szlq6\" (UniqueName: \"kubernetes.io/projected/5402975e-da0a-4772-b4c8-23d1b6e6b0c3-kube-api-access-szlq6\") pod \"auto-csr-approver-29535372-kdpjv\" 
(UID: \"5402975e-da0a-4772-b4c8-23d1b6e6b0c3\") " pod="openshift-infra/auto-csr-approver-29535372-kdpjv" Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.417956 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szlq6\" (UniqueName: \"kubernetes.io/projected/5402975e-da0a-4772-b4c8-23d1b6e6b0c3-kube-api-access-szlq6\") pod \"auto-csr-approver-29535372-kdpjv\" (UID: \"5402975e-da0a-4772-b4c8-23d1b6e6b0c3\") " pod="openshift-infra/auto-csr-approver-29535372-kdpjv" Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.478215 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535372-kdpjv" Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.909610 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535372-kdpjv"] Feb 26 16:12:00 crc kubenswrapper[5033]: I0226 16:12:00.918719 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:12:01 crc kubenswrapper[5033]: I0226 16:12:01.801873 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535372-kdpjv" event={"ID":"5402975e-da0a-4772-b4c8-23d1b6e6b0c3","Type":"ContainerStarted","Data":"15985f77020016a84b55beb386a039d379d767b29ca33ba1bc1286d029b714cb"} Feb 26 16:12:02 crc kubenswrapper[5033]: I0226 16:12:02.811499 5033 generic.go:334] "Generic (PLEG): container finished" podID="5402975e-da0a-4772-b4c8-23d1b6e6b0c3" containerID="812de937ad4e8d44c191adc4c0ad5bfeb57e0ff351d383736214e0c4eb73e592" exitCode=0 Feb 26 16:12:02 crc kubenswrapper[5033]: I0226 16:12:02.811741 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535372-kdpjv" event={"ID":"5402975e-da0a-4772-b4c8-23d1b6e6b0c3","Type":"ContainerDied","Data":"812de937ad4e8d44c191adc4c0ad5bfeb57e0ff351d383736214e0c4eb73e592"} Feb 26 16:12:04 crc kubenswrapper[5033]: I0226 
16:12:04.103960 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535372-kdpjv" Feb 26 16:12:04 crc kubenswrapper[5033]: I0226 16:12:04.263980 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szlq6\" (UniqueName: \"kubernetes.io/projected/5402975e-da0a-4772-b4c8-23d1b6e6b0c3-kube-api-access-szlq6\") pod \"5402975e-da0a-4772-b4c8-23d1b6e6b0c3\" (UID: \"5402975e-da0a-4772-b4c8-23d1b6e6b0c3\") " Feb 26 16:12:04 crc kubenswrapper[5033]: I0226 16:12:04.270614 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5402975e-da0a-4772-b4c8-23d1b6e6b0c3-kube-api-access-szlq6" (OuterVolumeSpecName: "kube-api-access-szlq6") pod "5402975e-da0a-4772-b4c8-23d1b6e6b0c3" (UID: "5402975e-da0a-4772-b4c8-23d1b6e6b0c3"). InnerVolumeSpecName "kube-api-access-szlq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:12:04 crc kubenswrapper[5033]: I0226 16:12:04.365934 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szlq6\" (UniqueName: \"kubernetes.io/projected/5402975e-da0a-4772-b4c8-23d1b6e6b0c3-kube-api-access-szlq6\") on node \"crc\" DevicePath \"\"" Feb 26 16:12:04 crc kubenswrapper[5033]: I0226 16:12:04.840136 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535372-kdpjv" event={"ID":"5402975e-da0a-4772-b4c8-23d1b6e6b0c3","Type":"ContainerDied","Data":"15985f77020016a84b55beb386a039d379d767b29ca33ba1bc1286d029b714cb"} Feb 26 16:12:04 crc kubenswrapper[5033]: I0226 16:12:04.840198 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15985f77020016a84b55beb386a039d379d767b29ca33ba1bc1286d029b714cb" Feb 26 16:12:04 crc kubenswrapper[5033]: I0226 16:12:04.840343 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535372-kdpjv" Feb 26 16:12:05 crc kubenswrapper[5033]: I0226 16:12:05.168837 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535366-kk997"] Feb 26 16:12:05 crc kubenswrapper[5033]: I0226 16:12:05.173291 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535366-kk997"] Feb 26 16:12:05 crc kubenswrapper[5033]: I0226 16:12:05.639153 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f" path="/var/lib/kubelet/pods/e9e2e282-a5ab-4fa8-89e9-b7eee6275f4f/volumes" Feb 26 16:12:11 crc kubenswrapper[5033]: I0226 16:12:11.628251 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:12:11 crc kubenswrapper[5033]: E0226 16:12:11.628819 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:12:22 crc kubenswrapper[5033]: I0226 16:12:22.628283 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:12:22 crc kubenswrapper[5033]: E0226 16:12:22.629007 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:12:29 crc kubenswrapper[5033]: I0226 16:12:29.437160 5033 scope.go:117] "RemoveContainer" containerID="761688171a3882c19e60e75384d7494e2eaf7d87bf59d1ddf87fe813049c4ba5" Feb 26 16:12:29 crc kubenswrapper[5033]: I0226 16:12:29.467229 5033 scope.go:117] "RemoveContainer" containerID="7c2836b72b99580d0853eb6d7ae0794bc4651ce530042e155abe1f4de07321f5" Feb 26 16:12:29 crc kubenswrapper[5033]: I0226 16:12:29.483671 5033 scope.go:117] "RemoveContainer" containerID="ea8b8c6bf7109310286c3358cc85b56b065d8fb055d796d627b747c0847016e4" Feb 26 16:12:29 crc kubenswrapper[5033]: I0226 16:12:29.527192 5033 scope.go:117] "RemoveContainer" containerID="6f81ef7e60c36350746ebf868687502dc916cf71af0c8866c4d79814b51151f7" Feb 26 16:12:29 crc kubenswrapper[5033]: I0226 16:12:29.547280 5033 scope.go:117] "RemoveContainer" containerID="efe61e5ba101f92c1e8b4416d9d0570b31c5948ae215ef72723140bf034dbde5" Feb 26 16:12:29 crc kubenswrapper[5033]: I0226 16:12:29.563277 5033 scope.go:117] "RemoveContainer" containerID="b6eace0b2103a031f250c85a90a9746babaf3bcd0afd8cd89a42175fe2fb3877" Feb 26 16:12:29 crc kubenswrapper[5033]: I0226 16:12:29.581841 5033 scope.go:117] "RemoveContainer" containerID="274baa6f0fac074022c217f76e1ef56693c6f2d00d433a2f9b4042c158da447b" Feb 26 16:12:29 crc kubenswrapper[5033]: I0226 16:12:29.603515 5033 scope.go:117] "RemoveContainer" containerID="8ba0e68a810486498a38130d5f7cd25d15eea20db5a058adc2d449d3b9f85eb5" Feb 26 16:12:29 crc kubenswrapper[5033]: I0226 16:12:29.633756 5033 scope.go:117] "RemoveContainer" containerID="5023613edb03d8f6e62ea75ac5a8258a7c708083278aa3e2673f05799f53cdb1" Feb 26 16:12:29 crc kubenswrapper[5033]: I0226 16:12:29.669945 5033 scope.go:117] "RemoveContainer" containerID="dc6dc75920b601a897218ec03556e7703f8571b16b9f948992ac68c3a552a5b8" Feb 26 16:12:29 crc kubenswrapper[5033]: I0226 16:12:29.689946 5033 scope.go:117] "RemoveContainer" 
containerID="fd787498335d14e2033ee5b470f24eb5998e703815bbba943d72e487ec251728" Feb 26 16:12:37 crc kubenswrapper[5033]: I0226 16:12:37.628161 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:12:37 crc kubenswrapper[5033]: E0226 16:12:37.629046 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:12:51 crc kubenswrapper[5033]: I0226 16:12:51.627930 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:12:52 crc kubenswrapper[5033]: I0226 16:12:52.221151 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"3f4ff328ef64b64bd90cfbfbff26eb9b2edc5412a004940ff047398b5e904341"} Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.144955 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535374-79fct"] Feb 26 16:14:00 crc kubenswrapper[5033]: E0226 16:14:00.145976 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5402975e-da0a-4772-b4c8-23d1b6e6b0c3" containerName="oc" Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.146000 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5402975e-da0a-4772-b4c8-23d1b6e6b0c3" containerName="oc" Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.146251 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="5402975e-da0a-4772-b4c8-23d1b6e6b0c3" containerName="oc" 
Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.147014 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535374-79fct" Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.150621 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.150806 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.151323 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.152090 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535374-79fct"] Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.303619 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdjmz\" (UniqueName: \"kubernetes.io/projected/e0936582-1465-439f-848a-bee5ff2bd040-kube-api-access-sdjmz\") pod \"auto-csr-approver-29535374-79fct\" (UID: \"e0936582-1465-439f-848a-bee5ff2bd040\") " pod="openshift-infra/auto-csr-approver-29535374-79fct" Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.405005 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdjmz\" (UniqueName: \"kubernetes.io/projected/e0936582-1465-439f-848a-bee5ff2bd040-kube-api-access-sdjmz\") pod \"auto-csr-approver-29535374-79fct\" (UID: \"e0936582-1465-439f-848a-bee5ff2bd040\") " pod="openshift-infra/auto-csr-approver-29535374-79fct" Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.422952 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdjmz\" (UniqueName: 
\"kubernetes.io/projected/e0936582-1465-439f-848a-bee5ff2bd040-kube-api-access-sdjmz\") pod \"auto-csr-approver-29535374-79fct\" (UID: \"e0936582-1465-439f-848a-bee5ff2bd040\") " pod="openshift-infra/auto-csr-approver-29535374-79fct" Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.468114 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535374-79fct" Feb 26 16:14:00 crc kubenswrapper[5033]: I0226 16:14:00.884161 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535374-79fct"] Feb 26 16:14:01 crc kubenswrapper[5033]: I0226 16:14:01.744639 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535374-79fct" event={"ID":"e0936582-1465-439f-848a-bee5ff2bd040","Type":"ContainerStarted","Data":"b6483d94fc3eb1c339a23343a1c44f6db51afb6d237ebfa53f03ce4a048e96bb"} Feb 26 16:14:02 crc kubenswrapper[5033]: I0226 16:14:02.753120 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535374-79fct" event={"ID":"e0936582-1465-439f-848a-bee5ff2bd040","Type":"ContainerStarted","Data":"36de38cc88a7f6ee5d66b7071512481cf87e2ea3c360c2796ee07b23eedf06da"} Feb 26 16:14:02 crc kubenswrapper[5033]: I0226 16:14:02.773792 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535374-79fct" podStartSLOduration=1.3433284730000001 podStartE2EDuration="2.773773433s" podCreationTimestamp="2026-02-26 16:14:00 +0000 UTC" firstStartedPulling="2026-02-26 16:14:00.891544695 +0000 UTC m=+1968.104143345" lastFinishedPulling="2026-02-26 16:14:02.321989645 +0000 UTC m=+1969.534588305" observedRunningTime="2026-02-26 16:14:02.767773468 +0000 UTC m=+1969.980372118" watchObservedRunningTime="2026-02-26 16:14:02.773773433 +0000 UTC m=+1969.986372083" Feb 26 16:14:03 crc kubenswrapper[5033]: I0226 16:14:03.760169 5033 generic.go:334] "Generic (PLEG): 
container finished" podID="e0936582-1465-439f-848a-bee5ff2bd040" containerID="36de38cc88a7f6ee5d66b7071512481cf87e2ea3c360c2796ee07b23eedf06da" exitCode=0 Feb 26 16:14:03 crc kubenswrapper[5033]: I0226 16:14:03.760298 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535374-79fct" event={"ID":"e0936582-1465-439f-848a-bee5ff2bd040","Type":"ContainerDied","Data":"36de38cc88a7f6ee5d66b7071512481cf87e2ea3c360c2796ee07b23eedf06da"} Feb 26 16:14:05 crc kubenswrapper[5033]: I0226 16:14:05.029465 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535374-79fct" Feb 26 16:14:05 crc kubenswrapper[5033]: I0226 16:14:05.172699 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdjmz\" (UniqueName: \"kubernetes.io/projected/e0936582-1465-439f-848a-bee5ff2bd040-kube-api-access-sdjmz\") pod \"e0936582-1465-439f-848a-bee5ff2bd040\" (UID: \"e0936582-1465-439f-848a-bee5ff2bd040\") " Feb 26 16:14:05 crc kubenswrapper[5033]: I0226 16:14:05.178197 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0936582-1465-439f-848a-bee5ff2bd040-kube-api-access-sdjmz" (OuterVolumeSpecName: "kube-api-access-sdjmz") pod "e0936582-1465-439f-848a-bee5ff2bd040" (UID: "e0936582-1465-439f-848a-bee5ff2bd040"). InnerVolumeSpecName "kube-api-access-sdjmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:14:05 crc kubenswrapper[5033]: I0226 16:14:05.274272 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdjmz\" (UniqueName: \"kubernetes.io/projected/e0936582-1465-439f-848a-bee5ff2bd040-kube-api-access-sdjmz\") on node \"crc\" DevicePath \"\"" Feb 26 16:14:05 crc kubenswrapper[5033]: I0226 16:14:05.777336 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535374-79fct" event={"ID":"e0936582-1465-439f-848a-bee5ff2bd040","Type":"ContainerDied","Data":"b6483d94fc3eb1c339a23343a1c44f6db51afb6d237ebfa53f03ce4a048e96bb"} Feb 26 16:14:05 crc kubenswrapper[5033]: I0226 16:14:05.777393 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6483d94fc3eb1c339a23343a1c44f6db51afb6d237ebfa53f03ce4a048e96bb" Feb 26 16:14:05 crc kubenswrapper[5033]: I0226 16:14:05.777436 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535374-79fct" Feb 26 16:14:05 crc kubenswrapper[5033]: I0226 16:14:05.849528 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535368-tklxx"] Feb 26 16:14:05 crc kubenswrapper[5033]: I0226 16:14:05.856519 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535368-tklxx"] Feb 26 16:14:07 crc kubenswrapper[5033]: I0226 16:14:07.636561 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fb768e0-695f-4724-95e2-a6b7489caf29" path="/var/lib/kubelet/pods/2fb768e0-695f-4724-95e2-a6b7489caf29/volumes" Feb 26 16:14:29 crc kubenswrapper[5033]: I0226 16:14:29.787351 5033 scope.go:117] "RemoveContainer" containerID="b0f512828eb2b21c3bc6f935e6daa0e77c3de9add7b130c5dbee2e2be3eef863" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.148573 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z"] Feb 26 16:15:00 crc kubenswrapper[5033]: E0226 16:15:00.149468 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0936582-1465-439f-848a-bee5ff2bd040" containerName="oc" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.149486 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0936582-1465-439f-848a-bee5ff2bd040" containerName="oc" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.149635 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0936582-1465-439f-848a-bee5ff2bd040" containerName="oc" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.150081 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.152760 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.153385 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.157935 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z"] Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.334520 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w45t9\" (UniqueName: \"kubernetes.io/projected/923c05a2-b537-4cd4-9cc9-1484790d83f7-kube-api-access-w45t9\") pod \"collect-profiles-29535375-txk7z\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.334814 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/923c05a2-b537-4cd4-9cc9-1484790d83f7-config-volume\") pod \"collect-profiles-29535375-txk7z\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.334933 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/923c05a2-b537-4cd4-9cc9-1484790d83f7-secret-volume\") pod \"collect-profiles-29535375-txk7z\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.435644 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w45t9\" (UniqueName: \"kubernetes.io/projected/923c05a2-b537-4cd4-9cc9-1484790d83f7-kube-api-access-w45t9\") pod \"collect-profiles-29535375-txk7z\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.436007 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/923c05a2-b537-4cd4-9cc9-1484790d83f7-config-volume\") pod \"collect-profiles-29535375-txk7z\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.436119 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/923c05a2-b537-4cd4-9cc9-1484790d83f7-secret-volume\") pod \"collect-profiles-29535375-txk7z\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.436757 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/923c05a2-b537-4cd4-9cc9-1484790d83f7-config-volume\") pod \"collect-profiles-29535375-txk7z\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.444922 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/923c05a2-b537-4cd4-9cc9-1484790d83f7-secret-volume\") pod \"collect-profiles-29535375-txk7z\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.463203 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w45t9\" (UniqueName: \"kubernetes.io/projected/923c05a2-b537-4cd4-9cc9-1484790d83f7-kube-api-access-w45t9\") pod \"collect-profiles-29535375-txk7z\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.518596 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:00 crc kubenswrapper[5033]: I0226 16:15:00.921186 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z"] Feb 26 16:15:01 crc kubenswrapper[5033]: I0226 16:15:01.217565 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" event={"ID":"923c05a2-b537-4cd4-9cc9-1484790d83f7","Type":"ContainerStarted","Data":"01459ab016970271b248fb04be3666dd2a642bc0a098c7a11761d0a06fc49566"} Feb 26 16:15:01 crc kubenswrapper[5033]: I0226 16:15:01.217613 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" event={"ID":"923c05a2-b537-4cd4-9cc9-1484790d83f7","Type":"ContainerStarted","Data":"0a3ce3e1c7cae5ff034b83d09400a59a26f330d9bbd983be5be276360399e996"} Feb 26 16:15:01 crc kubenswrapper[5033]: I0226 16:15:01.234529 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" podStartSLOduration=1.234508588 podStartE2EDuration="1.234508588s" podCreationTimestamp="2026-02-26 16:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:15:01.233750809 +0000 UTC m=+2028.446349469" watchObservedRunningTime="2026-02-26 16:15:01.234508588 +0000 UTC m=+2028.447107258" Feb 26 16:15:02 crc kubenswrapper[5033]: I0226 16:15:02.224814 5033 generic.go:334] "Generic (PLEG): container finished" podID="923c05a2-b537-4cd4-9cc9-1484790d83f7" containerID="01459ab016970271b248fb04be3666dd2a642bc0a098c7a11761d0a06fc49566" exitCode=0 Feb 26 16:15:02 crc kubenswrapper[5033]: I0226 16:15:02.224861 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" event={"ID":"923c05a2-b537-4cd4-9cc9-1484790d83f7","Type":"ContainerDied","Data":"01459ab016970271b248fb04be3666dd2a642bc0a098c7a11761d0a06fc49566"} Feb 26 16:15:03 crc kubenswrapper[5033]: I0226 16:15:03.483824 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:03 crc kubenswrapper[5033]: I0226 16:15:03.588031 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/923c05a2-b537-4cd4-9cc9-1484790d83f7-secret-volume\") pod \"923c05a2-b537-4cd4-9cc9-1484790d83f7\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " Feb 26 16:15:03 crc kubenswrapper[5033]: I0226 16:15:03.588094 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/923c05a2-b537-4cd4-9cc9-1484790d83f7-config-volume\") pod \"923c05a2-b537-4cd4-9cc9-1484790d83f7\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " Feb 26 16:15:03 crc kubenswrapper[5033]: I0226 16:15:03.588168 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w45t9\" (UniqueName: \"kubernetes.io/projected/923c05a2-b537-4cd4-9cc9-1484790d83f7-kube-api-access-w45t9\") pod \"923c05a2-b537-4cd4-9cc9-1484790d83f7\" (UID: \"923c05a2-b537-4cd4-9cc9-1484790d83f7\") " Feb 26 16:15:03 crc kubenswrapper[5033]: I0226 16:15:03.589298 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923c05a2-b537-4cd4-9cc9-1484790d83f7-config-volume" (OuterVolumeSpecName: "config-volume") pod "923c05a2-b537-4cd4-9cc9-1484790d83f7" (UID: "923c05a2-b537-4cd4-9cc9-1484790d83f7"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:15:03 crc kubenswrapper[5033]: I0226 16:15:03.594636 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/923c05a2-b537-4cd4-9cc9-1484790d83f7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "923c05a2-b537-4cd4-9cc9-1484790d83f7" (UID: "923c05a2-b537-4cd4-9cc9-1484790d83f7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:15:03 crc kubenswrapper[5033]: I0226 16:15:03.594989 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923c05a2-b537-4cd4-9cc9-1484790d83f7-kube-api-access-w45t9" (OuterVolumeSpecName: "kube-api-access-w45t9") pod "923c05a2-b537-4cd4-9cc9-1484790d83f7" (UID: "923c05a2-b537-4cd4-9cc9-1484790d83f7"). InnerVolumeSpecName "kube-api-access-w45t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:15:03 crc kubenswrapper[5033]: I0226 16:15:03.689374 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/923c05a2-b537-4cd4-9cc9-1484790d83f7-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:03 crc kubenswrapper[5033]: I0226 16:15:03.689424 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/923c05a2-b537-4cd4-9cc9-1484790d83f7-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:03 crc kubenswrapper[5033]: I0226 16:15:03.689436 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w45t9\" (UniqueName: \"kubernetes.io/projected/923c05a2-b537-4cd4-9cc9-1484790d83f7-kube-api-access-w45t9\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:04 crc kubenswrapper[5033]: I0226 16:15:04.238361 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" 
event={"ID":"923c05a2-b537-4cd4-9cc9-1484790d83f7","Type":"ContainerDied","Data":"0a3ce3e1c7cae5ff034b83d09400a59a26f330d9bbd983be5be276360399e996"} Feb 26 16:15:04 crc kubenswrapper[5033]: I0226 16:15:04.238397 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z" Feb 26 16:15:04 crc kubenswrapper[5033]: I0226 16:15:04.238411 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a3ce3e1c7cae5ff034b83d09400a59a26f330d9bbd983be5be276360399e996" Feb 26 16:15:04 crc kubenswrapper[5033]: I0226 16:15:04.314267 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w"] Feb 26 16:15:04 crc kubenswrapper[5033]: I0226 16:15:04.319073 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535330-57d7w"] Feb 26 16:15:05 crc kubenswrapper[5033]: I0226 16:15:05.638930 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74" path="/var/lib/kubelet/pods/344d7dbc-f4a6-4aca-bbe0-0a5cf0d9dc74/volumes" Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.471833 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-trbsn"] Feb 26 16:15:11 crc kubenswrapper[5033]: E0226 16:15:11.472756 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923c05a2-b537-4cd4-9cc9-1484790d83f7" containerName="collect-profiles" Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.472774 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="923c05a2-b537-4cd4-9cc9-1484790d83f7" containerName="collect-profiles" Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.472929 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="923c05a2-b537-4cd4-9cc9-1484790d83f7" containerName="collect-profiles" 
Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.474099 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.485168 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-trbsn"] Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.593180 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7hr5\" (UniqueName: \"kubernetes.io/projected/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-kube-api-access-j7hr5\") pod \"certified-operators-trbsn\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.593466 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-utilities\") pod \"certified-operators-trbsn\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.593528 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-catalog-content\") pod \"certified-operators-trbsn\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.695127 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-utilities\") pod \"certified-operators-trbsn\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " pod="openshift-marketplace/certified-operators-trbsn" 
Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.695186 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-catalog-content\") pod \"certified-operators-trbsn\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.695234 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7hr5\" (UniqueName: \"kubernetes.io/projected/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-kube-api-access-j7hr5\") pod \"certified-operators-trbsn\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.695751 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-utilities\") pod \"certified-operators-trbsn\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.695850 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-catalog-content\") pod \"certified-operators-trbsn\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:11 crc kubenswrapper[5033]: I0226 16:15:11.718935 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7hr5\" (UniqueName: \"kubernetes.io/projected/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-kube-api-access-j7hr5\") pod \"certified-operators-trbsn\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:11 crc 
kubenswrapper[5033]: I0226 16:15:11.836248 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:12 crc kubenswrapper[5033]: I0226 16:15:12.332461 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-trbsn"] Feb 26 16:15:13 crc kubenswrapper[5033]: I0226 16:15:13.294483 5033 generic.go:334] "Generic (PLEG): container finished" podID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" containerID="06de80c471fdc08fb25d89e7d520c694ce5f1bb1b961ea773536a9923e2c2c14" exitCode=0 Feb 26 16:15:13 crc kubenswrapper[5033]: I0226 16:15:13.294778 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trbsn" event={"ID":"d4a68061-6f8d-4799-81f4-f8d0e7983bf6","Type":"ContainerDied","Data":"06de80c471fdc08fb25d89e7d520c694ce5f1bb1b961ea773536a9923e2c2c14"} Feb 26 16:15:13 crc kubenswrapper[5033]: I0226 16:15:13.294825 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trbsn" event={"ID":"d4a68061-6f8d-4799-81f4-f8d0e7983bf6","Type":"ContainerStarted","Data":"fb95d9eb5b17bae79459e19f7c5505cf6aba3b5a092222fcaa6046dac217477f"} Feb 26 16:15:14 crc kubenswrapper[5033]: I0226 16:15:14.302814 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trbsn" event={"ID":"d4a68061-6f8d-4799-81f4-f8d0e7983bf6","Type":"ContainerStarted","Data":"9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3"} Feb 26 16:15:15 crc kubenswrapper[5033]: I0226 16:15:15.313006 5033 generic.go:334] "Generic (PLEG): container finished" podID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" containerID="9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3" exitCode=0 Feb 26 16:15:15 crc kubenswrapper[5033]: I0226 16:15:15.313133 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trbsn" 
event={"ID":"d4a68061-6f8d-4799-81f4-f8d0e7983bf6","Type":"ContainerDied","Data":"9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3"} Feb 26 16:15:16 crc kubenswrapper[5033]: I0226 16:15:16.323474 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trbsn" event={"ID":"d4a68061-6f8d-4799-81f4-f8d0e7983bf6","Type":"ContainerStarted","Data":"b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae"} Feb 26 16:15:16 crc kubenswrapper[5033]: I0226 16:15:16.343434 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-trbsn" podStartSLOduration=2.792352968 podStartE2EDuration="5.343401959s" podCreationTimestamp="2026-02-26 16:15:11 +0000 UTC" firstStartedPulling="2026-02-26 16:15:13.29669982 +0000 UTC m=+2040.509298470" lastFinishedPulling="2026-02-26 16:15:15.847748811 +0000 UTC m=+2043.060347461" observedRunningTime="2026-02-26 16:15:16.341307799 +0000 UTC m=+2043.553906499" watchObservedRunningTime="2026-02-26 16:15:16.343401959 +0000 UTC m=+2043.556000609" Feb 26 16:15:20 crc kubenswrapper[5033]: I0226 16:15:20.189199 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:15:20 crc kubenswrapper[5033]: I0226 16:15:20.190453 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:15:21 crc kubenswrapper[5033]: I0226 16:15:21.836916 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:21 crc kubenswrapper[5033]: I0226 16:15:21.837002 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:21 crc kubenswrapper[5033]: I0226 16:15:21.885497 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.414087 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.525065 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vs8v8"] Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.526729 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.541128 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vs8v8"] Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.650702 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkzmt\" (UniqueName: \"kubernetes.io/projected/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-kube-api-access-kkzmt\") pod \"redhat-operators-vs8v8\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.650822 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-utilities\") pod \"redhat-operators-vs8v8\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:22 crc 
kubenswrapper[5033]: I0226 16:15:22.650847 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-catalog-content\") pod \"redhat-operators-vs8v8\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.751714 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkzmt\" (UniqueName: \"kubernetes.io/projected/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-kube-api-access-kkzmt\") pod \"redhat-operators-vs8v8\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.751838 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-utilities\") pod \"redhat-operators-vs8v8\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.751867 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-catalog-content\") pod \"redhat-operators-vs8v8\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.752529 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-utilities\") pod \"redhat-operators-vs8v8\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.752650 
5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-catalog-content\") pod \"redhat-operators-vs8v8\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.774643 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkzmt\" (UniqueName: \"kubernetes.io/projected/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-kube-api-access-kkzmt\") pod \"redhat-operators-vs8v8\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:22 crc kubenswrapper[5033]: I0226 16:15:22.851223 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:23 crc kubenswrapper[5033]: I0226 16:15:23.297982 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vs8v8"] Feb 26 16:15:23 crc kubenswrapper[5033]: I0226 16:15:23.376284 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vs8v8" event={"ID":"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc","Type":"ContainerStarted","Data":"d75a3f7a7a29f56b54994f7fd910bd267c46eddd1aff2dc44cfd449dd24c2ec9"} Feb 26 16:15:24 crc kubenswrapper[5033]: I0226 16:15:24.389447 5033 generic.go:334] "Generic (PLEG): container finished" podID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" containerID="9c58a3141922982ab73e6a081b3005e4d4ffb39cf745b674663631451fbf1925" exitCode=0 Feb 26 16:15:24 crc kubenswrapper[5033]: I0226 16:15:24.389533 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vs8v8" event={"ID":"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc","Type":"ContainerDied","Data":"9c58a3141922982ab73e6a081b3005e4d4ffb39cf745b674663631451fbf1925"} Feb 26 16:15:24 crc 
kubenswrapper[5033]: I0226 16:15:24.718690 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-trbsn"] Feb 26 16:15:24 crc kubenswrapper[5033]: I0226 16:15:24.718921 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-trbsn" podUID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" containerName="registry-server" containerID="cri-o://b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae" gracePeriod=2 Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.078429 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.189416 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7hr5\" (UniqueName: \"kubernetes.io/projected/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-kube-api-access-j7hr5\") pod \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.189581 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-catalog-content\") pod \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.189619 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-utilities\") pod \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\" (UID: \"d4a68061-6f8d-4799-81f4-f8d0e7983bf6\") " Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.190807 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-utilities" (OuterVolumeSpecName: "utilities") pod "d4a68061-6f8d-4799-81f4-f8d0e7983bf6" (UID: "d4a68061-6f8d-4799-81f4-f8d0e7983bf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.194660 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-kube-api-access-j7hr5" (OuterVolumeSpecName: "kube-api-access-j7hr5") pod "d4a68061-6f8d-4799-81f4-f8d0e7983bf6" (UID: "d4a68061-6f8d-4799-81f4-f8d0e7983bf6"). InnerVolumeSpecName "kube-api-access-j7hr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.292524 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.292585 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7hr5\" (UniqueName: \"kubernetes.io/projected/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-kube-api-access-j7hr5\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.408576 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-trbsn" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.408752 5033 generic.go:334] "Generic (PLEG): container finished" podID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" containerID="b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae" exitCode=0 Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.408798 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trbsn" event={"ID":"d4a68061-6f8d-4799-81f4-f8d0e7983bf6","Type":"ContainerDied","Data":"b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae"} Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.408961 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-trbsn" event={"ID":"d4a68061-6f8d-4799-81f4-f8d0e7983bf6","Type":"ContainerDied","Data":"fb95d9eb5b17bae79459e19f7c5505cf6aba3b5a092222fcaa6046dac217477f"} Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.409120 5033 scope.go:117] "RemoveContainer" containerID="b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.482488 5033 scope.go:117] "RemoveContainer" containerID="9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.511437 5033 scope.go:117] "RemoveContainer" containerID="06de80c471fdc08fb25d89e7d520c694ce5f1bb1b961ea773536a9923e2c2c14" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.544448 5033 scope.go:117] "RemoveContainer" containerID="b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae" Feb 26 16:15:25 crc kubenswrapper[5033]: E0226 16:15:25.545093 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae\": container with ID starting with 
b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae not found: ID does not exist" containerID="b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.545130 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae"} err="failed to get container status \"b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae\": rpc error: code = NotFound desc = could not find container \"b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae\": container with ID starting with b510792d36e12987b7d729d5255e3d9fef8af50de24654621a6e5e9aa7135dae not found: ID does not exist" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.545157 5033 scope.go:117] "RemoveContainer" containerID="9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3" Feb 26 16:15:25 crc kubenswrapper[5033]: E0226 16:15:25.545498 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3\": container with ID starting with 9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3 not found: ID does not exist" containerID="9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.545547 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3"} err="failed to get container status \"9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3\": rpc error: code = NotFound desc = could not find container \"9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3\": container with ID starting with 9b620d76f92f177a08715e6ce9b938200f8ea297eac07d65b3cf223d2b7e7af3 not found: ID does not 
exist" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.545578 5033 scope.go:117] "RemoveContainer" containerID="06de80c471fdc08fb25d89e7d520c694ce5f1bb1b961ea773536a9923e2c2c14" Feb 26 16:15:25 crc kubenswrapper[5033]: E0226 16:15:25.545988 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06de80c471fdc08fb25d89e7d520c694ce5f1bb1b961ea773536a9923e2c2c14\": container with ID starting with 06de80c471fdc08fb25d89e7d520c694ce5f1bb1b961ea773536a9923e2c2c14 not found: ID does not exist" containerID="06de80c471fdc08fb25d89e7d520c694ce5f1bb1b961ea773536a9923e2c2c14" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.546039 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06de80c471fdc08fb25d89e7d520c694ce5f1bb1b961ea773536a9923e2c2c14"} err="failed to get container status \"06de80c471fdc08fb25d89e7d520c694ce5f1bb1b961ea773536a9923e2c2c14\": rpc error: code = NotFound desc = could not find container \"06de80c471fdc08fb25d89e7d520c694ce5f1bb1b961ea773536a9923e2c2c14\": container with ID starting with 06de80c471fdc08fb25d89e7d520c694ce5f1bb1b961ea773536a9923e2c2c14 not found: ID does not exist" Feb 26 16:15:25 crc kubenswrapper[5033]: I0226 16:15:25.907944 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4a68061-6f8d-4799-81f4-f8d0e7983bf6" (UID: "d4a68061-6f8d-4799-81f4-f8d0e7983bf6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:15:26 crc kubenswrapper[5033]: I0226 16:15:26.002522 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a68061-6f8d-4799-81f4-f8d0e7983bf6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:26 crc kubenswrapper[5033]: I0226 16:15:26.046764 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-trbsn"] Feb 26 16:15:26 crc kubenswrapper[5033]: I0226 16:15:26.052693 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-trbsn"] Feb 26 16:15:26 crc kubenswrapper[5033]: I0226 16:15:26.419825 5033 generic.go:334] "Generic (PLEG): container finished" podID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" containerID="f927a5089a17c5f2b82d1080001b556e3a818a2f8c598e76efcf2b9bb1f22aba" exitCode=0 Feb 26 16:15:26 crc kubenswrapper[5033]: I0226 16:15:26.419866 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vs8v8" event={"ID":"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc","Type":"ContainerDied","Data":"f927a5089a17c5f2b82d1080001b556e3a818a2f8c598e76efcf2b9bb1f22aba"} Feb 26 16:15:27 crc kubenswrapper[5033]: I0226 16:15:27.430143 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vs8v8" event={"ID":"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc","Type":"ContainerStarted","Data":"5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db"} Feb 26 16:15:27 crc kubenswrapper[5033]: I0226 16:15:27.447190 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vs8v8" podStartSLOduration=3.021529566 podStartE2EDuration="5.4471715s" podCreationTimestamp="2026-02-26 16:15:22 +0000 UTC" firstStartedPulling="2026-02-26 16:15:24.391099068 +0000 UTC m=+2051.603697738" lastFinishedPulling="2026-02-26 
16:15:26.816741002 +0000 UTC m=+2054.029339672" observedRunningTime="2026-02-26 16:15:27.444595169 +0000 UTC m=+2054.657193829" watchObservedRunningTime="2026-02-26 16:15:27.4471715 +0000 UTC m=+2054.659770150" Feb 26 16:15:27 crc kubenswrapper[5033]: I0226 16:15:27.636937 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" path="/var/lib/kubelet/pods/d4a68061-6f8d-4799-81f4-f8d0e7983bf6/volumes" Feb 26 16:15:29 crc kubenswrapper[5033]: I0226 16:15:29.855164 5033 scope.go:117] "RemoveContainer" containerID="321ba68642c60ac3c778428e35e4e851d36924d7675f9116bfc94d29f1047d9e" Feb 26 16:15:32 crc kubenswrapper[5033]: I0226 16:15:32.851563 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:32 crc kubenswrapper[5033]: I0226 16:15:32.851830 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:32 crc kubenswrapper[5033]: I0226 16:15:32.893547 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:33 crc kubenswrapper[5033]: I0226 16:15:33.520349 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:34 crc kubenswrapper[5033]: I0226 16:15:34.341329 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vs8v8"] Feb 26 16:15:35 crc kubenswrapper[5033]: I0226 16:15:35.480603 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vs8v8" podUID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" containerName="registry-server" containerID="cri-o://5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db" gracePeriod=2 Feb 26 16:15:35 crc kubenswrapper[5033]: I0226 16:15:35.831960 5033 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:35 crc kubenswrapper[5033]: I0226 16:15:35.937087 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-utilities\") pod \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " Feb 26 16:15:35 crc kubenswrapper[5033]: I0226 16:15:35.937474 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkzmt\" (UniqueName: \"kubernetes.io/projected/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-kube-api-access-kkzmt\") pod \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " Feb 26 16:15:35 crc kubenswrapper[5033]: I0226 16:15:35.937512 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-catalog-content\") pod \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\" (UID: \"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc\") " Feb 26 16:15:35 crc kubenswrapper[5033]: I0226 16:15:35.937977 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-utilities" (OuterVolumeSpecName: "utilities") pod "770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" (UID: "770e75fe-c54a-4a9b-9ba7-1b49d088bbfc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:15:35 crc kubenswrapper[5033]: I0226 16:15:35.943592 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-kube-api-access-kkzmt" (OuterVolumeSpecName: "kube-api-access-kkzmt") pod "770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" (UID: "770e75fe-c54a-4a9b-9ba7-1b49d088bbfc"). 
InnerVolumeSpecName "kube-api-access-kkzmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.039082 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkzmt\" (UniqueName: \"kubernetes.io/projected/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-kube-api-access-kkzmt\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.039128 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.074188 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" (UID: "770e75fe-c54a-4a9b-9ba7-1b49d088bbfc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.139893 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.491313 5033 generic.go:334] "Generic (PLEG): container finished" podID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" containerID="5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db" exitCode=0 Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.491357 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vs8v8" event={"ID":"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc","Type":"ContainerDied","Data":"5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db"} Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.491392 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vs8v8" event={"ID":"770e75fe-c54a-4a9b-9ba7-1b49d088bbfc","Type":"ContainerDied","Data":"d75a3f7a7a29f56b54994f7fd910bd267c46eddd1aff2dc44cfd449dd24c2ec9"} Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.491403 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vs8v8" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.491413 5033 scope.go:117] "RemoveContainer" containerID="5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.511299 5033 scope.go:117] "RemoveContainer" containerID="f927a5089a17c5f2b82d1080001b556e3a818a2f8c598e76efcf2b9bb1f22aba" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.532529 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vs8v8"] Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.538450 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vs8v8"] Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.546647 5033 scope.go:117] "RemoveContainer" containerID="9c58a3141922982ab73e6a081b3005e4d4ffb39cf745b674663631451fbf1925" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.566337 5033 scope.go:117] "RemoveContainer" containerID="5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db" Feb 26 16:15:36 crc kubenswrapper[5033]: E0226 16:15:36.566951 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db\": container with ID starting with 5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db not found: ID does not exist" containerID="5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.566993 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db"} err="failed to get container status \"5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db\": rpc error: code = NotFound desc = could not find container 
\"5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db\": container with ID starting with 5491970bbf33b18068a9d7d469e998557ea1ac17390f7ffd0c92d3c35a4d41db not found: ID does not exist" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.567019 5033 scope.go:117] "RemoveContainer" containerID="f927a5089a17c5f2b82d1080001b556e3a818a2f8c598e76efcf2b9bb1f22aba" Feb 26 16:15:36 crc kubenswrapper[5033]: E0226 16:15:36.567327 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f927a5089a17c5f2b82d1080001b556e3a818a2f8c598e76efcf2b9bb1f22aba\": container with ID starting with f927a5089a17c5f2b82d1080001b556e3a818a2f8c598e76efcf2b9bb1f22aba not found: ID does not exist" containerID="f927a5089a17c5f2b82d1080001b556e3a818a2f8c598e76efcf2b9bb1f22aba" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.567364 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f927a5089a17c5f2b82d1080001b556e3a818a2f8c598e76efcf2b9bb1f22aba"} err="failed to get container status \"f927a5089a17c5f2b82d1080001b556e3a818a2f8c598e76efcf2b9bb1f22aba\": rpc error: code = NotFound desc = could not find container \"f927a5089a17c5f2b82d1080001b556e3a818a2f8c598e76efcf2b9bb1f22aba\": container with ID starting with f927a5089a17c5f2b82d1080001b556e3a818a2f8c598e76efcf2b9bb1f22aba not found: ID does not exist" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.567387 5033 scope.go:117] "RemoveContainer" containerID="9c58a3141922982ab73e6a081b3005e4d4ffb39cf745b674663631451fbf1925" Feb 26 16:15:36 crc kubenswrapper[5033]: E0226 16:15:36.567754 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c58a3141922982ab73e6a081b3005e4d4ffb39cf745b674663631451fbf1925\": container with ID starting with 9c58a3141922982ab73e6a081b3005e4d4ffb39cf745b674663631451fbf1925 not found: ID does not exist" 
containerID="9c58a3141922982ab73e6a081b3005e4d4ffb39cf745b674663631451fbf1925" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.567781 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c58a3141922982ab73e6a081b3005e4d4ffb39cf745b674663631451fbf1925"} err="failed to get container status \"9c58a3141922982ab73e6a081b3005e4d4ffb39cf745b674663631451fbf1925\": rpc error: code = NotFound desc = could not find container \"9c58a3141922982ab73e6a081b3005e4d4ffb39cf745b674663631451fbf1925\": container with ID starting with 9c58a3141922982ab73e6a081b3005e4d4ffb39cf745b674663631451fbf1925 not found: ID does not exist" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.947084 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jtvk2"] Feb 26 16:15:36 crc kubenswrapper[5033]: E0226 16:15:36.947426 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" containerName="registry-server" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.947440 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" containerName="registry-server" Feb 26 16:15:36 crc kubenswrapper[5033]: E0226 16:15:36.947449 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" containerName="extract-content" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.947454 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" containerName="extract-content" Feb 26 16:15:36 crc kubenswrapper[5033]: E0226 16:15:36.947464 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" containerName="registry-server" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.947471 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" 
containerName="registry-server" Feb 26 16:15:36 crc kubenswrapper[5033]: E0226 16:15:36.947500 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" containerName="extract-utilities" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.947507 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" containerName="extract-utilities" Feb 26 16:15:36 crc kubenswrapper[5033]: E0226 16:15:36.947514 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" containerName="extract-utilities" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.947521 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" containerName="extract-utilities" Feb 26 16:15:36 crc kubenswrapper[5033]: E0226 16:15:36.947530 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" containerName="extract-content" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.947536 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" containerName="extract-content" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.947672 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" containerName="registry-server" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.947686 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a68061-6f8d-4799-81f4-f8d0e7983bf6" containerName="registry-server" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.948639 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:36 crc kubenswrapper[5033]: I0226 16:15:36.957128 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtvk2"] Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.051068 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42w52\" (UniqueName: \"kubernetes.io/projected/334a31d3-8525-46fd-9894-da382ab9c6b8-kube-api-access-42w52\") pod \"redhat-marketplace-jtvk2\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.051116 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-catalog-content\") pod \"redhat-marketplace-jtvk2\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.051149 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-utilities\") pod \"redhat-marketplace-jtvk2\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.152386 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42w52\" (UniqueName: \"kubernetes.io/projected/334a31d3-8525-46fd-9894-da382ab9c6b8-kube-api-access-42w52\") pod \"redhat-marketplace-jtvk2\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.152446 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-catalog-content\") pod \"redhat-marketplace-jtvk2\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.152475 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-utilities\") pod \"redhat-marketplace-jtvk2\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.153037 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-utilities\") pod \"redhat-marketplace-jtvk2\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.153236 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-catalog-content\") pod \"redhat-marketplace-jtvk2\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.170480 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42w52\" (UniqueName: \"kubernetes.io/projected/334a31d3-8525-46fd-9894-da382ab9c6b8-kube-api-access-42w52\") pod \"redhat-marketplace-jtvk2\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.270554 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.640725 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="770e75fe-c54a-4a9b-9ba7-1b49d088bbfc" path="/var/lib/kubelet/pods/770e75fe-c54a-4a9b-9ba7-1b49d088bbfc/volumes" Feb 26 16:15:37 crc kubenswrapper[5033]: I0226 16:15:37.821990 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtvk2"] Feb 26 16:15:39 crc kubenswrapper[5033]: I0226 16:15:39.340478 5033 generic.go:334] "Generic (PLEG): container finished" podID="334a31d3-8525-46fd-9894-da382ab9c6b8" containerID="378ca5e122dacf96bd432665a72f8c15e8456b83e499ea7d7617541a0c57b143" exitCode=0 Feb 26 16:15:39 crc kubenswrapper[5033]: I0226 16:15:39.340547 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtvk2" event={"ID":"334a31d3-8525-46fd-9894-da382ab9c6b8","Type":"ContainerDied","Data":"378ca5e122dacf96bd432665a72f8c15e8456b83e499ea7d7617541a0c57b143"} Feb 26 16:15:39 crc kubenswrapper[5033]: I0226 16:15:39.340589 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtvk2" event={"ID":"334a31d3-8525-46fd-9894-da382ab9c6b8","Type":"ContainerStarted","Data":"feeab43600e855f1c5a418b94c023f95e0e33cdc204b69e83d209c6c67772fb8"} Feb 26 16:15:40 crc kubenswrapper[5033]: I0226 16:15:40.348607 5033 generic.go:334] "Generic (PLEG): container finished" podID="334a31d3-8525-46fd-9894-da382ab9c6b8" containerID="522cb548f25aed9b48704f5d653dd2af2f8ab05464c08b9874cc5247494c0834" exitCode=0 Feb 26 16:15:40 crc kubenswrapper[5033]: I0226 16:15:40.348658 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtvk2" event={"ID":"334a31d3-8525-46fd-9894-da382ab9c6b8","Type":"ContainerDied","Data":"522cb548f25aed9b48704f5d653dd2af2f8ab05464c08b9874cc5247494c0834"} Feb 26 16:15:41 crc 
kubenswrapper[5033]: I0226 16:15:41.358295 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtvk2" event={"ID":"334a31d3-8525-46fd-9894-da382ab9c6b8","Type":"ContainerStarted","Data":"3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb"} Feb 26 16:15:41 crc kubenswrapper[5033]: I0226 16:15:41.381451 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jtvk2" podStartSLOduration=3.958076097 podStartE2EDuration="5.381431474s" podCreationTimestamp="2026-02-26 16:15:36 +0000 UTC" firstStartedPulling="2026-02-26 16:15:39.342753421 +0000 UTC m=+2066.555352071" lastFinishedPulling="2026-02-26 16:15:40.766108778 +0000 UTC m=+2067.978707448" observedRunningTime="2026-02-26 16:15:41.37707597 +0000 UTC m=+2068.589674630" watchObservedRunningTime="2026-02-26 16:15:41.381431474 +0000 UTC m=+2068.594030124" Feb 26 16:15:47 crc kubenswrapper[5033]: I0226 16:15:47.271694 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:47 crc kubenswrapper[5033]: I0226 16:15:47.272065 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:47 crc kubenswrapper[5033]: I0226 16:15:47.330810 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:47 crc kubenswrapper[5033]: I0226 16:15:47.462670 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:47 crc kubenswrapper[5033]: I0226 16:15:47.566507 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtvk2"] Feb 26 16:15:49 crc kubenswrapper[5033]: I0226 16:15:49.426950 5033 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-marketplace-jtvk2" podUID="334a31d3-8525-46fd-9894-da382ab9c6b8" containerName="registry-server" containerID="cri-o://3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb" gracePeriod=2 Feb 26 16:15:49 crc kubenswrapper[5033]: I0226 16:15:49.790631 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:49 crc kubenswrapper[5033]: I0226 16:15:49.860708 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-utilities\") pod \"334a31d3-8525-46fd-9894-da382ab9c6b8\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " Feb 26 16:15:49 crc kubenswrapper[5033]: I0226 16:15:49.860772 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-catalog-content\") pod \"334a31d3-8525-46fd-9894-da382ab9c6b8\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " Feb 26 16:15:49 crc kubenswrapper[5033]: I0226 16:15:49.860845 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42w52\" (UniqueName: \"kubernetes.io/projected/334a31d3-8525-46fd-9894-da382ab9c6b8-kube-api-access-42w52\") pod \"334a31d3-8525-46fd-9894-da382ab9c6b8\" (UID: \"334a31d3-8525-46fd-9894-da382ab9c6b8\") " Feb 26 16:15:49 crc kubenswrapper[5033]: I0226 16:15:49.861813 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-utilities" (OuterVolumeSpecName: "utilities") pod "334a31d3-8525-46fd-9894-da382ab9c6b8" (UID: "334a31d3-8525-46fd-9894-da382ab9c6b8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:15:49 crc kubenswrapper[5033]: I0226 16:15:49.867030 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/334a31d3-8525-46fd-9894-da382ab9c6b8-kube-api-access-42w52" (OuterVolumeSpecName: "kube-api-access-42w52") pod "334a31d3-8525-46fd-9894-da382ab9c6b8" (UID: "334a31d3-8525-46fd-9894-da382ab9c6b8"). InnerVolumeSpecName "kube-api-access-42w52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:15:49 crc kubenswrapper[5033]: I0226 16:15:49.890416 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "334a31d3-8525-46fd-9894-da382ab9c6b8" (UID: "334a31d3-8525-46fd-9894-da382ab9c6b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:15:49 crc kubenswrapper[5033]: I0226 16:15:49.961960 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:49 crc kubenswrapper[5033]: I0226 16:15:49.961999 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334a31d3-8525-46fd-9894-da382ab9c6b8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:49 crc kubenswrapper[5033]: I0226 16:15:49.962014 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42w52\" (UniqueName: \"kubernetes.io/projected/334a31d3-8525-46fd-9894-da382ab9c6b8-kube-api-access-42w52\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.189072 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.189146 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.436156 5033 generic.go:334] "Generic (PLEG): container finished" podID="334a31d3-8525-46fd-9894-da382ab9c6b8" containerID="3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb" exitCode=0 Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.436206 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtvk2" event={"ID":"334a31d3-8525-46fd-9894-da382ab9c6b8","Type":"ContainerDied","Data":"3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb"} Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.436214 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtvk2" Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.436240 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtvk2" event={"ID":"334a31d3-8525-46fd-9894-da382ab9c6b8","Type":"ContainerDied","Data":"feeab43600e855f1c5a418b94c023f95e0e33cdc204b69e83d209c6c67772fb8"} Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.436289 5033 scope.go:117] "RemoveContainer" containerID="3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb" Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.454715 5033 scope.go:117] "RemoveContainer" containerID="522cb548f25aed9b48704f5d653dd2af2f8ab05464c08b9874cc5247494c0834" Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.468603 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtvk2"] Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.474371 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtvk2"] Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.476417 5033 scope.go:117] "RemoveContainer" containerID="378ca5e122dacf96bd432665a72f8c15e8456b83e499ea7d7617541a0c57b143" Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.502804 5033 scope.go:117] "RemoveContainer" containerID="3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb" Feb 26 16:15:50 crc kubenswrapper[5033]: E0226 16:15:50.503238 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb\": container with ID starting with 3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb not found: ID does not exist" containerID="3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb" Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.503298 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb"} err="failed to get container status \"3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb\": rpc error: code = NotFound desc = could not find container \"3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb\": container with ID starting with 3aa05be0af67a05f6ef232231a656771489960802c022035773080466a01ddfb not found: ID does not exist" Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.503318 5033 scope.go:117] "RemoveContainer" containerID="522cb548f25aed9b48704f5d653dd2af2f8ab05464c08b9874cc5247494c0834" Feb 26 16:15:50 crc kubenswrapper[5033]: E0226 16:15:50.503810 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"522cb548f25aed9b48704f5d653dd2af2f8ab05464c08b9874cc5247494c0834\": container with ID starting with 522cb548f25aed9b48704f5d653dd2af2f8ab05464c08b9874cc5247494c0834 not found: ID does not exist" containerID="522cb548f25aed9b48704f5d653dd2af2f8ab05464c08b9874cc5247494c0834" Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.503862 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"522cb548f25aed9b48704f5d653dd2af2f8ab05464c08b9874cc5247494c0834"} err="failed to get container status \"522cb548f25aed9b48704f5d653dd2af2f8ab05464c08b9874cc5247494c0834\": rpc error: code = NotFound desc = could not find container \"522cb548f25aed9b48704f5d653dd2af2f8ab05464c08b9874cc5247494c0834\": container with ID starting with 522cb548f25aed9b48704f5d653dd2af2f8ab05464c08b9874cc5247494c0834 not found: ID does not exist" Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.503882 5033 scope.go:117] "RemoveContainer" containerID="378ca5e122dacf96bd432665a72f8c15e8456b83e499ea7d7617541a0c57b143" Feb 26 16:15:50 crc kubenswrapper[5033]: E0226 
16:15:50.504509 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"378ca5e122dacf96bd432665a72f8c15e8456b83e499ea7d7617541a0c57b143\": container with ID starting with 378ca5e122dacf96bd432665a72f8c15e8456b83e499ea7d7617541a0c57b143 not found: ID does not exist" containerID="378ca5e122dacf96bd432665a72f8c15e8456b83e499ea7d7617541a0c57b143" Feb 26 16:15:50 crc kubenswrapper[5033]: I0226 16:15:50.504548 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"378ca5e122dacf96bd432665a72f8c15e8456b83e499ea7d7617541a0c57b143"} err="failed to get container status \"378ca5e122dacf96bd432665a72f8c15e8456b83e499ea7d7617541a0c57b143\": rpc error: code = NotFound desc = could not find container \"378ca5e122dacf96bd432665a72f8c15e8456b83e499ea7d7617541a0c57b143\": container with ID starting with 378ca5e122dacf96bd432665a72f8c15e8456b83e499ea7d7617541a0c57b143 not found: ID does not exist" Feb 26 16:15:51 crc kubenswrapper[5033]: I0226 16:15:51.656186 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="334a31d3-8525-46fd-9894-da382ab9c6b8" path="/var/lib/kubelet/pods/334a31d3-8525-46fd-9894-da382ab9c6b8/volumes" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.138298 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535376-pn9vv"] Feb 26 16:16:00 crc kubenswrapper[5033]: E0226 16:16:00.139222 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334a31d3-8525-46fd-9894-da382ab9c6b8" containerName="registry-server" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.139238 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="334a31d3-8525-46fd-9894-da382ab9c6b8" containerName="registry-server" Feb 26 16:16:00 crc kubenswrapper[5033]: E0226 16:16:00.139270 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="334a31d3-8525-46fd-9894-da382ab9c6b8" containerName="extract-utilities" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.139280 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="334a31d3-8525-46fd-9894-da382ab9c6b8" containerName="extract-utilities" Feb 26 16:16:00 crc kubenswrapper[5033]: E0226 16:16:00.139294 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334a31d3-8525-46fd-9894-da382ab9c6b8" containerName="extract-content" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.139302 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="334a31d3-8525-46fd-9894-da382ab9c6b8" containerName="extract-content" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.139491 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="334a31d3-8525-46fd-9894-da382ab9c6b8" containerName="registry-server" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.140040 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535376-pn9vv" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.145688 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.145805 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.145864 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.161597 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535376-pn9vv"] Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.197228 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dp6m\" (UniqueName: 
\"kubernetes.io/projected/4449adb6-eeb2-4ccb-99b8-599561ac2534-kube-api-access-5dp6m\") pod \"auto-csr-approver-29535376-pn9vv\" (UID: \"4449adb6-eeb2-4ccb-99b8-599561ac2534\") " pod="openshift-infra/auto-csr-approver-29535376-pn9vv" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.299449 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dp6m\" (UniqueName: \"kubernetes.io/projected/4449adb6-eeb2-4ccb-99b8-599561ac2534-kube-api-access-5dp6m\") pod \"auto-csr-approver-29535376-pn9vv\" (UID: \"4449adb6-eeb2-4ccb-99b8-599561ac2534\") " pod="openshift-infra/auto-csr-approver-29535376-pn9vv" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.332004 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dp6m\" (UniqueName: \"kubernetes.io/projected/4449adb6-eeb2-4ccb-99b8-599561ac2534-kube-api-access-5dp6m\") pod \"auto-csr-approver-29535376-pn9vv\" (UID: \"4449adb6-eeb2-4ccb-99b8-599561ac2534\") " pod="openshift-infra/auto-csr-approver-29535376-pn9vv" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.459630 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535376-pn9vv" Feb 26 16:16:00 crc kubenswrapper[5033]: I0226 16:16:00.877514 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535376-pn9vv"] Feb 26 16:16:01 crc kubenswrapper[5033]: I0226 16:16:01.530093 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535376-pn9vv" event={"ID":"4449adb6-eeb2-4ccb-99b8-599561ac2534","Type":"ContainerStarted","Data":"f929978c6b0cb4add8bb550f97bebe344693d4aa9b65c540801cf8f66f01b7e6"} Feb 26 16:16:02 crc kubenswrapper[5033]: I0226 16:16:02.538361 5033 generic.go:334] "Generic (PLEG): container finished" podID="4449adb6-eeb2-4ccb-99b8-599561ac2534" containerID="2a214dab3830fc595d0c6d62dcce9a09c62f5281bd1f71ec01ed9a65f4a716d1" exitCode=0 Feb 26 16:16:02 crc kubenswrapper[5033]: I0226 16:16:02.538404 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535376-pn9vv" event={"ID":"4449adb6-eeb2-4ccb-99b8-599561ac2534","Type":"ContainerDied","Data":"2a214dab3830fc595d0c6d62dcce9a09c62f5281bd1f71ec01ed9a65f4a716d1"} Feb 26 16:16:03 crc kubenswrapper[5033]: I0226 16:16:03.827893 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535376-pn9vv" Feb 26 16:16:03 crc kubenswrapper[5033]: I0226 16:16:03.954169 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dp6m\" (UniqueName: \"kubernetes.io/projected/4449adb6-eeb2-4ccb-99b8-599561ac2534-kube-api-access-5dp6m\") pod \"4449adb6-eeb2-4ccb-99b8-599561ac2534\" (UID: \"4449adb6-eeb2-4ccb-99b8-599561ac2534\") " Feb 26 16:16:03 crc kubenswrapper[5033]: I0226 16:16:03.959465 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4449adb6-eeb2-4ccb-99b8-599561ac2534-kube-api-access-5dp6m" (OuterVolumeSpecName: "kube-api-access-5dp6m") pod "4449adb6-eeb2-4ccb-99b8-599561ac2534" (UID: "4449adb6-eeb2-4ccb-99b8-599561ac2534"). InnerVolumeSpecName "kube-api-access-5dp6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:16:04 crc kubenswrapper[5033]: I0226 16:16:04.055957 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dp6m\" (UniqueName: \"kubernetes.io/projected/4449adb6-eeb2-4ccb-99b8-599561ac2534-kube-api-access-5dp6m\") on node \"crc\" DevicePath \"\"" Feb 26 16:16:04 crc kubenswrapper[5033]: I0226 16:16:04.552393 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535376-pn9vv" event={"ID":"4449adb6-eeb2-4ccb-99b8-599561ac2534","Type":"ContainerDied","Data":"f929978c6b0cb4add8bb550f97bebe344693d4aa9b65c540801cf8f66f01b7e6"} Feb 26 16:16:04 crc kubenswrapper[5033]: I0226 16:16:04.553083 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f929978c6b0cb4add8bb550f97bebe344693d4aa9b65c540801cf8f66f01b7e6" Feb 26 16:16:04 crc kubenswrapper[5033]: I0226 16:16:04.552462 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535376-pn9vv" Feb 26 16:16:04 crc kubenswrapper[5033]: I0226 16:16:04.888159 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535370-zdqzm"] Feb 26 16:16:04 crc kubenswrapper[5033]: I0226 16:16:04.895741 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535370-zdqzm"] Feb 26 16:16:05 crc kubenswrapper[5033]: I0226 16:16:05.637002 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b59b936-69de-47eb-b624-06122f76e5fd" path="/var/lib/kubelet/pods/6b59b936-69de-47eb-b624-06122f76e5fd/volumes" Feb 26 16:16:20 crc kubenswrapper[5033]: I0226 16:16:20.188584 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:16:20 crc kubenswrapper[5033]: I0226 16:16:20.189128 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:16:20 crc kubenswrapper[5033]: I0226 16:16:20.189189 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 16:16:20 crc kubenswrapper[5033]: I0226 16:16:20.189957 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f4ff328ef64b64bd90cfbfbff26eb9b2edc5412a004940ff047398b5e904341"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:16:20 crc kubenswrapper[5033]: I0226 16:16:20.190030 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://3f4ff328ef64b64bd90cfbfbff26eb9b2edc5412a004940ff047398b5e904341" gracePeriod=600 Feb 26 16:16:20 crc kubenswrapper[5033]: I0226 16:16:20.680425 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="3f4ff328ef64b64bd90cfbfbff26eb9b2edc5412a004940ff047398b5e904341" exitCode=0 Feb 26 16:16:20 crc kubenswrapper[5033]: I0226 16:16:20.680472 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"3f4ff328ef64b64bd90cfbfbff26eb9b2edc5412a004940ff047398b5e904341"} Feb 26 16:16:20 crc kubenswrapper[5033]: I0226 16:16:20.680853 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"} Feb 26 16:16:20 crc kubenswrapper[5033]: I0226 16:16:20.680875 5033 scope.go:117] "RemoveContainer" containerID="b58d62f27888eb261cccf1b5a821d616f9b4f53f44254dc988c91360c634d465" Feb 26 16:16:29 crc kubenswrapper[5033]: I0226 16:16:29.914200 5033 scope.go:117] "RemoveContainer" containerID="722ea520796dc321577bb30d9984cccb411f46bdfc3ead546cfba659036b1456" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.165681 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535378-d5m5k"] Feb 26 16:18:00 crc kubenswrapper[5033]: E0226 
16:18:00.167842 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4449adb6-eeb2-4ccb-99b8-599561ac2534" containerName="oc" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.167924 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4449adb6-eeb2-4ccb-99b8-599561ac2534" containerName="oc" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.168478 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4449adb6-eeb2-4ccb-99b8-599561ac2534" containerName="oc" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.170316 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535378-d5m5k" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.177091 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535378-d5m5k"] Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.202483 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.202879 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.203189 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.205987 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcl2h\" (UniqueName: \"kubernetes.io/projected/57c69b9b-6398-4fa9-9563-ba88cc62665f-kube-api-access-xcl2h\") pod \"auto-csr-approver-29535378-d5m5k\" (UID: \"57c69b9b-6398-4fa9-9563-ba88cc62665f\") " pod="openshift-infra/auto-csr-approver-29535378-d5m5k" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.306956 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xcl2h\" (UniqueName: \"kubernetes.io/projected/57c69b9b-6398-4fa9-9563-ba88cc62665f-kube-api-access-xcl2h\") pod \"auto-csr-approver-29535378-d5m5k\" (UID: \"57c69b9b-6398-4fa9-9563-ba88cc62665f\") " pod="openshift-infra/auto-csr-approver-29535378-d5m5k" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.331882 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcl2h\" (UniqueName: \"kubernetes.io/projected/57c69b9b-6398-4fa9-9563-ba88cc62665f-kube-api-access-xcl2h\") pod \"auto-csr-approver-29535378-d5m5k\" (UID: \"57c69b9b-6398-4fa9-9563-ba88cc62665f\") " pod="openshift-infra/auto-csr-approver-29535378-d5m5k" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.516195 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535378-d5m5k" Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.915442 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535378-d5m5k"] Feb 26 16:18:00 crc kubenswrapper[5033]: I0226 16:18:00.924142 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:18:01 crc kubenswrapper[5033]: I0226 16:18:01.479139 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535378-d5m5k" event={"ID":"57c69b9b-6398-4fa9-9563-ba88cc62665f","Type":"ContainerStarted","Data":"68544a2ce06ca7def1768b74f1dbf30846f23e19108af5d7613054dc091157f3"} Feb 26 16:18:03 crc kubenswrapper[5033]: I0226 16:18:03.493182 5033 generic.go:334] "Generic (PLEG): container finished" podID="57c69b9b-6398-4fa9-9563-ba88cc62665f" containerID="d39f043c8e9e4d15bf4dd64a8b4a3e83e560bd909d415282eea63896aeb628d2" exitCode=0 Feb 26 16:18:03 crc kubenswrapper[5033]: I0226 16:18:03.493268 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535378-d5m5k" 
event={"ID":"57c69b9b-6398-4fa9-9563-ba88cc62665f","Type":"ContainerDied","Data":"d39f043c8e9e4d15bf4dd64a8b4a3e83e560bd909d415282eea63896aeb628d2"} Feb 26 16:18:04 crc kubenswrapper[5033]: I0226 16:18:04.762432 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535378-d5m5k" Feb 26 16:18:04 crc kubenswrapper[5033]: I0226 16:18:04.879979 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcl2h\" (UniqueName: \"kubernetes.io/projected/57c69b9b-6398-4fa9-9563-ba88cc62665f-kube-api-access-xcl2h\") pod \"57c69b9b-6398-4fa9-9563-ba88cc62665f\" (UID: \"57c69b9b-6398-4fa9-9563-ba88cc62665f\") " Feb 26 16:18:04 crc kubenswrapper[5033]: I0226 16:18:04.888404 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57c69b9b-6398-4fa9-9563-ba88cc62665f-kube-api-access-xcl2h" (OuterVolumeSpecName: "kube-api-access-xcl2h") pod "57c69b9b-6398-4fa9-9563-ba88cc62665f" (UID: "57c69b9b-6398-4fa9-9563-ba88cc62665f"). InnerVolumeSpecName "kube-api-access-xcl2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:18:04 crc kubenswrapper[5033]: I0226 16:18:04.982609 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcl2h\" (UniqueName: \"kubernetes.io/projected/57c69b9b-6398-4fa9-9563-ba88cc62665f-kube-api-access-xcl2h\") on node \"crc\" DevicePath \"\"" Feb 26 16:18:05 crc kubenswrapper[5033]: I0226 16:18:05.510332 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535378-d5m5k" event={"ID":"57c69b9b-6398-4fa9-9563-ba88cc62665f","Type":"ContainerDied","Data":"68544a2ce06ca7def1768b74f1dbf30846f23e19108af5d7613054dc091157f3"} Feb 26 16:18:05 crc kubenswrapper[5033]: I0226 16:18:05.510398 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68544a2ce06ca7def1768b74f1dbf30846f23e19108af5d7613054dc091157f3" Feb 26 16:18:05 crc kubenswrapper[5033]: I0226 16:18:05.510404 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535378-d5m5k" Feb 26 16:18:05 crc kubenswrapper[5033]: I0226 16:18:05.832500 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535372-kdpjv"] Feb 26 16:18:05 crc kubenswrapper[5033]: I0226 16:18:05.838754 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535372-kdpjv"] Feb 26 16:18:07 crc kubenswrapper[5033]: I0226 16:18:07.637052 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5402975e-da0a-4772-b4c8-23d1b6e6b0c3" path="/var/lib/kubelet/pods/5402975e-da0a-4772-b4c8-23d1b6e6b0c3/volumes" Feb 26 16:18:20 crc kubenswrapper[5033]: I0226 16:18:20.188982 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 26 16:18:20 crc kubenswrapper[5033]: I0226 16:18:20.191753 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:18:30 crc kubenswrapper[5033]: I0226 16:18:30.021456 5033 scope.go:117] "RemoveContainer" containerID="812de937ad4e8d44c191adc4c0ad5bfeb57e0ff351d383736214e0c4eb73e592" Feb 26 16:18:50 crc kubenswrapper[5033]: I0226 16:18:50.189412 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:18:50 crc kubenswrapper[5033]: I0226 16:18:50.190318 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:19:20 crc kubenswrapper[5033]: I0226 16:19:20.189841 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:19:20 crc kubenswrapper[5033]: I0226 16:19:20.190496 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:19:20 crc kubenswrapper[5033]: I0226 16:19:20.190548 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 16:19:20 crc kubenswrapper[5033]: I0226 16:19:20.191355 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:19:20 crc kubenswrapper[5033]: I0226 16:19:20.191436 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" gracePeriod=600 Feb 26 16:19:20 crc kubenswrapper[5033]: E0226 16:19:20.312129 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:19:21 crc kubenswrapper[5033]: I0226 16:19:21.109065 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" exitCode=0 Feb 26 16:19:21 crc kubenswrapper[5033]: I0226 16:19:21.109216 5033 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"} Feb 26 16:19:21 crc kubenswrapper[5033]: I0226 16:19:21.109423 5033 scope.go:117] "RemoveContainer" containerID="3f4ff328ef64b64bd90cfbfbff26eb9b2edc5412a004940ff047398b5e904341" Feb 26 16:19:21 crc kubenswrapper[5033]: I0226 16:19:21.110747 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" Feb 26 16:19:21 crc kubenswrapper[5033]: E0226 16:19:21.111673 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:19:36 crc kubenswrapper[5033]: I0226 16:19:36.628062 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" Feb 26 16:19:36 crc kubenswrapper[5033]: E0226 16:19:36.629072 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:19:51 crc kubenswrapper[5033]: I0226 16:19:51.628845 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" Feb 26 16:19:51 crc kubenswrapper[5033]: E0226 
16:19:51.629634 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.141900 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535380-qgfd7"] Feb 26 16:20:00 crc kubenswrapper[5033]: E0226 16:20:00.142865 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57c69b9b-6398-4fa9-9563-ba88cc62665f" containerName="oc" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.142885 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="57c69b9b-6398-4fa9-9563-ba88cc62665f" containerName="oc" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.143069 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="57c69b9b-6398-4fa9-9563-ba88cc62665f" containerName="oc" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.143773 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535380-qgfd7" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.145796 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.149636 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.150681 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.152415 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535380-qgfd7"] Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.292021 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm44m\" (UniqueName: \"kubernetes.io/projected/ace6d11c-b8ba-4fe0-8c0d-c742a60742a9-kube-api-access-bm44m\") pod \"auto-csr-approver-29535380-qgfd7\" (UID: \"ace6d11c-b8ba-4fe0-8c0d-c742a60742a9\") " pod="openshift-infra/auto-csr-approver-29535380-qgfd7" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.393421 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm44m\" (UniqueName: \"kubernetes.io/projected/ace6d11c-b8ba-4fe0-8c0d-c742a60742a9-kube-api-access-bm44m\") pod \"auto-csr-approver-29535380-qgfd7\" (UID: \"ace6d11c-b8ba-4fe0-8c0d-c742a60742a9\") " pod="openshift-infra/auto-csr-approver-29535380-qgfd7" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.419058 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm44m\" (UniqueName: \"kubernetes.io/projected/ace6d11c-b8ba-4fe0-8c0d-c742a60742a9-kube-api-access-bm44m\") pod \"auto-csr-approver-29535380-qgfd7\" (UID: \"ace6d11c-b8ba-4fe0-8c0d-c742a60742a9\") " 
pod="openshift-infra/auto-csr-approver-29535380-qgfd7" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.465678 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535380-qgfd7" Feb 26 16:20:00 crc kubenswrapper[5033]: I0226 16:20:00.953832 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535380-qgfd7"] Feb 26 16:20:01 crc kubenswrapper[5033]: I0226 16:20:01.424716 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535380-qgfd7" event={"ID":"ace6d11c-b8ba-4fe0-8c0d-c742a60742a9","Type":"ContainerStarted","Data":"335393af333c608a3402a7cb2c33e396bd1daaf451a52f1023313449583494ab"} Feb 26 16:20:02 crc kubenswrapper[5033]: I0226 16:20:02.440729 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535380-qgfd7" event={"ID":"ace6d11c-b8ba-4fe0-8c0d-c742a60742a9","Type":"ContainerStarted","Data":"eabb49bf9465a81eaaacd694b2a443b6fdc016f1a65d840fee28bfa2984b49f0"} Feb 26 16:20:02 crc kubenswrapper[5033]: I0226 16:20:02.458569 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535380-qgfd7" podStartSLOduration=1.379008952 podStartE2EDuration="2.458551146s" podCreationTimestamp="2026-02-26 16:20:00 +0000 UTC" firstStartedPulling="2026-02-26 16:20:00.971656287 +0000 UTC m=+2328.184254967" lastFinishedPulling="2026-02-26 16:20:02.051198511 +0000 UTC m=+2329.263797161" observedRunningTime="2026-02-26 16:20:02.45417382 +0000 UTC m=+2329.666772480" watchObservedRunningTime="2026-02-26 16:20:02.458551146 +0000 UTC m=+2329.671149796" Feb 26 16:20:03 crc kubenswrapper[5033]: I0226 16:20:03.452774 5033 generic.go:334] "Generic (PLEG): container finished" podID="ace6d11c-b8ba-4fe0-8c0d-c742a60742a9" containerID="eabb49bf9465a81eaaacd694b2a443b6fdc016f1a65d840fee28bfa2984b49f0" exitCode=0 Feb 26 16:20:03 crc 
kubenswrapper[5033]: I0226 16:20:03.453385 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535380-qgfd7" event={"ID":"ace6d11c-b8ba-4fe0-8c0d-c742a60742a9","Type":"ContainerDied","Data":"eabb49bf9465a81eaaacd694b2a443b6fdc016f1a65d840fee28bfa2984b49f0"}
Feb 26 16:20:04 crc kubenswrapper[5033]: I0226 16:20:04.712487 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535380-qgfd7"
Feb 26 16:20:04 crc kubenswrapper[5033]: I0226 16:20:04.767238 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm44m\" (UniqueName: \"kubernetes.io/projected/ace6d11c-b8ba-4fe0-8c0d-c742a60742a9-kube-api-access-bm44m\") pod \"ace6d11c-b8ba-4fe0-8c0d-c742a60742a9\" (UID: \"ace6d11c-b8ba-4fe0-8c0d-c742a60742a9\") "
Feb 26 16:20:04 crc kubenswrapper[5033]: I0226 16:20:04.772053 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace6d11c-b8ba-4fe0-8c0d-c742a60742a9-kube-api-access-bm44m" (OuterVolumeSpecName: "kube-api-access-bm44m") pod "ace6d11c-b8ba-4fe0-8c0d-c742a60742a9" (UID: "ace6d11c-b8ba-4fe0-8c0d-c742a60742a9"). InnerVolumeSpecName "kube-api-access-bm44m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:20:04 crc kubenswrapper[5033]: I0226 16:20:04.868862 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm44m\" (UniqueName: \"kubernetes.io/projected/ace6d11c-b8ba-4fe0-8c0d-c742a60742a9-kube-api-access-bm44m\") on node \"crc\" DevicePath \"\""
Feb 26 16:20:05 crc kubenswrapper[5033]: I0226 16:20:05.467461 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535380-qgfd7" event={"ID":"ace6d11c-b8ba-4fe0-8c0d-c742a60742a9","Type":"ContainerDied","Data":"335393af333c608a3402a7cb2c33e396bd1daaf451a52f1023313449583494ab"}
Feb 26 16:20:05 crc kubenswrapper[5033]: I0226 16:20:05.467731 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="335393af333c608a3402a7cb2c33e396bd1daaf451a52f1023313449583494ab"
Feb 26 16:20:05 crc kubenswrapper[5033]: I0226 16:20:05.467577 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535380-qgfd7"
Feb 26 16:20:05 crc kubenswrapper[5033]: I0226 16:20:05.538598 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535374-79fct"]
Feb 26 16:20:05 crc kubenswrapper[5033]: I0226 16:20:05.546056 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535374-79fct"]
Feb 26 16:20:05 crc kubenswrapper[5033]: I0226 16:20:05.645199 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0936582-1465-439f-848a-bee5ff2bd040" path="/var/lib/kubelet/pods/e0936582-1465-439f-848a-bee5ff2bd040/volumes"
Feb 26 16:20:06 crc kubenswrapper[5033]: I0226 16:20:06.628107 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:20:06 crc kubenswrapper[5033]: E0226 16:20:06.628349 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:20:18 crc kubenswrapper[5033]: I0226 16:20:18.628021 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:20:18 crc kubenswrapper[5033]: E0226 16:20:18.628988 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:20:30 crc kubenswrapper[5033]: I0226 16:20:30.103643 5033 scope.go:117] "RemoveContainer" containerID="36de38cc88a7f6ee5d66b7071512481cf87e2ea3c360c2796ee07b23eedf06da"
Feb 26 16:20:33 crc kubenswrapper[5033]: I0226 16:20:33.631970 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:20:33 crc kubenswrapper[5033]: E0226 16:20:33.633136 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.048200 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p95wb"]
Feb 26 16:20:45 crc kubenswrapper[5033]: E0226 16:20:45.050094 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace6d11c-b8ba-4fe0-8c0d-c742a60742a9" containerName="oc"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.050169 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace6d11c-b8ba-4fe0-8c0d-c742a60742a9" containerName="oc"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.050391 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace6d11c-b8ba-4fe0-8c0d-c742a60742a9" containerName="oc"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.051424 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.067513 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p95wb"]
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.223522 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-utilities\") pod \"community-operators-p95wb\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") " pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.223601 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-catalog-content\") pod \"community-operators-p95wb\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") " pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.223644 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c988f\" (UniqueName: \"kubernetes.io/projected/58bb6283-2edd-4844-af7d-75a3cd0458ff-kube-api-access-c988f\") pod \"community-operators-p95wb\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") " pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.324801 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c988f\" (UniqueName: \"kubernetes.io/projected/58bb6283-2edd-4844-af7d-75a3cd0458ff-kube-api-access-c988f\") pod \"community-operators-p95wb\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") " pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.324900 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-utilities\") pod \"community-operators-p95wb\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") " pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.324952 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-catalog-content\") pod \"community-operators-p95wb\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") " pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.325430 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-catalog-content\") pod \"community-operators-p95wb\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") " pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.325573 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-utilities\") pod \"community-operators-p95wb\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") " pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.345242 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c988f\" (UniqueName: \"kubernetes.io/projected/58bb6283-2edd-4844-af7d-75a3cd0458ff-kube-api-access-c988f\") pod \"community-operators-p95wb\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") " pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.377584 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:45 crc kubenswrapper[5033]: I0226 16:20:45.925989 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p95wb"]
Feb 26 16:20:46 crc kubenswrapper[5033]: I0226 16:20:46.756870 5033 generic.go:334] "Generic (PLEG): container finished" podID="58bb6283-2edd-4844-af7d-75a3cd0458ff" containerID="469aef4d9a0e42dee686ca425c14af5139e1902456056df0b661bdbdf35c7537" exitCode=0
Feb 26 16:20:46 crc kubenswrapper[5033]: I0226 16:20:46.756936 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95wb" event={"ID":"58bb6283-2edd-4844-af7d-75a3cd0458ff","Type":"ContainerDied","Data":"469aef4d9a0e42dee686ca425c14af5139e1902456056df0b661bdbdf35c7537"}
Feb 26 16:20:46 crc kubenswrapper[5033]: I0226 16:20:46.756975 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95wb" event={"ID":"58bb6283-2edd-4844-af7d-75a3cd0458ff","Type":"ContainerStarted","Data":"2012b2a962fd06f03e3ef2cfe27118b37d92eba29ce4a1af003179cac775bcf5"}
Feb 26 16:20:47 crc kubenswrapper[5033]: I0226 16:20:47.765103 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95wb" event={"ID":"58bb6283-2edd-4844-af7d-75a3cd0458ff","Type":"ContainerStarted","Data":"df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0"}
Feb 26 16:20:48 crc kubenswrapper[5033]: I0226 16:20:48.628040 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:20:48 crc kubenswrapper[5033]: E0226 16:20:48.628700 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:20:48 crc kubenswrapper[5033]: I0226 16:20:48.773008 5033 generic.go:334] "Generic (PLEG): container finished" podID="58bb6283-2edd-4844-af7d-75a3cd0458ff" containerID="df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0" exitCode=0
Feb 26 16:20:48 crc kubenswrapper[5033]: I0226 16:20:48.773057 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95wb" event={"ID":"58bb6283-2edd-4844-af7d-75a3cd0458ff","Type":"ContainerDied","Data":"df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0"}
Feb 26 16:20:49 crc kubenswrapper[5033]: I0226 16:20:49.785204 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95wb" event={"ID":"58bb6283-2edd-4844-af7d-75a3cd0458ff","Type":"ContainerStarted","Data":"2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376"}
Feb 26 16:20:49 crc kubenswrapper[5033]: I0226 16:20:49.804601 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p95wb" podStartSLOduration=2.197514015 podStartE2EDuration="4.804584902s" podCreationTimestamp="2026-02-26 16:20:45 +0000 UTC" firstStartedPulling="2026-02-26 16:20:46.760549823 +0000 UTC m=+2373.973148513" lastFinishedPulling="2026-02-26 16:20:49.36762074 +0000 UTC m=+2376.580219400" observedRunningTime="2026-02-26 16:20:49.803878934 +0000 UTC m=+2377.016477584" watchObservedRunningTime="2026-02-26 16:20:49.804584902 +0000 UTC m=+2377.017183552"
Feb 26 16:20:55 crc kubenswrapper[5033]: I0226 16:20:55.378287 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:55 crc kubenswrapper[5033]: I0226 16:20:55.378891 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:55 crc kubenswrapper[5033]: I0226 16:20:55.432792 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:55 crc kubenswrapper[5033]: I0226 16:20:55.889952 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:55 crc kubenswrapper[5033]: I0226 16:20:55.942048 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p95wb"]
Feb 26 16:20:57 crc kubenswrapper[5033]: I0226 16:20:57.841993 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p95wb" podUID="58bb6283-2edd-4844-af7d-75a3cd0458ff" containerName="registry-server" containerID="cri-o://2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376" gracePeriod=2
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.231286 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.256008 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c988f\" (UniqueName: \"kubernetes.io/projected/58bb6283-2edd-4844-af7d-75a3cd0458ff-kube-api-access-c988f\") pod \"58bb6283-2edd-4844-af7d-75a3cd0458ff\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") "
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.256089 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-catalog-content\") pod \"58bb6283-2edd-4844-af7d-75a3cd0458ff\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") "
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.256164 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-utilities\") pod \"58bb6283-2edd-4844-af7d-75a3cd0458ff\" (UID: \"58bb6283-2edd-4844-af7d-75a3cd0458ff\") "
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.257247 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-utilities" (OuterVolumeSpecName: "utilities") pod "58bb6283-2edd-4844-af7d-75a3cd0458ff" (UID: "58bb6283-2edd-4844-af7d-75a3cd0458ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.263856 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58bb6283-2edd-4844-af7d-75a3cd0458ff-kube-api-access-c988f" (OuterVolumeSpecName: "kube-api-access-c988f") pod "58bb6283-2edd-4844-af7d-75a3cd0458ff" (UID: "58bb6283-2edd-4844-af7d-75a3cd0458ff"). InnerVolumeSpecName "kube-api-access-c988f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.324764 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58bb6283-2edd-4844-af7d-75a3cd0458ff" (UID: "58bb6283-2edd-4844-af7d-75a3cd0458ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.357284 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c988f\" (UniqueName: \"kubernetes.io/projected/58bb6283-2edd-4844-af7d-75a3cd0458ff-kube-api-access-c988f\") on node \"crc\" DevicePath \"\""
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.357325 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.357335 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58bb6283-2edd-4844-af7d-75a3cd0458ff-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.857565 5033 generic.go:334] "Generic (PLEG): container finished" podID="58bb6283-2edd-4844-af7d-75a3cd0458ff" containerID="2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376" exitCode=0
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.857613 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95wb" event={"ID":"58bb6283-2edd-4844-af7d-75a3cd0458ff","Type":"ContainerDied","Data":"2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376"}
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.857653 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p95wb"
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.857673 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p95wb" event={"ID":"58bb6283-2edd-4844-af7d-75a3cd0458ff","Type":"ContainerDied","Data":"2012b2a962fd06f03e3ef2cfe27118b37d92eba29ce4a1af003179cac775bcf5"}
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.857695 5033 scope.go:117] "RemoveContainer" containerID="2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376"
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.899963 5033 scope.go:117] "RemoveContainer" containerID="df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0"
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.906599 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p95wb"]
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.914091 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p95wb"]
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.920107 5033 scope.go:117] "RemoveContainer" containerID="469aef4d9a0e42dee686ca425c14af5139e1902456056df0b661bdbdf35c7537"
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.943830 5033 scope.go:117] "RemoveContainer" containerID="2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376"
Feb 26 16:20:58 crc kubenswrapper[5033]: E0226 16:20:58.944641 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376\": container with ID starting with 2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376 not found: ID does not exist" containerID="2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376"
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.944822 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376"} err="failed to get container status \"2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376\": rpc error: code = NotFound desc = could not find container \"2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376\": container with ID starting with 2b994155fb8cfa70e4d31b33684c22c3618014a3d24f46de464b641ded46e376 not found: ID does not exist"
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.944857 5033 scope.go:117] "RemoveContainer" containerID="df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0"
Feb 26 16:20:58 crc kubenswrapper[5033]: E0226 16:20:58.945570 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0\": container with ID starting with df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0 not found: ID does not exist" containerID="df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0"
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.945623 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0"} err="failed to get container status \"df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0\": rpc error: code = NotFound desc = could not find container \"df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0\": container with ID starting with df282b85ef4568bce45acc78a7b9a2dd9bd3d94623a9f1e7bcf6e1d9461798b0 not found: ID does not exist"
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.945650 5033 scope.go:117] "RemoveContainer" containerID="469aef4d9a0e42dee686ca425c14af5139e1902456056df0b661bdbdf35c7537"
Feb 26 16:20:58 crc kubenswrapper[5033]: E0226 16:20:58.946027 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"469aef4d9a0e42dee686ca425c14af5139e1902456056df0b661bdbdf35c7537\": container with ID starting with 469aef4d9a0e42dee686ca425c14af5139e1902456056df0b661bdbdf35c7537 not found: ID does not exist" containerID="469aef4d9a0e42dee686ca425c14af5139e1902456056df0b661bdbdf35c7537"
Feb 26 16:20:58 crc kubenswrapper[5033]: I0226 16:20:58.946065 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469aef4d9a0e42dee686ca425c14af5139e1902456056df0b661bdbdf35c7537"} err="failed to get container status \"469aef4d9a0e42dee686ca425c14af5139e1902456056df0b661bdbdf35c7537\": rpc error: code = NotFound desc = could not find container \"469aef4d9a0e42dee686ca425c14af5139e1902456056df0b661bdbdf35c7537\": container with ID starting with 469aef4d9a0e42dee686ca425c14af5139e1902456056df0b661bdbdf35c7537 not found: ID does not exist"
Feb 26 16:20:59 crc kubenswrapper[5033]: I0226 16:20:59.628197 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:20:59 crc kubenswrapper[5033]: E0226 16:20:59.628736 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:20:59 crc kubenswrapper[5033]: I0226 16:20:59.640653 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58bb6283-2edd-4844-af7d-75a3cd0458ff" path="/var/lib/kubelet/pods/58bb6283-2edd-4844-af7d-75a3cd0458ff/volumes"
Feb 26 16:21:11 crc kubenswrapper[5033]: I0226 16:21:11.628574 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:21:11 crc kubenswrapper[5033]: E0226 16:21:11.629295 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:21:24 crc kubenswrapper[5033]: I0226 16:21:24.627764 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:21:24 crc kubenswrapper[5033]: E0226 16:21:24.628199 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:21:37 crc kubenswrapper[5033]: I0226 16:21:37.628993 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:21:37 crc kubenswrapper[5033]: E0226 16:21:37.630042 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:21:48 crc kubenswrapper[5033]: I0226 16:21:48.629066 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:21:48 crc kubenswrapper[5033]: E0226 16:21:48.630358 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.148402 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535382-v9hkm"]
Feb 26 16:22:00 crc kubenswrapper[5033]: E0226 16:22:00.149293 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58bb6283-2edd-4844-af7d-75a3cd0458ff" containerName="extract-content"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.149306 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="58bb6283-2edd-4844-af7d-75a3cd0458ff" containerName="extract-content"
Feb 26 16:22:00 crc kubenswrapper[5033]: E0226 16:22:00.149328 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58bb6283-2edd-4844-af7d-75a3cd0458ff" containerName="extract-utilities"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.149334 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="58bb6283-2edd-4844-af7d-75a3cd0458ff" containerName="extract-utilities"
Feb 26 16:22:00 crc kubenswrapper[5033]: E0226 16:22:00.149353 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58bb6283-2edd-4844-af7d-75a3cd0458ff" containerName="registry-server"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.149359 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="58bb6283-2edd-4844-af7d-75a3cd0458ff" containerName="registry-server"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.149483 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="58bb6283-2edd-4844-af7d-75a3cd0458ff" containerName="registry-server"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.149980 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535382-v9hkm"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.153892 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.154057 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.156705 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.162124 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535382-v9hkm"]
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.264157 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjjbx\" (UniqueName: \"kubernetes.io/projected/90f11bd3-65fa-4240-9f2f-b9a530a62bb3-kube-api-access-xjjbx\") pod \"auto-csr-approver-29535382-v9hkm\" (UID: \"90f11bd3-65fa-4240-9f2f-b9a530a62bb3\") " pod="openshift-infra/auto-csr-approver-29535382-v9hkm"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.366005 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjjbx\" (UniqueName: \"kubernetes.io/projected/90f11bd3-65fa-4240-9f2f-b9a530a62bb3-kube-api-access-xjjbx\") pod \"auto-csr-approver-29535382-v9hkm\" (UID: \"90f11bd3-65fa-4240-9f2f-b9a530a62bb3\") " pod="openshift-infra/auto-csr-approver-29535382-v9hkm"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.388620 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjjbx\" (UniqueName: \"kubernetes.io/projected/90f11bd3-65fa-4240-9f2f-b9a530a62bb3-kube-api-access-xjjbx\") pod \"auto-csr-approver-29535382-v9hkm\" (UID: \"90f11bd3-65fa-4240-9f2f-b9a530a62bb3\") " pod="openshift-infra/auto-csr-approver-29535382-v9hkm"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.475143 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535382-v9hkm"
Feb 26 16:22:00 crc kubenswrapper[5033]: I0226 16:22:00.912245 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535382-v9hkm"]
Feb 26 16:22:01 crc kubenswrapper[5033]: I0226 16:22:01.335507 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535382-v9hkm" event={"ID":"90f11bd3-65fa-4240-9f2f-b9a530a62bb3","Type":"ContainerStarted","Data":"eb791cbe20c4a96986e0043b27a15f310b1c1a0db38e013d19f8a4df13f50efd"}
Feb 26 16:22:02 crc kubenswrapper[5033]: I0226 16:22:02.342876 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535382-v9hkm" event={"ID":"90f11bd3-65fa-4240-9f2f-b9a530a62bb3","Type":"ContainerStarted","Data":"04c8ed43e5413747c0601dc86510b0bf7df1904a69281e0af47aa7da1a1f8be4"}
Feb 26 16:22:02 crc kubenswrapper[5033]: I0226 16:22:02.363321 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535382-v9hkm" podStartSLOduration=1.398685212 podStartE2EDuration="2.363303203s" podCreationTimestamp="2026-02-26 16:22:00 +0000 UTC" firstStartedPulling="2026-02-26 16:22:00.928412653 +0000 UTC m=+2448.141011303" lastFinishedPulling="2026-02-26 16:22:01.893030644 +0000 UTC m=+2449.105629294" observedRunningTime="2026-02-26 16:22:02.3578449 +0000 UTC m=+2449.570443560" watchObservedRunningTime="2026-02-26 16:22:02.363303203 +0000 UTC m=+2449.575901843"
Feb 26 16:22:03 crc kubenswrapper[5033]: I0226 16:22:03.350561 5033 generic.go:334] "Generic (PLEG): container finished" podID="90f11bd3-65fa-4240-9f2f-b9a530a62bb3" containerID="04c8ed43e5413747c0601dc86510b0bf7df1904a69281e0af47aa7da1a1f8be4" exitCode=0
Feb 26 16:22:03 crc kubenswrapper[5033]: I0226 16:22:03.350667 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535382-v9hkm" event={"ID":"90f11bd3-65fa-4240-9f2f-b9a530a62bb3","Type":"ContainerDied","Data":"04c8ed43e5413747c0601dc86510b0bf7df1904a69281e0af47aa7da1a1f8be4"}
Feb 26 16:22:03 crc kubenswrapper[5033]: I0226 16:22:03.636810 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:22:03 crc kubenswrapper[5033]: E0226 16:22:03.637170 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:22:04 crc kubenswrapper[5033]: I0226 16:22:04.736620 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535382-v9hkm"
Feb 26 16:22:04 crc kubenswrapper[5033]: I0226 16:22:04.828860 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjjbx\" (UniqueName: \"kubernetes.io/projected/90f11bd3-65fa-4240-9f2f-b9a530a62bb3-kube-api-access-xjjbx\") pod \"90f11bd3-65fa-4240-9f2f-b9a530a62bb3\" (UID: \"90f11bd3-65fa-4240-9f2f-b9a530a62bb3\") "
Feb 26 16:22:04 crc kubenswrapper[5033]: I0226 16:22:04.833510 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f11bd3-65fa-4240-9f2f-b9a530a62bb3-kube-api-access-xjjbx" (OuterVolumeSpecName: "kube-api-access-xjjbx") pod "90f11bd3-65fa-4240-9f2f-b9a530a62bb3" (UID: "90f11bd3-65fa-4240-9f2f-b9a530a62bb3"). InnerVolumeSpecName "kube-api-access-xjjbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:22:04 crc kubenswrapper[5033]: I0226 16:22:04.930979 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjjbx\" (UniqueName: \"kubernetes.io/projected/90f11bd3-65fa-4240-9f2f-b9a530a62bb3-kube-api-access-xjjbx\") on node \"crc\" DevicePath \"\""
Feb 26 16:22:05 crc kubenswrapper[5033]: I0226 16:22:05.377035 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535382-v9hkm" event={"ID":"90f11bd3-65fa-4240-9f2f-b9a530a62bb3","Type":"ContainerDied","Data":"eb791cbe20c4a96986e0043b27a15f310b1c1a0db38e013d19f8a4df13f50efd"}
Feb 26 16:22:05 crc kubenswrapper[5033]: I0226 16:22:05.377091 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535382-v9hkm"
Feb 26 16:22:05 crc kubenswrapper[5033]: I0226 16:22:05.377089 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb791cbe20c4a96986e0043b27a15f310b1c1a0db38e013d19f8a4df13f50efd"
Feb 26 16:22:05 crc kubenswrapper[5033]: I0226 16:22:05.447502 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535376-pn9vv"]
Feb 26 16:22:05 crc kubenswrapper[5033]: I0226 16:22:05.454925 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535376-pn9vv"]
Feb 26 16:22:05 crc kubenswrapper[5033]: I0226 16:22:05.638857 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4449adb6-eeb2-4ccb-99b8-599561ac2534" path="/var/lib/kubelet/pods/4449adb6-eeb2-4ccb-99b8-599561ac2534/volumes"
Feb 26 16:22:14 crc kubenswrapper[5033]: I0226 16:22:14.629327 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:22:14 crc kubenswrapper[5033]: E0226 16:22:14.630169 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:22:25 crc kubenswrapper[5033]: I0226 16:22:25.628218 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:22:25 crc kubenswrapper[5033]: E0226 16:22:25.628847 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:22:30 crc kubenswrapper[5033]: I0226 16:22:30.197580 5033 scope.go:117] "RemoveContainer" containerID="2a214dab3830fc595d0c6d62dcce9a09c62f5281bd1f71ec01ed9a65f4a716d1"
Feb 26 16:22:39 crc kubenswrapper[5033]: I0226 16:22:39.628739 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:22:39 crc kubenswrapper[5033]: E0226 16:22:39.629542 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:22:50 crc kubenswrapper[5033]: I0226 16:22:50.627669 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:22:50 crc kubenswrapper[5033]: E0226 16:22:50.628743 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:23:02 crc kubenswrapper[5033]: I0226 16:23:02.627884 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67"
Feb 26 16:23:02 crc kubenswrapper[5033]:
E0226 16:23:02.628673 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:23:16 crc kubenswrapper[5033]: I0226 16:23:16.628975 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" Feb 26 16:23:16 crc kubenswrapper[5033]: E0226 16:23:16.630069 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:23:31 crc kubenswrapper[5033]: I0226 16:23:31.628991 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" Feb 26 16:23:31 crc kubenswrapper[5033]: E0226 16:23:31.630323 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:23:45 crc kubenswrapper[5033]: I0226 16:23:45.628662 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" Feb 26 16:23:45 crc 
kubenswrapper[5033]: E0226 16:23:45.629846 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:23:59 crc kubenswrapper[5033]: I0226 16:23:59.628206 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" Feb 26 16:23:59 crc kubenswrapper[5033]: E0226 16:23:59.628905 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.156416 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535384-bcxls"] Feb 26 16:24:00 crc kubenswrapper[5033]: E0226 16:24:00.157064 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f11bd3-65fa-4240-9f2f-b9a530a62bb3" containerName="oc" Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.157084 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f11bd3-65fa-4240-9f2f-b9a530a62bb3" containerName="oc" Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.157298 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f11bd3-65fa-4240-9f2f-b9a530a62bb3" containerName="oc" Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.157857 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535384-bcxls" Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.160349 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.160461 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.160953 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.173530 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535384-bcxls"] Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.332314 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj8ld\" (UniqueName: \"kubernetes.io/projected/c9f904ab-f4ed-40ab-9927-294d445b5d67-kube-api-access-kj8ld\") pod \"auto-csr-approver-29535384-bcxls\" (UID: \"c9f904ab-f4ed-40ab-9927-294d445b5d67\") " pod="openshift-infra/auto-csr-approver-29535384-bcxls" Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.436183 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj8ld\" (UniqueName: \"kubernetes.io/projected/c9f904ab-f4ed-40ab-9927-294d445b5d67-kube-api-access-kj8ld\") pod \"auto-csr-approver-29535384-bcxls\" (UID: \"c9f904ab-f4ed-40ab-9927-294d445b5d67\") " pod="openshift-infra/auto-csr-approver-29535384-bcxls" Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.473085 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj8ld\" (UniqueName: \"kubernetes.io/projected/c9f904ab-f4ed-40ab-9927-294d445b5d67-kube-api-access-kj8ld\") pod \"auto-csr-approver-29535384-bcxls\" (UID: \"c9f904ab-f4ed-40ab-9927-294d445b5d67\") " 
pod="openshift-infra/auto-csr-approver-29535384-bcxls" Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.484008 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535384-bcxls" Feb 26 16:24:00 crc kubenswrapper[5033]: W0226 16:24:00.936490 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9f904ab_f4ed_40ab_9927_294d445b5d67.slice/crio-1bbc68ada7a5089b19a81f99da35ae5cd0d664a152736429cb6170a031d8369f WatchSource:0}: Error finding container 1bbc68ada7a5089b19a81f99da35ae5cd0d664a152736429cb6170a031d8369f: Status 404 returned error can't find the container with id 1bbc68ada7a5089b19a81f99da35ae5cd0d664a152736429cb6170a031d8369f Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.938213 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535384-bcxls"] Feb 26 16:24:00 crc kubenswrapper[5033]: I0226 16:24:00.938598 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:24:01 crc kubenswrapper[5033]: I0226 16:24:01.356171 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535384-bcxls" event={"ID":"c9f904ab-f4ed-40ab-9927-294d445b5d67","Type":"ContainerStarted","Data":"1bbc68ada7a5089b19a81f99da35ae5cd0d664a152736429cb6170a031d8369f"} Feb 26 16:24:02 crc kubenswrapper[5033]: I0226 16:24:02.364129 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535384-bcxls" event={"ID":"c9f904ab-f4ed-40ab-9927-294d445b5d67","Type":"ContainerStarted","Data":"5685d604f74648b889b7d355da1f8725ab416f0fd58a894bbf7cc4f0fd7bad77"} Feb 26 16:24:02 crc kubenswrapper[5033]: I0226 16:24:02.378571 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535384-bcxls" 
podStartSLOduration=1.374666997 podStartE2EDuration="2.378549888s" podCreationTimestamp="2026-02-26 16:24:00 +0000 UTC" firstStartedPulling="2026-02-26 16:24:00.938241525 +0000 UTC m=+2568.150840175" lastFinishedPulling="2026-02-26 16:24:01.942124416 +0000 UTC m=+2569.154723066" observedRunningTime="2026-02-26 16:24:02.377075812 +0000 UTC m=+2569.589674472" watchObservedRunningTime="2026-02-26 16:24:02.378549888 +0000 UTC m=+2569.591148538" Feb 26 16:24:03 crc kubenswrapper[5033]: I0226 16:24:03.372912 5033 generic.go:334] "Generic (PLEG): container finished" podID="c9f904ab-f4ed-40ab-9927-294d445b5d67" containerID="5685d604f74648b889b7d355da1f8725ab416f0fd58a894bbf7cc4f0fd7bad77" exitCode=0 Feb 26 16:24:03 crc kubenswrapper[5033]: I0226 16:24:03.372977 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535384-bcxls" event={"ID":"c9f904ab-f4ed-40ab-9927-294d445b5d67","Type":"ContainerDied","Data":"5685d604f74648b889b7d355da1f8725ab416f0fd58a894bbf7cc4f0fd7bad77"} Feb 26 16:24:04 crc kubenswrapper[5033]: I0226 16:24:04.717728 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535384-bcxls" Feb 26 16:24:04 crc kubenswrapper[5033]: I0226 16:24:04.908237 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj8ld\" (UniqueName: \"kubernetes.io/projected/c9f904ab-f4ed-40ab-9927-294d445b5d67-kube-api-access-kj8ld\") pod \"c9f904ab-f4ed-40ab-9927-294d445b5d67\" (UID: \"c9f904ab-f4ed-40ab-9927-294d445b5d67\") " Feb 26 16:24:04 crc kubenswrapper[5033]: I0226 16:24:04.916930 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9f904ab-f4ed-40ab-9927-294d445b5d67-kube-api-access-kj8ld" (OuterVolumeSpecName: "kube-api-access-kj8ld") pod "c9f904ab-f4ed-40ab-9927-294d445b5d67" (UID: "c9f904ab-f4ed-40ab-9927-294d445b5d67"). InnerVolumeSpecName "kube-api-access-kj8ld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:24:05 crc kubenswrapper[5033]: I0226 16:24:05.009887 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj8ld\" (UniqueName: \"kubernetes.io/projected/c9f904ab-f4ed-40ab-9927-294d445b5d67-kube-api-access-kj8ld\") on node \"crc\" DevicePath \"\"" Feb 26 16:24:05 crc kubenswrapper[5033]: I0226 16:24:05.392970 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535384-bcxls" event={"ID":"c9f904ab-f4ed-40ab-9927-294d445b5d67","Type":"ContainerDied","Data":"1bbc68ada7a5089b19a81f99da35ae5cd0d664a152736429cb6170a031d8369f"} Feb 26 16:24:05 crc kubenswrapper[5033]: I0226 16:24:05.393049 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bbc68ada7a5089b19a81f99da35ae5cd0d664a152736429cb6170a031d8369f" Feb 26 16:24:05 crc kubenswrapper[5033]: I0226 16:24:05.392999 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535384-bcxls" Feb 26 16:24:05 crc kubenswrapper[5033]: I0226 16:24:05.458866 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535378-d5m5k"] Feb 26 16:24:05 crc kubenswrapper[5033]: I0226 16:24:05.463823 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535378-d5m5k"] Feb 26 16:24:05 crc kubenswrapper[5033]: I0226 16:24:05.636418 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57c69b9b-6398-4fa9-9563-ba88cc62665f" path="/var/lib/kubelet/pods/57c69b9b-6398-4fa9-9563-ba88cc62665f/volumes" Feb 26 16:24:13 crc kubenswrapper[5033]: I0226 16:24:13.632880 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" Feb 26 16:24:13 crc kubenswrapper[5033]: E0226 16:24:13.633686 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:24:24 crc kubenswrapper[5033]: I0226 16:24:24.628521 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" Feb 26 16:24:25 crc kubenswrapper[5033]: I0226 16:24:25.550697 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"dad6881e86f5b0bfddfa2805c22679ea01f0f421cdafb8e298b9c2c7e12fdd3c"} Feb 26 16:24:30 crc kubenswrapper[5033]: I0226 16:24:30.298040 5033 scope.go:117] "RemoveContainer" containerID="d39f043c8e9e4d15bf4dd64a8b4a3e83e560bd909d415282eea63896aeb628d2" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.223392 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ndt75"] Feb 26 16:25:35 crc kubenswrapper[5033]: E0226 16:25:35.224509 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f904ab-f4ed-40ab-9927-294d445b5d67" containerName="oc" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.224531 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f904ab-f4ed-40ab-9927-294d445b5d67" containerName="oc" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.224754 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f904ab-f4ed-40ab-9927-294d445b5d67" containerName="oc" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.226068 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.238238 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ndt75"] Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.279014 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g6jj\" (UniqueName: \"kubernetes.io/projected/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-kube-api-access-4g6jj\") pod \"certified-operators-ndt75\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.279062 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-catalog-content\") pod \"certified-operators-ndt75\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.279128 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-utilities\") pod \"certified-operators-ndt75\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.380436 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g6jj\" (UniqueName: \"kubernetes.io/projected/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-kube-api-access-4g6jj\") pod \"certified-operators-ndt75\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.380507 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-catalog-content\") pod \"certified-operators-ndt75\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.380559 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-utilities\") pod \"certified-operators-ndt75\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.381248 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-utilities\") pod \"certified-operators-ndt75\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.381365 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-catalog-content\") pod \"certified-operators-ndt75\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.400617 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g6jj\" (UniqueName: \"kubernetes.io/projected/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-kube-api-access-4g6jj\") pod \"certified-operators-ndt75\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:35 crc kubenswrapper[5033]: I0226 16:25:35.598421 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:36 crc kubenswrapper[5033]: I0226 16:25:36.064107 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ndt75"] Feb 26 16:25:36 crc kubenswrapper[5033]: I0226 16:25:36.084215 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndt75" event={"ID":"fbab4cba-2bbc-4434-974c-1b1cfd25f26d","Type":"ContainerStarted","Data":"bee298457c21b8d9db75c417ba5ee82459b0fcb8fa03927cf6ef163343f3bd32"} Feb 26 16:25:37 crc kubenswrapper[5033]: I0226 16:25:37.095666 5033 generic.go:334] "Generic (PLEG): container finished" podID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" containerID="1404d5089b3afed63825ff90b705fd568b86d76c0635db2181b2d4136bf46173" exitCode=0 Feb 26 16:25:37 crc kubenswrapper[5033]: I0226 16:25:37.095733 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndt75" event={"ID":"fbab4cba-2bbc-4434-974c-1b1cfd25f26d","Type":"ContainerDied","Data":"1404d5089b3afed63825ff90b705fd568b86d76c0635db2181b2d4136bf46173"} Feb 26 16:25:38 crc kubenswrapper[5033]: I0226 16:25:38.104458 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndt75" event={"ID":"fbab4cba-2bbc-4434-974c-1b1cfd25f26d","Type":"ContainerStarted","Data":"052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5"} Feb 26 16:25:39 crc kubenswrapper[5033]: I0226 16:25:39.114558 5033 generic.go:334] "Generic (PLEG): container finished" podID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" containerID="052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5" exitCode=0 Feb 26 16:25:39 crc kubenswrapper[5033]: I0226 16:25:39.114969 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndt75" 
event={"ID":"fbab4cba-2bbc-4434-974c-1b1cfd25f26d","Type":"ContainerDied","Data":"052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5"} Feb 26 16:25:40 crc kubenswrapper[5033]: I0226 16:25:40.123707 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndt75" event={"ID":"fbab4cba-2bbc-4434-974c-1b1cfd25f26d","Type":"ContainerStarted","Data":"b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a"} Feb 26 16:25:40 crc kubenswrapper[5033]: I0226 16:25:40.144282 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ndt75" podStartSLOduration=2.497331073 podStartE2EDuration="5.144231293s" podCreationTimestamp="2026-02-26 16:25:35 +0000 UTC" firstStartedPulling="2026-02-26 16:25:37.098202311 +0000 UTC m=+2664.310800971" lastFinishedPulling="2026-02-26 16:25:39.745102541 +0000 UTC m=+2666.957701191" observedRunningTime="2026-02-26 16:25:40.140237617 +0000 UTC m=+2667.352836287" watchObservedRunningTime="2026-02-26 16:25:40.144231293 +0000 UTC m=+2667.356829983" Feb 26 16:25:45 crc kubenswrapper[5033]: I0226 16:25:45.599353 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:45 crc kubenswrapper[5033]: I0226 16:25:45.600035 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:45 crc kubenswrapper[5033]: I0226 16:25:45.672492 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:46 crc kubenswrapper[5033]: I0226 16:25:46.216854 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:46 crc kubenswrapper[5033]: I0226 16:25:46.283633 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-ndt75"] Feb 26 16:25:48 crc kubenswrapper[5033]: I0226 16:25:48.191605 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ndt75" podUID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" containerName="registry-server" containerID="cri-o://b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a" gracePeriod=2 Feb 26 16:25:49 crc kubenswrapper[5033]: I0226 16:25:49.854177 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:49 crc kubenswrapper[5033]: I0226 16:25:49.962857 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-utilities\") pod \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " Feb 26 16:25:49 crc kubenswrapper[5033]: I0226 16:25:49.962931 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g6jj\" (UniqueName: \"kubernetes.io/projected/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-kube-api-access-4g6jj\") pod \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " Feb 26 16:25:49 crc kubenswrapper[5033]: I0226 16:25:49.962953 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-catalog-content\") pod \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\" (UID: \"fbab4cba-2bbc-4434-974c-1b1cfd25f26d\") " Feb 26 16:25:49 crc kubenswrapper[5033]: I0226 16:25:49.963808 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-utilities" (OuterVolumeSpecName: "utilities") pod "fbab4cba-2bbc-4434-974c-1b1cfd25f26d" (UID: 
"fbab4cba-2bbc-4434-974c-1b1cfd25f26d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:25:49 crc kubenswrapper[5033]: I0226 16:25:49.970371 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-kube-api-access-4g6jj" (OuterVolumeSpecName: "kube-api-access-4g6jj") pod "fbab4cba-2bbc-4434-974c-1b1cfd25f26d" (UID: "fbab4cba-2bbc-4434-974c-1b1cfd25f26d"). InnerVolumeSpecName "kube-api-access-4g6jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.022415 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbab4cba-2bbc-4434-974c-1b1cfd25f26d" (UID: "fbab4cba-2bbc-4434-974c-1b1cfd25f26d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.064361 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g6jj\" (UniqueName: \"kubernetes.io/projected/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-kube-api-access-4g6jj\") on node \"crc\" DevicePath \"\"" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.064396 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.064406 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbab4cba-2bbc-4434-974c-1b1cfd25f26d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.207915 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" containerID="b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a" exitCode=0 Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.207991 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndt75" event={"ID":"fbab4cba-2bbc-4434-974c-1b1cfd25f26d","Type":"ContainerDied","Data":"b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a"} Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.208035 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ndt75" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.208060 5033 scope.go:117] "RemoveContainer" containerID="b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.208037 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndt75" event={"ID":"fbab4cba-2bbc-4434-974c-1b1cfd25f26d","Type":"ContainerDied","Data":"bee298457c21b8d9db75c417ba5ee82459b0fcb8fa03927cf6ef163343f3bd32"} Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.237965 5033 scope.go:117] "RemoveContainer" containerID="052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.246302 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ndt75"] Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.256573 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ndt75"] Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.261463 5033 scope.go:117] "RemoveContainer" containerID="1404d5089b3afed63825ff90b705fd568b86d76c0635db2181b2d4136bf46173" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.285769 5033 scope.go:117] "RemoveContainer" 
containerID="b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a" Feb 26 16:25:50 crc kubenswrapper[5033]: E0226 16:25:50.286269 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a\": container with ID starting with b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a not found: ID does not exist" containerID="b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.286315 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a"} err="failed to get container status \"b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a\": rpc error: code = NotFound desc = could not find container \"b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a\": container with ID starting with b180942ec3428723e53145b12b88a1b9375cc76307de636154323f686b05d09a not found: ID does not exist" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.286345 5033 scope.go:117] "RemoveContainer" containerID="052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5" Feb 26 16:25:50 crc kubenswrapper[5033]: E0226 16:25:50.286760 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5\": container with ID starting with 052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5 not found: ID does not exist" containerID="052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.286779 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5"} err="failed to get container status \"052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5\": rpc error: code = NotFound desc = could not find container \"052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5\": container with ID starting with 052c684d52f3c911c0e83908cbef3120dac25ecf36792de9f880f879675da6a5 not found: ID does not exist" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.286793 5033 scope.go:117] "RemoveContainer" containerID="1404d5089b3afed63825ff90b705fd568b86d76c0635db2181b2d4136bf46173" Feb 26 16:25:50 crc kubenswrapper[5033]: E0226 16:25:50.287138 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1404d5089b3afed63825ff90b705fd568b86d76c0635db2181b2d4136bf46173\": container with ID starting with 1404d5089b3afed63825ff90b705fd568b86d76c0635db2181b2d4136bf46173 not found: ID does not exist" containerID="1404d5089b3afed63825ff90b705fd568b86d76c0635db2181b2d4136bf46173" Feb 26 16:25:50 crc kubenswrapper[5033]: I0226 16:25:50.287174 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1404d5089b3afed63825ff90b705fd568b86d76c0635db2181b2d4136bf46173"} err="failed to get container status \"1404d5089b3afed63825ff90b705fd568b86d76c0635db2181b2d4136bf46173\": rpc error: code = NotFound desc = could not find container \"1404d5089b3afed63825ff90b705fd568b86d76c0635db2181b2d4136bf46173\": container with ID starting with 1404d5089b3afed63825ff90b705fd568b86d76c0635db2181b2d4136bf46173 not found: ID does not exist" Feb 26 16:25:51 crc kubenswrapper[5033]: I0226 16:25:51.637501 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" path="/var/lib/kubelet/pods/fbab4cba-2bbc-4434-974c-1b1cfd25f26d/volumes" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 
16:26:00.135655 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535386-4r2m6"] Feb 26 16:26:00 crc kubenswrapper[5033]: E0226 16:26:00.136733 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" containerName="registry-server" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.136750 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" containerName="registry-server" Feb 26 16:26:00 crc kubenswrapper[5033]: E0226 16:26:00.136764 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" containerName="extract-utilities" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.136772 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" containerName="extract-utilities" Feb 26 16:26:00 crc kubenswrapper[5033]: E0226 16:26:00.136785 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" containerName="extract-content" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.136793 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" containerName="extract-content" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.136966 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbab4cba-2bbc-4434-974c-1b1cfd25f26d" containerName="registry-server" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.137550 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535386-4r2m6" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.139708 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.140717 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.141420 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.148081 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535386-4r2m6"] Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.212076 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m2ks\" (UniqueName: \"kubernetes.io/projected/d225af9c-75fd-4686-83e2-91ea2fccc3ff-kube-api-access-6m2ks\") pod \"auto-csr-approver-29535386-4r2m6\" (UID: \"d225af9c-75fd-4686-83e2-91ea2fccc3ff\") " pod="openshift-infra/auto-csr-approver-29535386-4r2m6" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.313529 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m2ks\" (UniqueName: \"kubernetes.io/projected/d225af9c-75fd-4686-83e2-91ea2fccc3ff-kube-api-access-6m2ks\") pod \"auto-csr-approver-29535386-4r2m6\" (UID: \"d225af9c-75fd-4686-83e2-91ea2fccc3ff\") " pod="openshift-infra/auto-csr-approver-29535386-4r2m6" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.332961 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m2ks\" (UniqueName: \"kubernetes.io/projected/d225af9c-75fd-4686-83e2-91ea2fccc3ff-kube-api-access-6m2ks\") pod \"auto-csr-approver-29535386-4r2m6\" (UID: \"d225af9c-75fd-4686-83e2-91ea2fccc3ff\") " 
pod="openshift-infra/auto-csr-approver-29535386-4r2m6" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.457110 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535386-4r2m6" Feb 26 16:26:00 crc kubenswrapper[5033]: I0226 16:26:00.888211 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535386-4r2m6"] Feb 26 16:26:01 crc kubenswrapper[5033]: I0226 16:26:01.284669 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535386-4r2m6" event={"ID":"d225af9c-75fd-4686-83e2-91ea2fccc3ff","Type":"ContainerStarted","Data":"6df864ff2ffeffda45892f97e63987912a9b710074b068d36409aa613d7f5d94"} Feb 26 16:26:03 crc kubenswrapper[5033]: I0226 16:26:03.302314 5033 generic.go:334] "Generic (PLEG): container finished" podID="d225af9c-75fd-4686-83e2-91ea2fccc3ff" containerID="37b682c7ede28145719bce76fb9de2b1641633647448add9bed57b5f908e5a36" exitCode=0 Feb 26 16:26:03 crc kubenswrapper[5033]: I0226 16:26:03.302363 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535386-4r2m6" event={"ID":"d225af9c-75fd-4686-83e2-91ea2fccc3ff","Type":"ContainerDied","Data":"37b682c7ede28145719bce76fb9de2b1641633647448add9bed57b5f908e5a36"} Feb 26 16:26:04 crc kubenswrapper[5033]: I0226 16:26:04.664968 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535386-4r2m6" Feb 26 16:26:04 crc kubenswrapper[5033]: I0226 16:26:04.785241 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m2ks\" (UniqueName: \"kubernetes.io/projected/d225af9c-75fd-4686-83e2-91ea2fccc3ff-kube-api-access-6m2ks\") pod \"d225af9c-75fd-4686-83e2-91ea2fccc3ff\" (UID: \"d225af9c-75fd-4686-83e2-91ea2fccc3ff\") " Feb 26 16:26:04 crc kubenswrapper[5033]: I0226 16:26:04.790953 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d225af9c-75fd-4686-83e2-91ea2fccc3ff-kube-api-access-6m2ks" (OuterVolumeSpecName: "kube-api-access-6m2ks") pod "d225af9c-75fd-4686-83e2-91ea2fccc3ff" (UID: "d225af9c-75fd-4686-83e2-91ea2fccc3ff"). InnerVolumeSpecName "kube-api-access-6m2ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:26:04 crc kubenswrapper[5033]: I0226 16:26:04.887249 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m2ks\" (UniqueName: \"kubernetes.io/projected/d225af9c-75fd-4686-83e2-91ea2fccc3ff-kube-api-access-6m2ks\") on node \"crc\" DevicePath \"\"" Feb 26 16:26:05 crc kubenswrapper[5033]: I0226 16:26:05.317814 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535386-4r2m6" event={"ID":"d225af9c-75fd-4686-83e2-91ea2fccc3ff","Type":"ContainerDied","Data":"6df864ff2ffeffda45892f97e63987912a9b710074b068d36409aa613d7f5d94"} Feb 26 16:26:05 crc kubenswrapper[5033]: I0226 16:26:05.318089 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6df864ff2ffeffda45892f97e63987912a9b710074b068d36409aa613d7f5d94" Feb 26 16:26:05 crc kubenswrapper[5033]: I0226 16:26:05.317856 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535386-4r2m6" Feb 26 16:26:05 crc kubenswrapper[5033]: I0226 16:26:05.731858 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535380-qgfd7"] Feb 26 16:26:05 crc kubenswrapper[5033]: I0226 16:26:05.738269 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535380-qgfd7"] Feb 26 16:26:07 crc kubenswrapper[5033]: I0226 16:26:07.640224 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace6d11c-b8ba-4fe0-8c0d-c742a60742a9" path="/var/lib/kubelet/pods/ace6d11c-b8ba-4fe0-8c0d-c742a60742a9/volumes" Feb 26 16:26:30 crc kubenswrapper[5033]: I0226 16:26:30.382055 5033 scope.go:117] "RemoveContainer" containerID="eabb49bf9465a81eaaacd694b2a443b6fdc016f1a65d840fee28bfa2984b49f0" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.461958 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j6ccq"] Feb 26 16:26:31 crc kubenswrapper[5033]: E0226 16:26:31.462594 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d225af9c-75fd-4686-83e2-91ea2fccc3ff" containerName="oc" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.462611 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d225af9c-75fd-4686-83e2-91ea2fccc3ff" containerName="oc" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.462742 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d225af9c-75fd-4686-83e2-91ea2fccc3ff" containerName="oc" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.468076 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.469890 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6ccq"] Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.570495 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-utilities\") pod \"redhat-operators-j6ccq\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.570901 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-catalog-content\") pod \"redhat-operators-j6ccq\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.571199 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh27v\" (UniqueName: \"kubernetes.io/projected/8aef6251-9e42-405d-b308-b96155298114-kube-api-access-lh27v\") pod \"redhat-operators-j6ccq\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.672379 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-utilities\") pod \"redhat-operators-j6ccq\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.672476 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-catalog-content\") pod \"redhat-operators-j6ccq\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.672545 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh27v\" (UniqueName: \"kubernetes.io/projected/8aef6251-9e42-405d-b308-b96155298114-kube-api-access-lh27v\") pod \"redhat-operators-j6ccq\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.673206 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-catalog-content\") pod \"redhat-operators-j6ccq\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.673418 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-utilities\") pod \"redhat-operators-j6ccq\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.704370 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh27v\" (UniqueName: \"kubernetes.io/projected/8aef6251-9e42-405d-b308-b96155298114-kube-api-access-lh27v\") pod \"redhat-operators-j6ccq\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:31 crc kubenswrapper[5033]: I0226 16:26:31.789709 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:32 crc kubenswrapper[5033]: I0226 16:26:32.268949 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6ccq"] Feb 26 16:26:32 crc kubenswrapper[5033]: I0226 16:26:32.505932 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6ccq" event={"ID":"8aef6251-9e42-405d-b308-b96155298114","Type":"ContainerStarted","Data":"bfd92c7859c3145b6ae688c8703701b9b22517176682b34f350fa08e4ca6e029"} Feb 26 16:26:33 crc kubenswrapper[5033]: I0226 16:26:33.513590 5033 generic.go:334] "Generic (PLEG): container finished" podID="8aef6251-9e42-405d-b308-b96155298114" containerID="89d39f87c56e8ba09fbed0def0aef9211ab6f94c14a385c053fdc44b0f39e365" exitCode=0 Feb 26 16:26:33 crc kubenswrapper[5033]: I0226 16:26:33.513650 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6ccq" event={"ID":"8aef6251-9e42-405d-b308-b96155298114","Type":"ContainerDied","Data":"89d39f87c56e8ba09fbed0def0aef9211ab6f94c14a385c053fdc44b0f39e365"} Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.651946 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-44x92"] Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.655674 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.663777 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44x92"] Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.828592 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-utilities\") pod \"redhat-marketplace-44x92\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.828648 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-catalog-content\") pod \"redhat-marketplace-44x92\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.828736 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwr96\" (UniqueName: \"kubernetes.io/projected/ff00de95-024b-4770-8c61-10fa4e3139bf-kube-api-access-fwr96\") pod \"redhat-marketplace-44x92\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.930400 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwr96\" (UniqueName: \"kubernetes.io/projected/ff00de95-024b-4770-8c61-10fa4e3139bf-kube-api-access-fwr96\") pod \"redhat-marketplace-44x92\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.930504 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-utilities\") pod \"redhat-marketplace-44x92\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.930533 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-catalog-content\") pod \"redhat-marketplace-44x92\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.930958 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-catalog-content\") pod \"redhat-marketplace-44x92\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.931511 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-utilities\") pod \"redhat-marketplace-44x92\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.951550 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwr96\" (UniqueName: \"kubernetes.io/projected/ff00de95-024b-4770-8c61-10fa4e3139bf-kube-api-access-fwr96\") pod \"redhat-marketplace-44x92\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:34 crc kubenswrapper[5033]: I0226 16:26:34.993311 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:35 crc kubenswrapper[5033]: I0226 16:26:35.447852 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44x92"] Feb 26 16:26:35 crc kubenswrapper[5033]: I0226 16:26:35.528176 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44x92" event={"ID":"ff00de95-024b-4770-8c61-10fa4e3139bf","Type":"ContainerStarted","Data":"bef8fa512af46e35e5a151471e2f337dfb0a6b6dec0a27c8bc5bad31ae813540"} Feb 26 16:26:35 crc kubenswrapper[5033]: I0226 16:26:35.531983 5033 generic.go:334] "Generic (PLEG): container finished" podID="8aef6251-9e42-405d-b308-b96155298114" containerID="0e6bb8503ea2926d7d8da0efed8e474b97b8e6ce8a2282056ed81829d36ef4ef" exitCode=0 Feb 26 16:26:35 crc kubenswrapper[5033]: I0226 16:26:35.532025 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6ccq" event={"ID":"8aef6251-9e42-405d-b308-b96155298114","Type":"ContainerDied","Data":"0e6bb8503ea2926d7d8da0efed8e474b97b8e6ce8a2282056ed81829d36ef4ef"} Feb 26 16:26:36 crc kubenswrapper[5033]: I0226 16:26:36.541306 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6ccq" event={"ID":"8aef6251-9e42-405d-b308-b96155298114","Type":"ContainerStarted","Data":"d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b"} Feb 26 16:26:36 crc kubenswrapper[5033]: I0226 16:26:36.542786 5033 generic.go:334] "Generic (PLEG): container finished" podID="ff00de95-024b-4770-8c61-10fa4e3139bf" containerID="baa9017f629f2f14606822456b0f495d643b973f275ead336bf6da24944c974b" exitCode=0 Feb 26 16:26:36 crc kubenswrapper[5033]: I0226 16:26:36.542825 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44x92" 
event={"ID":"ff00de95-024b-4770-8c61-10fa4e3139bf","Type":"ContainerDied","Data":"baa9017f629f2f14606822456b0f495d643b973f275ead336bf6da24944c974b"} Feb 26 16:26:36 crc kubenswrapper[5033]: I0226 16:26:36.565693 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j6ccq" podStartSLOduration=3.11672627 podStartE2EDuration="5.565676048s" podCreationTimestamp="2026-02-26 16:26:31 +0000 UTC" firstStartedPulling="2026-02-26 16:26:33.515434534 +0000 UTC m=+2720.728033184" lastFinishedPulling="2026-02-26 16:26:35.964384312 +0000 UTC m=+2723.176982962" observedRunningTime="2026-02-26 16:26:36.558431203 +0000 UTC m=+2723.771029863" watchObservedRunningTime="2026-02-26 16:26:36.565676048 +0000 UTC m=+2723.778274698" Feb 26 16:26:38 crc kubenswrapper[5033]: I0226 16:26:38.652470 5033 generic.go:334] "Generic (PLEG): container finished" podID="ff00de95-024b-4770-8c61-10fa4e3139bf" containerID="2e5d37499ca16567d2937580443d62fdd9488489565e6122e158efdc01e08d72" exitCode=0 Feb 26 16:26:38 crc kubenswrapper[5033]: I0226 16:26:38.652557 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44x92" event={"ID":"ff00de95-024b-4770-8c61-10fa4e3139bf","Type":"ContainerDied","Data":"2e5d37499ca16567d2937580443d62fdd9488489565e6122e158efdc01e08d72"} Feb 26 16:26:39 crc kubenswrapper[5033]: I0226 16:26:39.660782 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44x92" event={"ID":"ff00de95-024b-4770-8c61-10fa4e3139bf","Type":"ContainerStarted","Data":"3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19"} Feb 26 16:26:39 crc kubenswrapper[5033]: I0226 16:26:39.684599 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-44x92" podStartSLOduration=3.122353698 podStartE2EDuration="5.684576387s" podCreationTimestamp="2026-02-26 16:26:34 +0000 UTC" 
firstStartedPulling="2026-02-26 16:26:36.544318643 +0000 UTC m=+2723.756917293" lastFinishedPulling="2026-02-26 16:26:39.106541332 +0000 UTC m=+2726.319139982" observedRunningTime="2026-02-26 16:26:39.678205203 +0000 UTC m=+2726.890803853" watchObservedRunningTime="2026-02-26 16:26:39.684576387 +0000 UTC m=+2726.897175037" Feb 26 16:26:41 crc kubenswrapper[5033]: I0226 16:26:41.790760 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:41 crc kubenswrapper[5033]: I0226 16:26:41.792852 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:42 crc kubenswrapper[5033]: I0226 16:26:42.835230 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j6ccq" podUID="8aef6251-9e42-405d-b308-b96155298114" containerName="registry-server" probeResult="failure" output=< Feb 26 16:26:42 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Feb 26 16:26:42 crc kubenswrapper[5033]: > Feb 26 16:26:44 crc kubenswrapper[5033]: I0226 16:26:44.994377 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:44 crc kubenswrapper[5033]: I0226 16:26:44.994684 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:45 crc kubenswrapper[5033]: I0226 16:26:45.048753 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:45 crc kubenswrapper[5033]: I0226 16:26:45.748529 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:45 crc kubenswrapper[5033]: I0226 16:26:45.795677 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-44x92"] Feb 26 16:26:47 crc kubenswrapper[5033]: I0226 16:26:47.722785 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-44x92" podUID="ff00de95-024b-4770-8c61-10fa4e3139bf" containerName="registry-server" containerID="cri-o://3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19" gracePeriod=2 Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.113717 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.227367 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-catalog-content\") pod \"ff00de95-024b-4770-8c61-10fa4e3139bf\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.227415 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-utilities\") pod \"ff00de95-024b-4770-8c61-10fa4e3139bf\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.227463 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwr96\" (UniqueName: \"kubernetes.io/projected/ff00de95-024b-4770-8c61-10fa4e3139bf-kube-api-access-fwr96\") pod \"ff00de95-024b-4770-8c61-10fa4e3139bf\" (UID: \"ff00de95-024b-4770-8c61-10fa4e3139bf\") " Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.228392 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-utilities" (OuterVolumeSpecName: "utilities") pod "ff00de95-024b-4770-8c61-10fa4e3139bf" (UID: 
"ff00de95-024b-4770-8c61-10fa4e3139bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.232814 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff00de95-024b-4770-8c61-10fa4e3139bf-kube-api-access-fwr96" (OuterVolumeSpecName: "kube-api-access-fwr96") pod "ff00de95-024b-4770-8c61-10fa4e3139bf" (UID: "ff00de95-024b-4770-8c61-10fa4e3139bf"). InnerVolumeSpecName "kube-api-access-fwr96". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.255734 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff00de95-024b-4770-8c61-10fa4e3139bf" (UID: "ff00de95-024b-4770-8c61-10fa4e3139bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.329390 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.329425 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff00de95-024b-4770-8c61-10fa4e3139bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.329435 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwr96\" (UniqueName: \"kubernetes.io/projected/ff00de95-024b-4770-8c61-10fa4e3139bf-kube-api-access-fwr96\") on node \"crc\" DevicePath \"\"" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.732129 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="ff00de95-024b-4770-8c61-10fa4e3139bf" containerID="3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19" exitCode=0 Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.732170 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44x92" event={"ID":"ff00de95-024b-4770-8c61-10fa4e3139bf","Type":"ContainerDied","Data":"3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19"} Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.732196 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44x92" event={"ID":"ff00de95-024b-4770-8c61-10fa4e3139bf","Type":"ContainerDied","Data":"bef8fa512af46e35e5a151471e2f337dfb0a6b6dec0a27c8bc5bad31ae813540"} Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.732212 5033 scope.go:117] "RemoveContainer" containerID="3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.732345 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44x92" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.760041 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-44x92"] Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.766090 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-44x92"] Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.766441 5033 scope.go:117] "RemoveContainer" containerID="2e5d37499ca16567d2937580443d62fdd9488489565e6122e158efdc01e08d72" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.787470 5033 scope.go:117] "RemoveContainer" containerID="baa9017f629f2f14606822456b0f495d643b973f275ead336bf6da24944c974b" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.804934 5033 scope.go:117] "RemoveContainer" containerID="3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19" Feb 26 16:26:48 crc kubenswrapper[5033]: E0226 16:26:48.805533 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19\": container with ID starting with 3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19 not found: ID does not exist" containerID="3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.805577 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19"} err="failed to get container status \"3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19\": rpc error: code = NotFound desc = could not find container \"3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19\": container with ID starting with 3d5030a9d16ad8a01f8abad82de883db7b7a49cbfbc1322c84e7f1159f468d19 not found: 
ID does not exist" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.805603 5033 scope.go:117] "RemoveContainer" containerID="2e5d37499ca16567d2937580443d62fdd9488489565e6122e158efdc01e08d72" Feb 26 16:26:48 crc kubenswrapper[5033]: E0226 16:26:48.805891 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e5d37499ca16567d2937580443d62fdd9488489565e6122e158efdc01e08d72\": container with ID starting with 2e5d37499ca16567d2937580443d62fdd9488489565e6122e158efdc01e08d72 not found: ID does not exist" containerID="2e5d37499ca16567d2937580443d62fdd9488489565e6122e158efdc01e08d72" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.805937 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e5d37499ca16567d2937580443d62fdd9488489565e6122e158efdc01e08d72"} err="failed to get container status \"2e5d37499ca16567d2937580443d62fdd9488489565e6122e158efdc01e08d72\": rpc error: code = NotFound desc = could not find container \"2e5d37499ca16567d2937580443d62fdd9488489565e6122e158efdc01e08d72\": container with ID starting with 2e5d37499ca16567d2937580443d62fdd9488489565e6122e158efdc01e08d72 not found: ID does not exist" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.805963 5033 scope.go:117] "RemoveContainer" containerID="baa9017f629f2f14606822456b0f495d643b973f275ead336bf6da24944c974b" Feb 26 16:26:48 crc kubenswrapper[5033]: E0226 16:26:48.806308 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baa9017f629f2f14606822456b0f495d643b973f275ead336bf6da24944c974b\": container with ID starting with baa9017f629f2f14606822456b0f495d643b973f275ead336bf6da24944c974b not found: ID does not exist" containerID="baa9017f629f2f14606822456b0f495d643b973f275ead336bf6da24944c974b" Feb 26 16:26:48 crc kubenswrapper[5033]: I0226 16:26:48.806369 5033 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baa9017f629f2f14606822456b0f495d643b973f275ead336bf6da24944c974b"} err="failed to get container status \"baa9017f629f2f14606822456b0f495d643b973f275ead336bf6da24944c974b\": rpc error: code = NotFound desc = could not find container \"baa9017f629f2f14606822456b0f495d643b973f275ead336bf6da24944c974b\": container with ID starting with baa9017f629f2f14606822456b0f495d643b973f275ead336bf6da24944c974b not found: ID does not exist" Feb 26 16:26:49 crc kubenswrapper[5033]: I0226 16:26:49.638193 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff00de95-024b-4770-8c61-10fa4e3139bf" path="/var/lib/kubelet/pods/ff00de95-024b-4770-8c61-10fa4e3139bf/volumes" Feb 26 16:26:50 crc kubenswrapper[5033]: I0226 16:26:50.188772 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:26:50 crc kubenswrapper[5033]: I0226 16:26:50.189450 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:26:51 crc kubenswrapper[5033]: I0226 16:26:51.856200 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:51 crc kubenswrapper[5033]: I0226 16:26:51.907225 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:52 crc kubenswrapper[5033]: I0226 16:26:52.096542 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-j6ccq"] Feb 26 16:26:53 crc kubenswrapper[5033]: I0226 16:26:53.770211 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j6ccq" podUID="8aef6251-9e42-405d-b308-b96155298114" containerName="registry-server" containerID="cri-o://d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b" gracePeriod=2 Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.198169 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.331724 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-utilities\") pod \"8aef6251-9e42-405d-b308-b96155298114\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.331831 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh27v\" (UniqueName: \"kubernetes.io/projected/8aef6251-9e42-405d-b308-b96155298114-kube-api-access-lh27v\") pod \"8aef6251-9e42-405d-b308-b96155298114\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.331890 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-catalog-content\") pod \"8aef6251-9e42-405d-b308-b96155298114\" (UID: \"8aef6251-9e42-405d-b308-b96155298114\") " Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.333113 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-utilities" (OuterVolumeSpecName: "utilities") pod "8aef6251-9e42-405d-b308-b96155298114" (UID: 
"8aef6251-9e42-405d-b308-b96155298114"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.333213 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.339669 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aef6251-9e42-405d-b308-b96155298114-kube-api-access-lh27v" (OuterVolumeSpecName: "kube-api-access-lh27v") pod "8aef6251-9e42-405d-b308-b96155298114" (UID: "8aef6251-9e42-405d-b308-b96155298114"). InnerVolumeSpecName "kube-api-access-lh27v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.434524 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh27v\" (UniqueName: \"kubernetes.io/projected/8aef6251-9e42-405d-b308-b96155298114-kube-api-access-lh27v\") on node \"crc\" DevicePath \"\"" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.488013 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8aef6251-9e42-405d-b308-b96155298114" (UID: "8aef6251-9e42-405d-b308-b96155298114"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.537074 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aef6251-9e42-405d-b308-b96155298114-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.780893 5033 generic.go:334] "Generic (PLEG): container finished" podID="8aef6251-9e42-405d-b308-b96155298114" containerID="d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b" exitCode=0 Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.780932 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6ccq" event={"ID":"8aef6251-9e42-405d-b308-b96155298114","Type":"ContainerDied","Data":"d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b"} Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.780959 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6ccq" event={"ID":"8aef6251-9e42-405d-b308-b96155298114","Type":"ContainerDied","Data":"bfd92c7859c3145b6ae688c8703701b9b22517176682b34f350fa08e4ca6e029"} Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.780978 5033 scope.go:117] "RemoveContainer" containerID="d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.781092 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j6ccq" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.818576 5033 scope.go:117] "RemoveContainer" containerID="0e6bb8503ea2926d7d8da0efed8e474b97b8e6ce8a2282056ed81829d36ef4ef" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.844441 5033 scope.go:117] "RemoveContainer" containerID="89d39f87c56e8ba09fbed0def0aef9211ab6f94c14a385c053fdc44b0f39e365" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.857637 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6ccq"] Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.861783 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j6ccq"] Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.869586 5033 scope.go:117] "RemoveContainer" containerID="d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b" Feb 26 16:26:54 crc kubenswrapper[5033]: E0226 16:26:54.869952 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b\": container with ID starting with d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b not found: ID does not exist" containerID="d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.870073 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b"} err="failed to get container status \"d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b\": rpc error: code = NotFound desc = could not find container \"d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b\": container with ID starting with d41e8e3a88cd5408fa4a37a4cc66b25c48a7a860f4111631a467b89e251f5e7b not found: ID does 
not exist" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.870174 5033 scope.go:117] "RemoveContainer" containerID="0e6bb8503ea2926d7d8da0efed8e474b97b8e6ce8a2282056ed81829d36ef4ef" Feb 26 16:26:54 crc kubenswrapper[5033]: E0226 16:26:54.870600 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e6bb8503ea2926d7d8da0efed8e474b97b8e6ce8a2282056ed81829d36ef4ef\": container with ID starting with 0e6bb8503ea2926d7d8da0efed8e474b97b8e6ce8a2282056ed81829d36ef4ef not found: ID does not exist" containerID="0e6bb8503ea2926d7d8da0efed8e474b97b8e6ce8a2282056ed81829d36ef4ef" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.870655 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e6bb8503ea2926d7d8da0efed8e474b97b8e6ce8a2282056ed81829d36ef4ef"} err="failed to get container status \"0e6bb8503ea2926d7d8da0efed8e474b97b8e6ce8a2282056ed81829d36ef4ef\": rpc error: code = NotFound desc = could not find container \"0e6bb8503ea2926d7d8da0efed8e474b97b8e6ce8a2282056ed81829d36ef4ef\": container with ID starting with 0e6bb8503ea2926d7d8da0efed8e474b97b8e6ce8a2282056ed81829d36ef4ef not found: ID does not exist" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.870693 5033 scope.go:117] "RemoveContainer" containerID="89d39f87c56e8ba09fbed0def0aef9211ab6f94c14a385c053fdc44b0f39e365" Feb 26 16:26:54 crc kubenswrapper[5033]: E0226 16:26:54.870879 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d39f87c56e8ba09fbed0def0aef9211ab6f94c14a385c053fdc44b0f39e365\": container with ID starting with 89d39f87c56e8ba09fbed0def0aef9211ab6f94c14a385c053fdc44b0f39e365 not found: ID does not exist" containerID="89d39f87c56e8ba09fbed0def0aef9211ab6f94c14a385c053fdc44b0f39e365" Feb 26 16:26:54 crc kubenswrapper[5033]: I0226 16:26:54.870979 5033 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d39f87c56e8ba09fbed0def0aef9211ab6f94c14a385c053fdc44b0f39e365"} err="failed to get container status \"89d39f87c56e8ba09fbed0def0aef9211ab6f94c14a385c053fdc44b0f39e365\": rpc error: code = NotFound desc = could not find container \"89d39f87c56e8ba09fbed0def0aef9211ab6f94c14a385c053fdc44b0f39e365\": container with ID starting with 89d39f87c56e8ba09fbed0def0aef9211ab6f94c14a385c053fdc44b0f39e365 not found: ID does not exist" Feb 26 16:26:55 crc kubenswrapper[5033]: I0226 16:26:55.637060 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aef6251-9e42-405d-b308-b96155298114" path="/var/lib/kubelet/pods/8aef6251-9e42-405d-b308-b96155298114/volumes" Feb 26 16:27:20 crc kubenswrapper[5033]: I0226 16:27:20.189070 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:27:20 crc kubenswrapper[5033]: I0226 16:27:20.189529 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:27:50 crc kubenswrapper[5033]: I0226 16:27:50.189561 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:27:50 crc kubenswrapper[5033]: I0226 16:27:50.190168 5033 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:27:50 crc kubenswrapper[5033]: I0226 16:27:50.190239 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 16:27:50 crc kubenswrapper[5033]: I0226 16:27:50.191241 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dad6881e86f5b0bfddfa2805c22679ea01f0f421cdafb8e298b9c2c7e12fdd3c"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:27:50 crc kubenswrapper[5033]: I0226 16:27:50.191407 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://dad6881e86f5b0bfddfa2805c22679ea01f0f421cdafb8e298b9c2c7e12fdd3c" gracePeriod=600 Feb 26 16:27:51 crc kubenswrapper[5033]: I0226 16:27:51.243339 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="dad6881e86f5b0bfddfa2805c22679ea01f0f421cdafb8e298b9c2c7e12fdd3c" exitCode=0 Feb 26 16:27:51 crc kubenswrapper[5033]: I0226 16:27:51.243435 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"dad6881e86f5b0bfddfa2805c22679ea01f0f421cdafb8e298b9c2c7e12fdd3c"} Feb 26 16:27:51 crc kubenswrapper[5033]: I0226 16:27:51.244123 5033 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"} Feb 26 16:27:51 crc kubenswrapper[5033]: I0226 16:27:51.244159 5033 scope.go:117] "RemoveContainer" containerID="6f23a1b39a5a39caa49edf03ffc0e53481d6bcd5e66ec246a9c8ef1e64073a67" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.138801 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535388-c8c5f"] Feb 26 16:28:00 crc kubenswrapper[5033]: E0226 16:28:00.141568 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff00de95-024b-4770-8c61-10fa4e3139bf" containerName="extract-utilities" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.141592 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff00de95-024b-4770-8c61-10fa4e3139bf" containerName="extract-utilities" Feb 26 16:28:00 crc kubenswrapper[5033]: E0226 16:28:00.141609 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aef6251-9e42-405d-b308-b96155298114" containerName="registry-server" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.141614 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aef6251-9e42-405d-b308-b96155298114" containerName="registry-server" Feb 26 16:28:00 crc kubenswrapper[5033]: E0226 16:28:00.141622 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff00de95-024b-4770-8c61-10fa4e3139bf" containerName="registry-server" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.141627 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff00de95-024b-4770-8c61-10fa4e3139bf" containerName="registry-server" Feb 26 16:28:00 crc kubenswrapper[5033]: E0226 16:28:00.141640 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aef6251-9e42-405d-b308-b96155298114" containerName="extract-utilities" Feb 
26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.141646 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aef6251-9e42-405d-b308-b96155298114" containerName="extract-utilities" Feb 26 16:28:00 crc kubenswrapper[5033]: E0226 16:28:00.141665 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff00de95-024b-4770-8c61-10fa4e3139bf" containerName="extract-content" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.141671 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff00de95-024b-4770-8c61-10fa4e3139bf" containerName="extract-content" Feb 26 16:28:00 crc kubenswrapper[5033]: E0226 16:28:00.141682 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aef6251-9e42-405d-b308-b96155298114" containerName="extract-content" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.141687 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aef6251-9e42-405d-b308-b96155298114" containerName="extract-content" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.142535 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aef6251-9e42-405d-b308-b96155298114" containerName="registry-server" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.142552 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff00de95-024b-4770-8c61-10fa4e3139bf" containerName="registry-server" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.143040 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535388-c8c5f" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.145348 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535388-c8c5f"] Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.146248 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.146442 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.146565 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.229759 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9g8p\" (UniqueName: \"kubernetes.io/projected/7343fddd-fec4-4666-aceb-c14252365de2-kube-api-access-n9g8p\") pod \"auto-csr-approver-29535388-c8c5f\" (UID: \"7343fddd-fec4-4666-aceb-c14252365de2\") " pod="openshift-infra/auto-csr-approver-29535388-c8c5f" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.331246 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9g8p\" (UniqueName: \"kubernetes.io/projected/7343fddd-fec4-4666-aceb-c14252365de2-kube-api-access-n9g8p\") pod \"auto-csr-approver-29535388-c8c5f\" (UID: \"7343fddd-fec4-4666-aceb-c14252365de2\") " pod="openshift-infra/auto-csr-approver-29535388-c8c5f" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.352603 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9g8p\" (UniqueName: \"kubernetes.io/projected/7343fddd-fec4-4666-aceb-c14252365de2-kube-api-access-n9g8p\") pod \"auto-csr-approver-29535388-c8c5f\" (UID: \"7343fddd-fec4-4666-aceb-c14252365de2\") " 
pod="openshift-infra/auto-csr-approver-29535388-c8c5f" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.463397 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535388-c8c5f" Feb 26 16:28:00 crc kubenswrapper[5033]: I0226 16:28:00.887004 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535388-c8c5f"] Feb 26 16:28:00 crc kubenswrapper[5033]: W0226 16:28:00.895827 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7343fddd_fec4_4666_aceb_c14252365de2.slice/crio-9ed0298791e434af820088c6e4839d059efa4315a0964ca9db4aa08a0f63e95e WatchSource:0}: Error finding container 9ed0298791e434af820088c6e4839d059efa4315a0964ca9db4aa08a0f63e95e: Status 404 returned error can't find the container with id 9ed0298791e434af820088c6e4839d059efa4315a0964ca9db4aa08a0f63e95e Feb 26 16:28:01 crc kubenswrapper[5033]: I0226 16:28:01.330149 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535388-c8c5f" event={"ID":"7343fddd-fec4-4666-aceb-c14252365de2","Type":"ContainerStarted","Data":"9ed0298791e434af820088c6e4839d059efa4315a0964ca9db4aa08a0f63e95e"} Feb 26 16:28:02 crc kubenswrapper[5033]: I0226 16:28:02.338037 5033 generic.go:334] "Generic (PLEG): container finished" podID="7343fddd-fec4-4666-aceb-c14252365de2" containerID="bbf4bf36280775769cacd8867246f4f16ff71ed6611724ea29d34d12c30bd81a" exitCode=0 Feb 26 16:28:02 crc kubenswrapper[5033]: I0226 16:28:02.338141 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535388-c8c5f" event={"ID":"7343fddd-fec4-4666-aceb-c14252365de2","Type":"ContainerDied","Data":"bbf4bf36280775769cacd8867246f4f16ff71ed6611724ea29d34d12c30bd81a"} Feb 26 16:28:03 crc kubenswrapper[5033]: I0226 16:28:03.642471 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535388-c8c5f" Feb 26 16:28:03 crc kubenswrapper[5033]: I0226 16:28:03.779977 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9g8p\" (UniqueName: \"kubernetes.io/projected/7343fddd-fec4-4666-aceb-c14252365de2-kube-api-access-n9g8p\") pod \"7343fddd-fec4-4666-aceb-c14252365de2\" (UID: \"7343fddd-fec4-4666-aceb-c14252365de2\") " Feb 26 16:28:03 crc kubenswrapper[5033]: I0226 16:28:03.785976 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7343fddd-fec4-4666-aceb-c14252365de2-kube-api-access-n9g8p" (OuterVolumeSpecName: "kube-api-access-n9g8p") pod "7343fddd-fec4-4666-aceb-c14252365de2" (UID: "7343fddd-fec4-4666-aceb-c14252365de2"). InnerVolumeSpecName "kube-api-access-n9g8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:28:03 crc kubenswrapper[5033]: I0226 16:28:03.881356 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9g8p\" (UniqueName: \"kubernetes.io/projected/7343fddd-fec4-4666-aceb-c14252365de2-kube-api-access-n9g8p\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:04 crc kubenswrapper[5033]: I0226 16:28:04.355336 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535388-c8c5f" event={"ID":"7343fddd-fec4-4666-aceb-c14252365de2","Type":"ContainerDied","Data":"9ed0298791e434af820088c6e4839d059efa4315a0964ca9db4aa08a0f63e95e"} Feb 26 16:28:04 crc kubenswrapper[5033]: I0226 16:28:04.355377 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ed0298791e434af820088c6e4839d059efa4315a0964ca9db4aa08a0f63e95e" Feb 26 16:28:04 crc kubenswrapper[5033]: I0226 16:28:04.355424 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535388-c8c5f" Feb 26 16:28:04 crc kubenswrapper[5033]: I0226 16:28:04.711856 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535382-v9hkm"] Feb 26 16:28:04 crc kubenswrapper[5033]: I0226 16:28:04.719441 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535382-v9hkm"] Feb 26 16:28:05 crc kubenswrapper[5033]: I0226 16:28:05.638714 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90f11bd3-65fa-4240-9f2f-b9a530a62bb3" path="/var/lib/kubelet/pods/90f11bd3-65fa-4240-9f2f-b9a530a62bb3/volumes" Feb 26 16:28:30 crc kubenswrapper[5033]: I0226 16:28:30.484909 5033 scope.go:117] "RemoveContainer" containerID="04c8ed43e5413747c0601dc86510b0bf7df1904a69281e0af47aa7da1a1f8be4" Feb 26 16:29:50 crc kubenswrapper[5033]: I0226 16:29:50.188724 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:29:50 crc kubenswrapper[5033]: I0226 16:29:50.189239 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.161562 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"] Feb 26 16:30:00 crc kubenswrapper[5033]: E0226 16:30:00.162660 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7343fddd-fec4-4666-aceb-c14252365de2" containerName="oc" Feb 26 
16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.162686 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7343fddd-fec4-4666-aceb-c14252365de2" containerName="oc"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.162876 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7343fddd-fec4-4666-aceb-c14252365de2" containerName="oc"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.163585 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.168361 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.173120 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.176099 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"]
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.227334 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljj56\" (UniqueName: \"kubernetes.io/projected/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-kube-api-access-ljj56\") pod \"collect-profiles-29535390-fg8wf\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.227430 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-config-volume\") pod \"collect-profiles-29535390-fg8wf\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.227494 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-secret-volume\") pod \"collect-profiles-29535390-fg8wf\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.253113 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535390-8pbnt"]
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.254001 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535390-8pbnt"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.255878 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.256665 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.257117 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.266032 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535390-8pbnt"]
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.329105 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljj56\" (UniqueName: \"kubernetes.io/projected/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-kube-api-access-ljj56\") pod \"collect-profiles-29535390-fg8wf\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.329201 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xlbh\" (UniqueName: \"kubernetes.io/projected/14df2479-2a8f-44c8-ab6e-0032e08eb3be-kube-api-access-2xlbh\") pod \"auto-csr-approver-29535390-8pbnt\" (UID: \"14df2479-2a8f-44c8-ab6e-0032e08eb3be\") " pod="openshift-infra/auto-csr-approver-29535390-8pbnt"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.329240 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-config-volume\") pod \"collect-profiles-29535390-fg8wf\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.329316 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-secret-volume\") pod \"collect-profiles-29535390-fg8wf\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.330194 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-config-volume\") pod \"collect-profiles-29535390-fg8wf\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.334440 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-secret-volume\") pod \"collect-profiles-29535390-fg8wf\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.346208 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljj56\" (UniqueName: \"kubernetes.io/projected/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-kube-api-access-ljj56\") pod \"collect-profiles-29535390-fg8wf\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.430129 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xlbh\" (UniqueName: \"kubernetes.io/projected/14df2479-2a8f-44c8-ab6e-0032e08eb3be-kube-api-access-2xlbh\") pod \"auto-csr-approver-29535390-8pbnt\" (UID: \"14df2479-2a8f-44c8-ab6e-0032e08eb3be\") " pod="openshift-infra/auto-csr-approver-29535390-8pbnt"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.447052 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xlbh\" (UniqueName: \"kubernetes.io/projected/14df2479-2a8f-44c8-ab6e-0032e08eb3be-kube-api-access-2xlbh\") pod \"auto-csr-approver-29535390-8pbnt\" (UID: \"14df2479-2a8f-44c8-ab6e-0032e08eb3be\") " pod="openshift-infra/auto-csr-approver-29535390-8pbnt"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.485649 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.571154 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535390-8pbnt"
Feb 26 16:30:00 crc kubenswrapper[5033]: I0226 16:30:00.918321 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"]
Feb 26 16:30:01 crc kubenswrapper[5033]: I0226 16:30:01.010094 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535390-8pbnt"]
Feb 26 16:30:01 crc kubenswrapper[5033]: W0226 16:30:01.017629 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14df2479_2a8f_44c8_ab6e_0032e08eb3be.slice/crio-773a85ecffd84d611da407e11043288e364deb6c60e1845549daa8d286430787 WatchSource:0}: Error finding container 773a85ecffd84d611da407e11043288e364deb6c60e1845549daa8d286430787: Status 404 returned error can't find the container with id 773a85ecffd84d611da407e11043288e364deb6c60e1845549daa8d286430787
Feb 26 16:30:01 crc kubenswrapper[5033]: I0226 16:30:01.020540 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 16:30:01 crc kubenswrapper[5033]: I0226 16:30:01.637864 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535390-8pbnt" event={"ID":"14df2479-2a8f-44c8-ab6e-0032e08eb3be","Type":"ContainerStarted","Data":"773a85ecffd84d611da407e11043288e364deb6c60e1845549daa8d286430787"}
Feb 26 16:30:01 crc kubenswrapper[5033]: I0226 16:30:01.638176 5033 generic.go:334] "Generic (PLEG): container finished" podID="3bda0c14-3fc0-4f66-a3b1-f9359e21a587" containerID="735649ec8dc2e420d393f9746aaaca6c864f7bca67aebbbd2f5931365f50469d" exitCode=0
Feb 26 16:30:01 crc kubenswrapper[5033]: I0226 16:30:01.638191 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf" event={"ID":"3bda0c14-3fc0-4f66-a3b1-f9359e21a587","Type":"ContainerDied","Data":"735649ec8dc2e420d393f9746aaaca6c864f7bca67aebbbd2f5931365f50469d"}
Feb 26 16:30:01 crc kubenswrapper[5033]: I0226 16:30:01.638236 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf" event={"ID":"3bda0c14-3fc0-4f66-a3b1-f9359e21a587","Type":"ContainerStarted","Data":"14a65edf2336f2974e679166cd6d13f53dea0f0349545106e6361fc895d7bf67"}
Feb 26 16:30:02 crc kubenswrapper[5033]: I0226 16:30:02.981047 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.064709 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljj56\" (UniqueName: \"kubernetes.io/projected/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-kube-api-access-ljj56\") pod \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") "
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.064750 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-config-volume\") pod \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") "
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.064785 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-secret-volume\") pod \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\" (UID: \"3bda0c14-3fc0-4f66-a3b1-f9359e21a587\") "
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.066100 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-config-volume" (OuterVolumeSpecName: "config-volume") pod "3bda0c14-3fc0-4f66-a3b1-f9359e21a587" (UID: "3bda0c14-3fc0-4f66-a3b1-f9359e21a587"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.069839 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3bda0c14-3fc0-4f66-a3b1-f9359e21a587" (UID: "3bda0c14-3fc0-4f66-a3b1-f9359e21a587"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.069946 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-kube-api-access-ljj56" (OuterVolumeSpecName: "kube-api-access-ljj56") pod "3bda0c14-3fc0-4f66-a3b1-f9359e21a587" (UID: "3bda0c14-3fc0-4f66-a3b1-f9359e21a587"). InnerVolumeSpecName "kube-api-access-ljj56". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.166544 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljj56\" (UniqueName: \"kubernetes.io/projected/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-kube-api-access-ljj56\") on node \"crc\" DevicePath \"\""
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.166582 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-config-volume\") on node \"crc\" DevicePath \"\""
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.166594 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bda0c14-3fc0-4f66-a3b1-f9359e21a587-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.661419 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf" event={"ID":"3bda0c14-3fc0-4f66-a3b1-f9359e21a587","Type":"ContainerDied","Data":"14a65edf2336f2974e679166cd6d13f53dea0f0349545106e6361fc895d7bf67"}
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.661468 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14a65edf2336f2974e679166cd6d13f53dea0f0349545106e6361fc895d7bf67"
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.661476 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.663543 5033 generic.go:334] "Generic (PLEG): container finished" podID="14df2479-2a8f-44c8-ab6e-0032e08eb3be" containerID="daf450f440423ace4bf71a56d9b82a792fe08e3d87c1b2bedc498c1949145141" exitCode=0
Feb 26 16:30:03 crc kubenswrapper[5033]: I0226 16:30:03.663603 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535390-8pbnt" event={"ID":"14df2479-2a8f-44c8-ab6e-0032e08eb3be","Type":"ContainerDied","Data":"daf450f440423ace4bf71a56d9b82a792fe08e3d87c1b2bedc498c1949145141"}
Feb 26 16:30:04 crc kubenswrapper[5033]: I0226 16:30:04.046555 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr"]
Feb 26 16:30:04 crc kubenswrapper[5033]: I0226 16:30:04.051019 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535345-rndzr"]
Feb 26 16:30:04 crc kubenswrapper[5033]: I0226 16:30:04.914480 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535390-8pbnt"
Feb 26 16:30:04 crc kubenswrapper[5033]: I0226 16:30:04.990229 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xlbh\" (UniqueName: \"kubernetes.io/projected/14df2479-2a8f-44c8-ab6e-0032e08eb3be-kube-api-access-2xlbh\") pod \"14df2479-2a8f-44c8-ab6e-0032e08eb3be\" (UID: \"14df2479-2a8f-44c8-ab6e-0032e08eb3be\") "
Feb 26 16:30:04 crc kubenswrapper[5033]: I0226 16:30:04.996507 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14df2479-2a8f-44c8-ab6e-0032e08eb3be-kube-api-access-2xlbh" (OuterVolumeSpecName: "kube-api-access-2xlbh") pod "14df2479-2a8f-44c8-ab6e-0032e08eb3be" (UID: "14df2479-2a8f-44c8-ab6e-0032e08eb3be"). InnerVolumeSpecName "kube-api-access-2xlbh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:30:05 crc kubenswrapper[5033]: I0226 16:30:05.091313 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xlbh\" (UniqueName: \"kubernetes.io/projected/14df2479-2a8f-44c8-ab6e-0032e08eb3be-kube-api-access-2xlbh\") on node \"crc\" DevicePath \"\""
Feb 26 16:30:05 crc kubenswrapper[5033]: I0226 16:30:05.637688 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9352ff50-4ff9-481c-99fc-ec47324bab97" path="/var/lib/kubelet/pods/9352ff50-4ff9-481c-99fc-ec47324bab97/volumes"
Feb 26 16:30:05 crc kubenswrapper[5033]: I0226 16:30:05.680131 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535390-8pbnt" event={"ID":"14df2479-2a8f-44c8-ab6e-0032e08eb3be","Type":"ContainerDied","Data":"773a85ecffd84d611da407e11043288e364deb6c60e1845549daa8d286430787"}
Feb 26 16:30:05 crc kubenswrapper[5033]: I0226 16:30:05.680171 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="773a85ecffd84d611da407e11043288e364deb6c60e1845549daa8d286430787"
Feb 26 16:30:05 crc kubenswrapper[5033]: I0226 16:30:05.680179 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535390-8pbnt"
Feb 26 16:30:05 crc kubenswrapper[5033]: I0226 16:30:05.977289 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535384-bcxls"]
Feb 26 16:30:05 crc kubenswrapper[5033]: I0226 16:30:05.983248 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535384-bcxls"]
Feb 26 16:30:07 crc kubenswrapper[5033]: I0226 16:30:07.635962 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9f904ab-f4ed-40ab-9927-294d445b5d67" path="/var/lib/kubelet/pods/c9f904ab-f4ed-40ab-9927-294d445b5d67/volumes"
Feb 26 16:30:20 crc kubenswrapper[5033]: I0226 16:30:20.188708 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:30:20 crc kubenswrapper[5033]: I0226 16:30:20.189942 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:30:30 crc kubenswrapper[5033]: I0226 16:30:30.556497 5033 scope.go:117] "RemoveContainer" containerID="5685d604f74648b889b7d355da1f8725ab416f0fd58a894bbf7cc4f0fd7bad77"
Feb 26 16:30:30 crc kubenswrapper[5033]: I0226 16:30:30.603195 5033 scope.go:117] "RemoveContainer" containerID="3b44e3bf7dde37f3bbc00e3f35de8e8541de980855186cd19516d2d805a77974"
Feb 26 16:30:50 crc kubenswrapper[5033]: I0226 16:30:50.189376 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:30:50 crc kubenswrapper[5033]: I0226 16:30:50.191207 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:30:50 crc kubenswrapper[5033]: I0226 16:30:50.191378 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5"
Feb 26 16:30:50 crc kubenswrapper[5033]: I0226 16:30:50.192067 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 16:30:50 crc kubenswrapper[5033]: I0226 16:30:50.192203 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" gracePeriod=600
Feb 26 16:30:50 crc kubenswrapper[5033]: E0226 16:30:50.308841 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:30:51 crc kubenswrapper[5033]: I0226 16:30:51.022890 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" exitCode=0
Feb 26 16:30:51 crc kubenswrapper[5033]: I0226 16:30:51.022961 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"}
Feb 26 16:30:51 crc kubenswrapper[5033]: I0226 16:30:51.023482 5033 scope.go:117] "RemoveContainer" containerID="dad6881e86f5b0bfddfa2805c22679ea01f0f421cdafb8e298b9c2c7e12fdd3c"
Feb 26 16:30:51 crc kubenswrapper[5033]: I0226 16:30:51.024316 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:30:51 crc kubenswrapper[5033]: E0226 16:30:51.024776 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:31:04 crc kubenswrapper[5033]: I0226 16:31:04.627892 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:31:04 crc kubenswrapper[5033]: E0226 16:31:04.628774 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:31:15 crc kubenswrapper[5033]: I0226 16:31:15.628645 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:31:15 crc kubenswrapper[5033]: E0226 16:31:15.629650 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:31:26 crc kubenswrapper[5033]: I0226 16:31:26.627770 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:31:26 crc kubenswrapper[5033]: E0226 16:31:26.628662 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:31:39 crc kubenswrapper[5033]: I0226 16:31:39.628777 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:31:39 crc kubenswrapper[5033]: E0226 16:31:39.630643 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:31:52 crc kubenswrapper[5033]: I0226 16:31:52.628064 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:31:52 crc kubenswrapper[5033]: E0226 16:31:52.628954 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.145492 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535392-jrg97"]
Feb 26 16:32:00 crc kubenswrapper[5033]: E0226 16:32:00.146542 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bda0c14-3fc0-4f66-a3b1-f9359e21a587" containerName="collect-profiles"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.146566 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bda0c14-3fc0-4f66-a3b1-f9359e21a587" containerName="collect-profiles"
Feb 26 16:32:00 crc kubenswrapper[5033]: E0226 16:32:00.146600 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14df2479-2a8f-44c8-ab6e-0032e08eb3be" containerName="oc"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.146614 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="14df2479-2a8f-44c8-ab6e-0032e08eb3be" containerName="oc"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.146918 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bda0c14-3fc0-4f66-a3b1-f9359e21a587" containerName="collect-profiles"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.146945 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="14df2479-2a8f-44c8-ab6e-0032e08eb3be" containerName="oc"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.147729 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535392-jrg97"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.151044 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.151151 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.151172 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.155809 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535392-jrg97"]
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.212841 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss5kx\" (UniqueName: \"kubernetes.io/projected/4462c6ff-e148-40af-baa3-7aa2f7950218-kube-api-access-ss5kx\") pod \"auto-csr-approver-29535392-jrg97\" (UID: \"4462c6ff-e148-40af-baa3-7aa2f7950218\") " pod="openshift-infra/auto-csr-approver-29535392-jrg97"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.314123 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss5kx\" (UniqueName: \"kubernetes.io/projected/4462c6ff-e148-40af-baa3-7aa2f7950218-kube-api-access-ss5kx\") pod \"auto-csr-approver-29535392-jrg97\" (UID: \"4462c6ff-e148-40af-baa3-7aa2f7950218\") " pod="openshift-infra/auto-csr-approver-29535392-jrg97"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.337232 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss5kx\" (UniqueName: \"kubernetes.io/projected/4462c6ff-e148-40af-baa3-7aa2f7950218-kube-api-access-ss5kx\") pod \"auto-csr-approver-29535392-jrg97\" (UID: \"4462c6ff-e148-40af-baa3-7aa2f7950218\") " pod="openshift-infra/auto-csr-approver-29535392-jrg97"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.466224 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535392-jrg97"
Feb 26 16:32:00 crc kubenswrapper[5033]: I0226 16:32:00.905841 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535392-jrg97"]
Feb 26 16:32:01 crc kubenswrapper[5033]: I0226 16:32:01.700815 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535392-jrg97" event={"ID":"4462c6ff-e148-40af-baa3-7aa2f7950218","Type":"ContainerStarted","Data":"7112541cd3ded2f840a0056f7136db35ba3c6aa6e3f142d7cfb00eed6fca1074"}
Feb 26 16:32:02 crc kubenswrapper[5033]: I0226 16:32:02.710088 5033 generic.go:334] "Generic (PLEG): container finished" podID="4462c6ff-e148-40af-baa3-7aa2f7950218" containerID="cd4f707e8a8f179cbcdb59053fa4ad8b59fe3317d17d2aa9e6a4c14781e27ef9" exitCode=0
Feb 26 16:32:02 crc kubenswrapper[5033]: I0226 16:32:02.710147 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535392-jrg97" event={"ID":"4462c6ff-e148-40af-baa3-7aa2f7950218","Type":"ContainerDied","Data":"cd4f707e8a8f179cbcdb59053fa4ad8b59fe3317d17d2aa9e6a4c14781e27ef9"}
Feb 26 16:32:04 crc kubenswrapper[5033]: I0226 16:32:04.039548 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535392-jrg97"
Feb 26 16:32:04 crc kubenswrapper[5033]: I0226 16:32:04.116121 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss5kx\" (UniqueName: \"kubernetes.io/projected/4462c6ff-e148-40af-baa3-7aa2f7950218-kube-api-access-ss5kx\") pod \"4462c6ff-e148-40af-baa3-7aa2f7950218\" (UID: \"4462c6ff-e148-40af-baa3-7aa2f7950218\") "
Feb 26 16:32:04 crc kubenswrapper[5033]: I0226 16:32:04.121995 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4462c6ff-e148-40af-baa3-7aa2f7950218-kube-api-access-ss5kx" (OuterVolumeSpecName: "kube-api-access-ss5kx") pod "4462c6ff-e148-40af-baa3-7aa2f7950218" (UID: "4462c6ff-e148-40af-baa3-7aa2f7950218"). InnerVolumeSpecName "kube-api-access-ss5kx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:32:04 crc kubenswrapper[5033]: I0226 16:32:04.218125 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss5kx\" (UniqueName: \"kubernetes.io/projected/4462c6ff-e148-40af-baa3-7aa2f7950218-kube-api-access-ss5kx\") on node \"crc\" DevicePath \"\""
Feb 26 16:32:04 crc kubenswrapper[5033]: I0226 16:32:04.726610 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535392-jrg97" event={"ID":"4462c6ff-e148-40af-baa3-7aa2f7950218","Type":"ContainerDied","Data":"7112541cd3ded2f840a0056f7136db35ba3c6aa6e3f142d7cfb00eed6fca1074"}
Feb 26 16:32:04 crc kubenswrapper[5033]: I0226 16:32:04.726653 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535392-jrg97"
Feb 26 16:32:04 crc kubenswrapper[5033]: I0226 16:32:04.726661 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7112541cd3ded2f840a0056f7136db35ba3c6aa6e3f142d7cfb00eed6fca1074"
Feb 26 16:32:05 crc kubenswrapper[5033]: I0226 16:32:05.108371 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535386-4r2m6"]
Feb 26 16:32:05 crc kubenswrapper[5033]: I0226 16:32:05.117049 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535386-4r2m6"]
Feb 26 16:32:05 crc kubenswrapper[5033]: I0226 16:32:05.636300 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d225af9c-75fd-4686-83e2-91ea2fccc3ff" path="/var/lib/kubelet/pods/d225af9c-75fd-4686-83e2-91ea2fccc3ff/volumes"
Feb 26 16:32:07 crc kubenswrapper[5033]: I0226 16:32:07.631285 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:32:07 crc kubenswrapper[5033]: E0226 16:32:07.631780 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:32:19 crc kubenswrapper[5033]: I0226 16:32:19.628067 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:32:19 crc kubenswrapper[5033]: E0226 16:32:19.628731 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:32:30 crc kubenswrapper[5033]: I0226 16:32:30.660221 5033 scope.go:117] "RemoveContainer" containerID="37b682c7ede28145719bce76fb9de2b1641633647448add9bed57b5f908e5a36"
Feb 26 16:32:34 crc kubenswrapper[5033]: I0226 16:32:34.628375 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:32:34 crc kubenswrapper[5033]: E0226 16:32:34.629203 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:32:45 crc kubenswrapper[5033]: I0226 16:32:45.628231 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:32:45 crc kubenswrapper[5033]: E0226 16:32:45.629342 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:32:58 crc kubenswrapper[5033]: I0226 16:32:58.628784 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:32:58 crc kubenswrapper[5033]: E0226 16:32:58.629658 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:33:11 crc kubenswrapper[5033]: I0226 16:33:11.628528 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:33:11 crc kubenswrapper[5033]: E0226 16:33:11.629842 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:33:22 crc kubenswrapper[5033]: I0226 16:33:22.628618 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:33:22 crc kubenswrapper[5033]: E0226 16:33:22.629546 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:33:33 crc kubenswrapper[5033]: I0226 16:33:33.632972 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:33:33 crc kubenswrapper[5033]: E0226 16:33:33.633823 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:33:48 crc kubenswrapper[5033]: I0226 16:33:48.628451 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129"
Feb 26 16:33:48 crc kubenswrapper[5033]: E0226 16:33:48.629714 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.153010 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535394-dmwmj"]
Feb 26 16:34:00 crc kubenswrapper[5033]: E0226 16:34:00.153703 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4462c6ff-e148-40af-baa3-7aa2f7950218" containerName="oc"
Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.153719 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4462c6ff-e148-40af-baa3-7aa2f7950218" containerName="oc"
Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.153888 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4462c6ff-e148-40af-baa3-7aa2f7950218" containerName="oc"
Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.154461 5033 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535394-dmwmj" Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.157709 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.157775 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.158005 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.166976 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535394-dmwmj"] Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.176174 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsd8f\" (UniqueName: \"kubernetes.io/projected/0d545ecd-2ba8-49aa-bbdd-84590b9c211a-kube-api-access-qsd8f\") pod \"auto-csr-approver-29535394-dmwmj\" (UID: \"0d545ecd-2ba8-49aa-bbdd-84590b9c211a\") " pod="openshift-infra/auto-csr-approver-29535394-dmwmj" Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.277439 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsd8f\" (UniqueName: \"kubernetes.io/projected/0d545ecd-2ba8-49aa-bbdd-84590b9c211a-kube-api-access-qsd8f\") pod \"auto-csr-approver-29535394-dmwmj\" (UID: \"0d545ecd-2ba8-49aa-bbdd-84590b9c211a\") " pod="openshift-infra/auto-csr-approver-29535394-dmwmj" Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.295984 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsd8f\" (UniqueName: \"kubernetes.io/projected/0d545ecd-2ba8-49aa-bbdd-84590b9c211a-kube-api-access-qsd8f\") pod \"auto-csr-approver-29535394-dmwmj\" (UID: \"0d545ecd-2ba8-49aa-bbdd-84590b9c211a\") " 
pod="openshift-infra/auto-csr-approver-29535394-dmwmj" Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.476806 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535394-dmwmj" Feb 26 16:34:00 crc kubenswrapper[5033]: I0226 16:34:00.727315 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535394-dmwmj"] Feb 26 16:34:01 crc kubenswrapper[5033]: I0226 16:34:01.645853 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535394-dmwmj" event={"ID":"0d545ecd-2ba8-49aa-bbdd-84590b9c211a","Type":"ContainerStarted","Data":"b48f51ba7f516fd607c925dbbc04e38663eec8e1ba419d9329078caeb21144cc"} Feb 26 16:34:02 crc kubenswrapper[5033]: I0226 16:34:02.640109 5033 generic.go:334] "Generic (PLEG): container finished" podID="0d545ecd-2ba8-49aa-bbdd-84590b9c211a" containerID="4da70d8fd4ff1db94edc3ed5d1673a81d0af04bf2995d1c88331e5048fcdfe1d" exitCode=0 Feb 26 16:34:02 crc kubenswrapper[5033]: I0226 16:34:02.640204 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535394-dmwmj" event={"ID":"0d545ecd-2ba8-49aa-bbdd-84590b9c211a","Type":"ContainerDied","Data":"4da70d8fd4ff1db94edc3ed5d1673a81d0af04bf2995d1c88331e5048fcdfe1d"} Feb 26 16:34:03 crc kubenswrapper[5033]: I0226 16:34:03.638013 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" Feb 26 16:34:03 crc kubenswrapper[5033]: E0226 16:34:03.638324 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" 
Feb 26 16:34:03 crc kubenswrapper[5033]: I0226 16:34:03.881616 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535394-dmwmj" Feb 26 16:34:03 crc kubenswrapper[5033]: I0226 16:34:03.926660 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsd8f\" (UniqueName: \"kubernetes.io/projected/0d545ecd-2ba8-49aa-bbdd-84590b9c211a-kube-api-access-qsd8f\") pod \"0d545ecd-2ba8-49aa-bbdd-84590b9c211a\" (UID: \"0d545ecd-2ba8-49aa-bbdd-84590b9c211a\") " Feb 26 16:34:03 crc kubenswrapper[5033]: I0226 16:34:03.932480 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d545ecd-2ba8-49aa-bbdd-84590b9c211a-kube-api-access-qsd8f" (OuterVolumeSpecName: "kube-api-access-qsd8f") pod "0d545ecd-2ba8-49aa-bbdd-84590b9c211a" (UID: "0d545ecd-2ba8-49aa-bbdd-84590b9c211a"). InnerVolumeSpecName "kube-api-access-qsd8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:34:04 crc kubenswrapper[5033]: I0226 16:34:04.028274 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsd8f\" (UniqueName: \"kubernetes.io/projected/0d545ecd-2ba8-49aa-bbdd-84590b9c211a-kube-api-access-qsd8f\") on node \"crc\" DevicePath \"\"" Feb 26 16:34:04 crc kubenswrapper[5033]: I0226 16:34:04.657358 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535394-dmwmj" event={"ID":"0d545ecd-2ba8-49aa-bbdd-84590b9c211a","Type":"ContainerDied","Data":"b48f51ba7f516fd607c925dbbc04e38663eec8e1ba419d9329078caeb21144cc"} Feb 26 16:34:04 crc kubenswrapper[5033]: I0226 16:34:04.657428 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b48f51ba7f516fd607c925dbbc04e38663eec8e1ba419d9329078caeb21144cc" Feb 26 16:34:04 crc kubenswrapper[5033]: I0226 16:34:04.657377 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535394-dmwmj" Feb 26 16:34:04 crc kubenswrapper[5033]: I0226 16:34:04.953053 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535388-c8c5f"] Feb 26 16:34:04 crc kubenswrapper[5033]: I0226 16:34:04.958386 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535388-c8c5f"] Feb 26 16:34:05 crc kubenswrapper[5033]: I0226 16:34:05.643326 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7343fddd-fec4-4666-aceb-c14252365de2" path="/var/lib/kubelet/pods/7343fddd-fec4-4666-aceb-c14252365de2/volumes" Feb 26 16:34:16 crc kubenswrapper[5033]: I0226 16:34:16.628531 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" Feb 26 16:34:16 crc kubenswrapper[5033]: E0226 16:34:16.631164 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:34:28 crc kubenswrapper[5033]: I0226 16:34:28.628717 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" Feb 26 16:34:28 crc kubenswrapper[5033]: E0226 16:34:28.629628 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:34:30 crc kubenswrapper[5033]: I0226 16:34:30.748799 5033 scope.go:117] "RemoveContainer" containerID="bbf4bf36280775769cacd8867246f4f16ff71ed6611724ea29d34d12c30bd81a" Feb 26 16:34:42 crc kubenswrapper[5033]: I0226 16:34:42.627591 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" Feb 26 16:34:42 crc kubenswrapper[5033]: E0226 16:34:42.628204 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:34:53 crc kubenswrapper[5033]: I0226 16:34:53.632974 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" Feb 26 16:34:53 crc kubenswrapper[5033]: E0226 16:34:53.633829 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:35:07 crc kubenswrapper[5033]: I0226 16:35:07.628208 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" Feb 26 16:35:07 crc kubenswrapper[5033]: E0226 16:35:07.629503 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:35:22 crc kubenswrapper[5033]: I0226 16:35:22.628788 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" Feb 26 16:35:22 crc kubenswrapper[5033]: E0226 16:35:22.629867 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:35:36 crc kubenswrapper[5033]: I0226 16:35:36.627789 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" Feb 26 16:35:36 crc kubenswrapper[5033]: E0226 16:35:36.628508 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:35:48 crc kubenswrapper[5033]: I0226 16:35:48.628515 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" Feb 26 16:35:48 crc kubenswrapper[5033]: E0226 16:35:48.629483 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:35:59 crc kubenswrapper[5033]: I0226 16:35:59.628444 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.175139 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535396-sj7rb"] Feb 26 16:36:00 crc kubenswrapper[5033]: E0226 16:36:00.175839 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d545ecd-2ba8-49aa-bbdd-84590b9c211a" containerName="oc" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.175855 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d545ecd-2ba8-49aa-bbdd-84590b9c211a" containerName="oc" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.176053 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d545ecd-2ba8-49aa-bbdd-84590b9c211a" containerName="oc" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.176590 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535396-sj7rb" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.179521 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.179701 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.179781 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.193309 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535396-sj7rb"] Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.230695 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mwrx\" (UniqueName: \"kubernetes.io/projected/bf37743b-0d6c-4977-b6c7-23000743139a-kube-api-access-2mwrx\") pod \"auto-csr-approver-29535396-sj7rb\" (UID: \"bf37743b-0d6c-4977-b6c7-23000743139a\") " pod="openshift-infra/auto-csr-approver-29535396-sj7rb" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.331985 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mwrx\" (UniqueName: \"kubernetes.io/projected/bf37743b-0d6c-4977-b6c7-23000743139a-kube-api-access-2mwrx\") pod \"auto-csr-approver-29535396-sj7rb\" (UID: \"bf37743b-0d6c-4977-b6c7-23000743139a\") " pod="openshift-infra/auto-csr-approver-29535396-sj7rb" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.351328 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mwrx\" (UniqueName: \"kubernetes.io/projected/bf37743b-0d6c-4977-b6c7-23000743139a-kube-api-access-2mwrx\") pod \"auto-csr-approver-29535396-sj7rb\" (UID: \"bf37743b-0d6c-4977-b6c7-23000743139a\") " 
pod="openshift-infra/auto-csr-approver-29535396-sj7rb" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.503815 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535396-sj7rb" Feb 26 16:36:00 crc kubenswrapper[5033]: I0226 16:36:00.590368 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"3d4991683170439fd91b891f66e6b5d338934da27938fbdc44f3499149cd3e54"} Feb 26 16:36:01 crc kubenswrapper[5033]: I0226 16:36:01.010838 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535396-sj7rb"] Feb 26 16:36:01 crc kubenswrapper[5033]: I0226 16:36:01.025011 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:36:01 crc kubenswrapper[5033]: I0226 16:36:01.599835 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535396-sj7rb" event={"ID":"bf37743b-0d6c-4977-b6c7-23000743139a","Type":"ContainerStarted","Data":"7aa87e969219ee073307e9476e420a336dc28de2f6631a186e729b5733b16e9c"} Feb 26 16:36:02 crc kubenswrapper[5033]: I0226 16:36:02.610023 5033 generic.go:334] "Generic (PLEG): container finished" podID="bf37743b-0d6c-4977-b6c7-23000743139a" containerID="c8a4e098ccef81f5c108cab5f42211f88303c9bf3e819cac8158e97e6422740a" exitCode=0 Feb 26 16:36:02 crc kubenswrapper[5033]: I0226 16:36:02.610191 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535396-sj7rb" event={"ID":"bf37743b-0d6c-4977-b6c7-23000743139a","Type":"ContainerDied","Data":"c8a4e098ccef81f5c108cab5f42211f88303c9bf3e819cac8158e97e6422740a"} Feb 26 16:36:03 crc kubenswrapper[5033]: I0226 16:36:03.886495 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535396-sj7rb" Feb 26 16:36:03 crc kubenswrapper[5033]: I0226 16:36:03.926033 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mwrx\" (UniqueName: \"kubernetes.io/projected/bf37743b-0d6c-4977-b6c7-23000743139a-kube-api-access-2mwrx\") pod \"bf37743b-0d6c-4977-b6c7-23000743139a\" (UID: \"bf37743b-0d6c-4977-b6c7-23000743139a\") " Feb 26 16:36:03 crc kubenswrapper[5033]: I0226 16:36:03.939555 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf37743b-0d6c-4977-b6c7-23000743139a-kube-api-access-2mwrx" (OuterVolumeSpecName: "kube-api-access-2mwrx") pod "bf37743b-0d6c-4977-b6c7-23000743139a" (UID: "bf37743b-0d6c-4977-b6c7-23000743139a"). InnerVolumeSpecName "kube-api-access-2mwrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:36:04 crc kubenswrapper[5033]: I0226 16:36:04.028137 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mwrx\" (UniqueName: \"kubernetes.io/projected/bf37743b-0d6c-4977-b6c7-23000743139a-kube-api-access-2mwrx\") on node \"crc\" DevicePath \"\"" Feb 26 16:36:04 crc kubenswrapper[5033]: I0226 16:36:04.627415 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535396-sj7rb" Feb 26 16:36:04 crc kubenswrapper[5033]: I0226 16:36:04.627472 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535396-sj7rb" event={"ID":"bf37743b-0d6c-4977-b6c7-23000743139a","Type":"ContainerDied","Data":"7aa87e969219ee073307e9476e420a336dc28de2f6631a186e729b5733b16e9c"} Feb 26 16:36:04 crc kubenswrapper[5033]: I0226 16:36:04.627496 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aa87e969219ee073307e9476e420a336dc28de2f6631a186e729b5733b16e9c" Feb 26 16:36:04 crc kubenswrapper[5033]: I0226 16:36:04.981639 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535390-8pbnt"] Feb 26 16:36:04 crc kubenswrapper[5033]: I0226 16:36:04.988575 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535390-8pbnt"] Feb 26 16:36:05 crc kubenswrapper[5033]: I0226 16:36:05.635932 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14df2479-2a8f-44c8-ab6e-0032e08eb3be" path="/var/lib/kubelet/pods/14df2479-2a8f-44c8-ab6e-0032e08eb3be/volumes" Feb 26 16:36:30 crc kubenswrapper[5033]: I0226 16:36:30.836578 5033 scope.go:117] "RemoveContainer" containerID="daf450f440423ace4bf71a56d9b82a792fe08e3d87c1b2bedc498c1949145141" Feb 26 16:36:34 crc kubenswrapper[5033]: I0226 16:36:34.856286 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t8t4k"] Feb 26 16:36:34 crc kubenswrapper[5033]: E0226 16:36:34.857317 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf37743b-0d6c-4977-b6c7-23000743139a" containerName="oc" Feb 26 16:36:34 crc kubenswrapper[5033]: I0226 16:36:34.857336 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf37743b-0d6c-4977-b6c7-23000743139a" containerName="oc" Feb 26 16:36:34 crc kubenswrapper[5033]: I0226 
16:36:34.857549 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf37743b-0d6c-4977-b6c7-23000743139a" containerName="oc" Feb 26 16:36:34 crc kubenswrapper[5033]: I0226 16:36:34.858809 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8t4k" Feb 26 16:36:34 crc kubenswrapper[5033]: I0226 16:36:34.890171 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8t4k"] Feb 26 16:36:34 crc kubenswrapper[5033]: I0226 16:36:34.935228 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-catalog-content\") pod \"certified-operators-t8t4k\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") " pod="openshift-marketplace/certified-operators-t8t4k" Feb 26 16:36:34 crc kubenswrapper[5033]: I0226 16:36:34.935305 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-utilities\") pod \"certified-operators-t8t4k\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") " pod="openshift-marketplace/certified-operators-t8t4k" Feb 26 16:36:34 crc kubenswrapper[5033]: I0226 16:36:34.935378 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vc98\" (UniqueName: \"kubernetes.io/projected/919632ad-6da1-493c-8203-628578e50b78-kube-api-access-6vc98\") pod \"certified-operators-t8t4k\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") " pod="openshift-marketplace/certified-operators-t8t4k" Feb 26 16:36:35 crc kubenswrapper[5033]: I0226 16:36:35.036446 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vc98\" (UniqueName: 
\"kubernetes.io/projected/919632ad-6da1-493c-8203-628578e50b78-kube-api-access-6vc98\") pod \"certified-operators-t8t4k\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") " pod="openshift-marketplace/certified-operators-t8t4k" Feb 26 16:36:35 crc kubenswrapper[5033]: I0226 16:36:35.036602 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-catalog-content\") pod \"certified-operators-t8t4k\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") " pod="openshift-marketplace/certified-operators-t8t4k" Feb 26 16:36:35 crc kubenswrapper[5033]: I0226 16:36:35.036655 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-utilities\") pod \"certified-operators-t8t4k\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") " pod="openshift-marketplace/certified-operators-t8t4k" Feb 26 16:36:35 crc kubenswrapper[5033]: I0226 16:36:35.037326 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-utilities\") pod \"certified-operators-t8t4k\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") " pod="openshift-marketplace/certified-operators-t8t4k" Feb 26 16:36:35 crc kubenswrapper[5033]: I0226 16:36:35.037362 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-catalog-content\") pod \"certified-operators-t8t4k\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") " pod="openshift-marketplace/certified-operators-t8t4k" Feb 26 16:36:35 crc kubenswrapper[5033]: I0226 16:36:35.056644 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vc98\" (UniqueName: 
\"kubernetes.io/projected/919632ad-6da1-493c-8203-628578e50b78-kube-api-access-6vc98\") pod \"certified-operators-t8t4k\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") " pod="openshift-marketplace/certified-operators-t8t4k"
Feb 26 16:36:35 crc kubenswrapper[5033]: I0226 16:36:35.185641 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8t4k"
Feb 26 16:36:35 crc kubenswrapper[5033]: I0226 16:36:35.637152 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8t4k"]
Feb 26 16:36:35 crc kubenswrapper[5033]: I0226 16:36:35.901713 5033 generic.go:334] "Generic (PLEG): container finished" podID="919632ad-6da1-493c-8203-628578e50b78" containerID="4cfedeed38b2cdc1a0dfe97eadba4856d1508fe0cd4d44279fc949532b717f3f" exitCode=0
Feb 26 16:36:35 crc kubenswrapper[5033]: I0226 16:36:35.901755 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8t4k" event={"ID":"919632ad-6da1-493c-8203-628578e50b78","Type":"ContainerDied","Data":"4cfedeed38b2cdc1a0dfe97eadba4856d1508fe0cd4d44279fc949532b717f3f"}
Feb 26 16:36:35 crc kubenswrapper[5033]: I0226 16:36:35.901832 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8t4k" event={"ID":"919632ad-6da1-493c-8203-628578e50b78","Type":"ContainerStarted","Data":"4d1943c5821b515295cf7f92e175fd33f982934dead22f4ab96aa5a9fec755b7"}
Feb 26 16:36:37 crc kubenswrapper[5033]: I0226 16:36:37.919272 5033 generic.go:334] "Generic (PLEG): container finished" podID="919632ad-6da1-493c-8203-628578e50b78" containerID="73a2b43d346ef902aa521387094ce07560c9d1212b0422a14d3a13ffc748e0ea" exitCode=0
Feb 26 16:36:37 crc kubenswrapper[5033]: I0226 16:36:37.919330 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8t4k" event={"ID":"919632ad-6da1-493c-8203-628578e50b78","Type":"ContainerDied","Data":"73a2b43d346ef902aa521387094ce07560c9d1212b0422a14d3a13ffc748e0ea"}
Feb 26 16:36:38 crc kubenswrapper[5033]: I0226 16:36:38.932818 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8t4k" event={"ID":"919632ad-6da1-493c-8203-628578e50b78","Type":"ContainerStarted","Data":"a3a9ff35f2ccd99dfd6fdea414d8ed50a3c60d4ad05a4049fe830d53532bc7af"}
Feb 26 16:36:38 crc kubenswrapper[5033]: I0226 16:36:38.957944 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t8t4k" podStartSLOduration=2.480787489 podStartE2EDuration="4.957924729s" podCreationTimestamp="2026-02-26 16:36:34 +0000 UTC" firstStartedPulling="2026-02-26 16:36:35.905895246 +0000 UTC m=+3323.118493906" lastFinishedPulling="2026-02-26 16:36:38.383032506 +0000 UTC m=+3325.595631146" observedRunningTime="2026-02-26 16:36:38.955122122 +0000 UTC m=+3326.167720782" watchObservedRunningTime="2026-02-26 16:36:38.957924729 +0000 UTC m=+3326.170523399"
Feb 26 16:36:45 crc kubenswrapper[5033]: I0226 16:36:45.186222 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t8t4k"
Feb 26 16:36:45 crc kubenswrapper[5033]: I0226 16:36:45.186875 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t8t4k"
Feb 26 16:36:45 crc kubenswrapper[5033]: I0226 16:36:45.239192 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t8t4k"
Feb 26 16:36:46 crc kubenswrapper[5033]: I0226 16:36:46.076075 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t8t4k"
Feb 26 16:36:46 crc kubenswrapper[5033]: I0226 16:36:46.142673 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8t4k"]
Feb 26 16:36:48 crc kubenswrapper[5033]: I0226 16:36:48.017439 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t8t4k" podUID="919632ad-6da1-493c-8203-628578e50b78" containerName="registry-server" containerID="cri-o://a3a9ff35f2ccd99dfd6fdea414d8ed50a3c60d4ad05a4049fe830d53532bc7af" gracePeriod=2
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.028070 5033 generic.go:334] "Generic (PLEG): container finished" podID="919632ad-6da1-493c-8203-628578e50b78" containerID="a3a9ff35f2ccd99dfd6fdea414d8ed50a3c60d4ad05a4049fe830d53532bc7af" exitCode=0
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.028129 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8t4k" event={"ID":"919632ad-6da1-493c-8203-628578e50b78","Type":"ContainerDied","Data":"a3a9ff35f2ccd99dfd6fdea414d8ed50a3c60d4ad05a4049fe830d53532bc7af"}
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.028523 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8t4k" event={"ID":"919632ad-6da1-493c-8203-628578e50b78","Type":"ContainerDied","Data":"4d1943c5821b515295cf7f92e175fd33f982934dead22f4ab96aa5a9fec755b7"}
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.028550 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d1943c5821b515295cf7f92e175fd33f982934dead22f4ab96aa5a9fec755b7"
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.031374 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8t4k"
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.059784 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vc98\" (UniqueName: \"kubernetes.io/projected/919632ad-6da1-493c-8203-628578e50b78-kube-api-access-6vc98\") pod \"919632ad-6da1-493c-8203-628578e50b78\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") "
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.060450 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-utilities\") pod \"919632ad-6da1-493c-8203-628578e50b78\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") "
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.060527 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-catalog-content\") pod \"919632ad-6da1-493c-8203-628578e50b78\" (UID: \"919632ad-6da1-493c-8203-628578e50b78\") "
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.061355 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-utilities" (OuterVolumeSpecName: "utilities") pod "919632ad-6da1-493c-8203-628578e50b78" (UID: "919632ad-6da1-493c-8203-628578e50b78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.066714 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919632ad-6da1-493c-8203-628578e50b78-kube-api-access-6vc98" (OuterVolumeSpecName: "kube-api-access-6vc98") pod "919632ad-6da1-493c-8203-628578e50b78" (UID: "919632ad-6da1-493c-8203-628578e50b78"). InnerVolumeSpecName "kube-api-access-6vc98". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.118677 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "919632ad-6da1-493c-8203-628578e50b78" (UID: "919632ad-6da1-493c-8203-628578e50b78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.162247 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vc98\" (UniqueName: \"kubernetes.io/projected/919632ad-6da1-493c-8203-628578e50b78-kube-api-access-6vc98\") on node \"crc\" DevicePath \"\""
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.162297 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 16:36:49 crc kubenswrapper[5033]: I0226 16:36:49.162308 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919632ad-6da1-493c-8203-628578e50b78-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 16:36:50 crc kubenswrapper[5033]: I0226 16:36:50.035920 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8t4k"
Feb 26 16:36:50 crc kubenswrapper[5033]: I0226 16:36:50.073899 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8t4k"]
Feb 26 16:36:50 crc kubenswrapper[5033]: I0226 16:36:50.082147 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t8t4k"]
Feb 26 16:36:51 crc kubenswrapper[5033]: I0226 16:36:51.638146 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919632ad-6da1-493c-8203-628578e50b78" path="/var/lib/kubelet/pods/919632ad-6da1-493c-8203-628578e50b78/volumes"
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.776499 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cb7hx"]
Feb 26 16:37:57 crc kubenswrapper[5033]: E0226 16:37:57.777540 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919632ad-6da1-493c-8203-628578e50b78" containerName="registry-server"
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.777557 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="919632ad-6da1-493c-8203-628578e50b78" containerName="registry-server"
Feb 26 16:37:57 crc kubenswrapper[5033]: E0226 16:37:57.777574 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919632ad-6da1-493c-8203-628578e50b78" containerName="extract-utilities"
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.777581 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="919632ad-6da1-493c-8203-628578e50b78" containerName="extract-utilities"
Feb 26 16:37:57 crc kubenswrapper[5033]: E0226 16:37:57.777608 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919632ad-6da1-493c-8203-628578e50b78" containerName="extract-content"
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.777617 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="919632ad-6da1-493c-8203-628578e50b78" containerName="extract-content"
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.777776 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="919632ad-6da1-493c-8203-628578e50b78" containerName="registry-server"
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.779939 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.809932 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cb7hx"]
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.917538 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-catalog-content\") pod \"redhat-marketplace-cb7hx\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.917609 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-utilities\") pod \"redhat-marketplace-cb7hx\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.917752 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzp7v\" (UniqueName: \"kubernetes.io/projected/113dcf2e-c0f7-40d9-8bcf-46469f586f39-kube-api-access-rzp7v\") pod \"redhat-marketplace-cb7hx\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.968766 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kgmm6"]
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.970351 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgmm6"
Feb 26 16:37:57 crc kubenswrapper[5033]: I0226 16:37:57.993390 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgmm6"]
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.019651 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzp7v\" (UniqueName: \"kubernetes.io/projected/113dcf2e-c0f7-40d9-8bcf-46469f586f39-kube-api-access-rzp7v\") pod \"redhat-marketplace-cb7hx\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.019751 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rv5r\" (UniqueName: \"kubernetes.io/projected/c0b242dd-f37c-401c-8805-f0ba2b93d160-kube-api-access-9rv5r\") pod \"community-operators-kgmm6\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " pod="openshift-marketplace/community-operators-kgmm6"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.019791 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-catalog-content\") pod \"community-operators-kgmm6\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " pod="openshift-marketplace/community-operators-kgmm6"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.019820 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-catalog-content\") pod \"redhat-marketplace-cb7hx\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.019851 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-utilities\") pod \"redhat-marketplace-cb7hx\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.019944 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-utilities\") pod \"community-operators-kgmm6\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " pod="openshift-marketplace/community-operators-kgmm6"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.020949 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-catalog-content\") pod \"redhat-marketplace-cb7hx\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.021259 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-utilities\") pod \"redhat-marketplace-cb7hx\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.040883 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzp7v\" (UniqueName: \"kubernetes.io/projected/113dcf2e-c0f7-40d9-8bcf-46469f586f39-kube-api-access-rzp7v\") pod \"redhat-marketplace-cb7hx\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.112576 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.120574 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rv5r\" (UniqueName: \"kubernetes.io/projected/c0b242dd-f37c-401c-8805-f0ba2b93d160-kube-api-access-9rv5r\") pod \"community-operators-kgmm6\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " pod="openshift-marketplace/community-operators-kgmm6"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.120625 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-catalog-content\") pod \"community-operators-kgmm6\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " pod="openshift-marketplace/community-operators-kgmm6"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.120713 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-utilities\") pod \"community-operators-kgmm6\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " pod="openshift-marketplace/community-operators-kgmm6"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.121129 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-catalog-content\") pod \"community-operators-kgmm6\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " pod="openshift-marketplace/community-operators-kgmm6"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.121247 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-utilities\") pod \"community-operators-kgmm6\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " pod="openshift-marketplace/community-operators-kgmm6"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.140369 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rv5r\" (UniqueName: \"kubernetes.io/projected/c0b242dd-f37c-401c-8805-f0ba2b93d160-kube-api-access-9rv5r\") pod \"community-operators-kgmm6\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " pod="openshift-marketplace/community-operators-kgmm6"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.289429 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgmm6"
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.602011 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cb7hx"]
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.614221 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cb7hx" event={"ID":"113dcf2e-c0f7-40d9-8bcf-46469f586f39","Type":"ContainerStarted","Data":"efb09fc7d0352550efca49fa27b5d51c2122e1a2d8a2c1f4942c87380e5dc1c6"}
Feb 26 16:37:58 crc kubenswrapper[5033]: I0226 16:37:58.702621 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgmm6"]
Feb 26 16:37:58 crc kubenswrapper[5033]: W0226 16:37:58.714407 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0b242dd_f37c_401c_8805_f0ba2b93d160.slice/crio-8a496e0b315a6b81417497da5fabafea27c882d52bcbdd6a0d7989225fdd763a WatchSource:0}: Error finding container 8a496e0b315a6b81417497da5fabafea27c882d52bcbdd6a0d7989225fdd763a: Status 404 returned error can't find the container with id 8a496e0b315a6b81417497da5fabafea27c882d52bcbdd6a0d7989225fdd763a
Feb 26 16:37:59 crc kubenswrapper[5033]: I0226 16:37:59.624528 5033 generic.go:334] "Generic (PLEG): container finished" podID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" containerID="8864e3aedbb3fd3ed86bcac7a73068e56fb9a74fbb886411c31ab15af0a5df1a" exitCode=0
Feb 26 16:37:59 crc kubenswrapper[5033]: I0226 16:37:59.624600 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cb7hx" event={"ID":"113dcf2e-c0f7-40d9-8bcf-46469f586f39","Type":"ContainerDied","Data":"8864e3aedbb3fd3ed86bcac7a73068e56fb9a74fbb886411c31ab15af0a5df1a"}
Feb 26 16:37:59 crc kubenswrapper[5033]: I0226 16:37:59.628533 5033 generic.go:334] "Generic (PLEG): container finished" podID="c0b242dd-f37c-401c-8805-f0ba2b93d160" containerID="b2bc9781fcfd33c4028a87ede2fe81db63d323c9fb5372cae3ebd18196e67585" exitCode=0
Feb 26 16:37:59 crc kubenswrapper[5033]: I0226 16:37:59.656582 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmm6" event={"ID":"c0b242dd-f37c-401c-8805-f0ba2b93d160","Type":"ContainerDied","Data":"b2bc9781fcfd33c4028a87ede2fe81db63d323c9fb5372cae3ebd18196e67585"}
Feb 26 16:37:59 crc kubenswrapper[5033]: I0226 16:37:59.656680 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmm6" event={"ID":"c0b242dd-f37c-401c-8805-f0ba2b93d160","Type":"ContainerStarted","Data":"8a496e0b315a6b81417497da5fabafea27c882d52bcbdd6a0d7989225fdd763a"}
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.178722 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535398-dzgcx"]
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.179930 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535398-dzgcx"]
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.180014 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535398-dzgcx"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.182648 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.182708 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.183111 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.359128 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96ctg\" (UniqueName: \"kubernetes.io/projected/6a26ae6f-ff53-4605-817f-54e3ef4d0e43-kube-api-access-96ctg\") pod \"auto-csr-approver-29535398-dzgcx\" (UID: \"6a26ae6f-ff53-4605-817f-54e3ef4d0e43\") " pod="openshift-infra/auto-csr-approver-29535398-dzgcx"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.379032 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6sgwf"]
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.381485 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6sgwf"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.397917 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6sgwf"]
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.460859 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96ctg\" (UniqueName: \"kubernetes.io/projected/6a26ae6f-ff53-4605-817f-54e3ef4d0e43-kube-api-access-96ctg\") pod \"auto-csr-approver-29535398-dzgcx\" (UID: \"6a26ae6f-ff53-4605-817f-54e3ef4d0e43\") " pod="openshift-infra/auto-csr-approver-29535398-dzgcx"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.482441 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96ctg\" (UniqueName: \"kubernetes.io/projected/6a26ae6f-ff53-4605-817f-54e3ef4d0e43-kube-api-access-96ctg\") pod \"auto-csr-approver-29535398-dzgcx\" (UID: \"6a26ae6f-ff53-4605-817f-54e3ef4d0e43\") " pod="openshift-infra/auto-csr-approver-29535398-dzgcx"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.508047 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535398-dzgcx"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.562648 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kcvf\" (UniqueName: \"kubernetes.io/projected/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-kube-api-access-8kcvf\") pod \"redhat-operators-6sgwf\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " pod="openshift-marketplace/redhat-operators-6sgwf"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.562709 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-catalog-content\") pod \"redhat-operators-6sgwf\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " pod="openshift-marketplace/redhat-operators-6sgwf"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.562735 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-utilities\") pod \"redhat-operators-6sgwf\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " pod="openshift-marketplace/redhat-operators-6sgwf"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.663898 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kcvf\" (UniqueName: \"kubernetes.io/projected/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-kube-api-access-8kcvf\") pod \"redhat-operators-6sgwf\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " pod="openshift-marketplace/redhat-operators-6sgwf"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.664159 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-catalog-content\") pod \"redhat-operators-6sgwf\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " pod="openshift-marketplace/redhat-operators-6sgwf"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.664186 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-utilities\") pod \"redhat-operators-6sgwf\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " pod="openshift-marketplace/redhat-operators-6sgwf"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.664884 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-catalog-content\") pod \"redhat-operators-6sgwf\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " pod="openshift-marketplace/redhat-operators-6sgwf"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.664965 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-utilities\") pod \"redhat-operators-6sgwf\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " pod="openshift-marketplace/redhat-operators-6sgwf"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.687144 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kcvf\" (UniqueName: \"kubernetes.io/projected/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-kube-api-access-8kcvf\") pod \"redhat-operators-6sgwf\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " pod="openshift-marketplace/redhat-operators-6sgwf"
Feb 26 16:38:00 crc kubenswrapper[5033]: I0226 16:38:00.722937 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6sgwf"
Feb 26 16:38:01 crc kubenswrapper[5033]: I0226 16:38:01.012023 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535398-dzgcx"]
Feb 26 16:38:01 crc kubenswrapper[5033]: W0226 16:38:01.061846 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a26ae6f_ff53_4605_817f_54e3ef4d0e43.slice/crio-7db171036f74f95e6b0d95d688947a5a4425fdf8d5148bacda28fe63de9b19f9 WatchSource:0}: Error finding container 7db171036f74f95e6b0d95d688947a5a4425fdf8d5148bacda28fe63de9b19f9: Status 404 returned error can't find the container with id 7db171036f74f95e6b0d95d688947a5a4425fdf8d5148bacda28fe63de9b19f9
Feb 26 16:38:01 crc kubenswrapper[5033]: I0226 16:38:01.166339 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6sgwf"]
Feb 26 16:38:01 crc kubenswrapper[5033]: W0226 16:38:01.166712 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8aa85b5_1e7e_471a_9690_3e74c80dc3b8.slice/crio-e9aba6dc7b00feaddf9a63a8bc5c6d26d3401e152696e2601617ced15a6696b4 WatchSource:0}: Error finding container e9aba6dc7b00feaddf9a63a8bc5c6d26d3401e152696e2601617ced15a6696b4: Status 404 returned error can't find the container with id e9aba6dc7b00feaddf9a63a8bc5c6d26d3401e152696e2601617ced15a6696b4
Feb 26 16:38:01 crc kubenswrapper[5033]: I0226 16:38:01.650274 5033 generic.go:334] "Generic (PLEG): container finished" podID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerID="e44be8f989813962970d5ddcbe5b8f5a942aac4a0787514bfe278c665ffe62d6" exitCode=0
Feb 26 16:38:01 crc kubenswrapper[5033]: I0226 16:38:01.650330 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sgwf" event={"ID":"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8","Type":"ContainerDied","Data":"e44be8f989813962970d5ddcbe5b8f5a942aac4a0787514bfe278c665ffe62d6"}
Feb 26 16:38:01 crc kubenswrapper[5033]: I0226 16:38:01.650381 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sgwf" event={"ID":"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8","Type":"ContainerStarted","Data":"e9aba6dc7b00feaddf9a63a8bc5c6d26d3401e152696e2601617ced15a6696b4"}
Feb 26 16:38:01 crc kubenswrapper[5033]: I0226 16:38:01.653537 5033 generic.go:334] "Generic (PLEG): container finished" podID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" containerID="c979da9f0ffc7b002192541d45e3ccacae5454f1ef2c4b5d7e4071fb4c1de2be" exitCode=0
Feb 26 16:38:01 crc kubenswrapper[5033]: I0226 16:38:01.653600 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cb7hx" event={"ID":"113dcf2e-c0f7-40d9-8bcf-46469f586f39","Type":"ContainerDied","Data":"c979da9f0ffc7b002192541d45e3ccacae5454f1ef2c4b5d7e4071fb4c1de2be"}
Feb 26 16:38:01 crc kubenswrapper[5033]: I0226 16:38:01.655918 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535398-dzgcx" event={"ID":"6a26ae6f-ff53-4605-817f-54e3ef4d0e43","Type":"ContainerStarted","Data":"7db171036f74f95e6b0d95d688947a5a4425fdf8d5148bacda28fe63de9b19f9"}
Feb 26 16:38:01 crc kubenswrapper[5033]: I0226 16:38:01.659353 5033 generic.go:334] "Generic (PLEG): container finished" podID="c0b242dd-f37c-401c-8805-f0ba2b93d160" containerID="8319d82efbcfd033ace88aa5f07fdeab672354f0fa7acbe3cce73d8172baa0e6" exitCode=0
Feb 26 16:38:01 crc kubenswrapper[5033]: I0226 16:38:01.659419 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmm6" event={"ID":"c0b242dd-f37c-401c-8805-f0ba2b93d160","Type":"ContainerDied","Data":"8319d82efbcfd033ace88aa5f07fdeab672354f0fa7acbe3cce73d8172baa0e6"}
Feb 26 16:38:02 crc kubenswrapper[5033]: I0226 16:38:02.673479 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmm6" event={"ID":"c0b242dd-f37c-401c-8805-f0ba2b93d160","Type":"ContainerStarted","Data":"e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074"}
Feb 26 16:38:02 crc kubenswrapper[5033]: I0226 16:38:02.677963 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cb7hx" event={"ID":"113dcf2e-c0f7-40d9-8bcf-46469f586f39","Type":"ContainerStarted","Data":"fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067"}
Feb 26 16:38:02 crc kubenswrapper[5033]: I0226 16:38:02.679625 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535398-dzgcx" event={"ID":"6a26ae6f-ff53-4605-817f-54e3ef4d0e43","Type":"ContainerStarted","Data":"fac1768a6ce14e05d8d06bfd03cf3812390a8cfd98fc84a22a8defc55269bb29"}
Feb 26 16:38:02 crc kubenswrapper[5033]: I0226 16:38:02.705070 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kgmm6" podStartSLOduration=3.112541128 podStartE2EDuration="5.705051849s" podCreationTimestamp="2026-02-26 16:37:57 +0000 UTC" firstStartedPulling="2026-02-26 16:37:59.636828675 +0000 UTC m=+3406.849427345" lastFinishedPulling="2026-02-26 16:38:02.229339416 +0000 UTC m=+3409.441938066" observedRunningTime="2026-02-26 16:38:02.698301376 +0000 UTC m=+3409.910900036" watchObservedRunningTime="2026-02-26 16:38:02.705051849 +0000 UTC m=+3409.917650509"
Feb 26 16:38:02 crc kubenswrapper[5033]: I0226 16:38:02.724978 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cb7hx" podStartSLOduration=3.249346393 podStartE2EDuration="5.724959077s" podCreationTimestamp="2026-02-26 16:37:57 +0000 UTC" firstStartedPulling="2026-02-26 16:37:59.626752743 +0000 UTC m=+3406.839351443" lastFinishedPulling="2026-02-26 16:38:02.102365457 +0000 UTC m=+3409.314964127" observedRunningTime="2026-02-26 16:38:02.720356076 +0000 UTC m=+3409.932954736" watchObservedRunningTime="2026-02-26 16:38:02.724959077 +0000 UTC m=+3409.937557737"
Feb 26 16:38:02 crc kubenswrapper[5033]: I0226 16:38:02.737040 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535398-dzgcx" podStartSLOduration=1.636390169 podStartE2EDuration="2.737023457s" podCreationTimestamp="2026-02-26 16:38:00 +0000 UTC" firstStartedPulling="2026-02-26 16:38:01.064591628 +0000 UTC m=+3408.277190278" lastFinishedPulling="2026-02-26 16:38:02.165224916 +0000 UTC m=+3409.377823566" observedRunningTime="2026-02-26 16:38:02.733960202 +0000 UTC m=+3409.946558852" watchObservedRunningTime="2026-02-26 16:38:02.737023457 +0000 UTC m=+3409.949622117"
Feb 26 16:38:03 crc kubenswrapper[5033]: I0226 16:38:03.689185 5033 generic.go:334] "Generic (PLEG): container finished" podID="6a26ae6f-ff53-4605-817f-54e3ef4d0e43" containerID="fac1768a6ce14e05d8d06bfd03cf3812390a8cfd98fc84a22a8defc55269bb29" exitCode=0
Feb 26 16:38:03 crc kubenswrapper[5033]: I0226 16:38:03.689298 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535398-dzgcx" event={"ID":"6a26ae6f-ff53-4605-817f-54e3ef4d0e43","Type":"ContainerDied","Data":"fac1768a6ce14e05d8d06bfd03cf3812390a8cfd98fc84a22a8defc55269bb29"}
Feb 26 16:38:03 crc kubenswrapper[5033]: I0226 16:38:03.692328 5033 generic.go:334] "Generic (PLEG): container finished" podID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerID="f40869ccf6818908a9d537c69fb0bae1deb8016f8fb6b4ef48a145581d391780" exitCode=0
Feb 26 16:38:03 crc kubenswrapper[5033]: I0226 16:38:03.692374 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sgwf" event={"ID":"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8","Type":"ContainerDied","Data":"f40869ccf6818908a9d537c69fb0bae1deb8016f8fb6b4ef48a145581d391780"}
Feb 26 16:38:04 crc kubenswrapper[5033]: I0226 16:38:04.705158 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sgwf" event={"ID":"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8","Type":"ContainerStarted","Data":"c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6"}
Feb 26 16:38:04 crc kubenswrapper[5033]: I0226 16:38:04.732599 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6sgwf" podStartSLOduration=2.286098719 podStartE2EDuration="4.732583393s" podCreationTimestamp="2026-02-26 16:38:00 +0000 UTC" firstStartedPulling="2026-02-26 16:38:01.652103176 +0000 UTC m=+3408.864701826" lastFinishedPulling="2026-02-26 16:38:04.09858785 +0000 UTC m=+3411.311186500" observedRunningTime="2026-02-26 16:38:04.72705439 +0000 UTC m=+3411.939653050" watchObservedRunningTime="2026-02-26 16:38:04.732583393 +0000 UTC m=+3411.945182043"
Feb 26 16:38:05 crc kubenswrapper[5033]: I0226 16:38:05.038433 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535398-dzgcx"
Feb 26 16:38:05 crc kubenswrapper[5033]: I0226 16:38:05.228612 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96ctg\" (UniqueName: \"kubernetes.io/projected/6a26ae6f-ff53-4605-817f-54e3ef4d0e43-kube-api-access-96ctg\") pod \"6a26ae6f-ff53-4605-817f-54e3ef4d0e43\" (UID: \"6a26ae6f-ff53-4605-817f-54e3ef4d0e43\") "
Feb 26 16:38:05 crc kubenswrapper[5033]: I0226 16:38:05.234393 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a26ae6f-ff53-4605-817f-54e3ef4d0e43-kube-api-access-96ctg" (OuterVolumeSpecName: "kube-api-access-96ctg") pod "6a26ae6f-ff53-4605-817f-54e3ef4d0e43" (UID: "6a26ae6f-ff53-4605-817f-54e3ef4d0e43"). InnerVolumeSpecName "kube-api-access-96ctg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:38:05 crc kubenswrapper[5033]: I0226 16:38:05.330239 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96ctg\" (UniqueName: \"kubernetes.io/projected/6a26ae6f-ff53-4605-817f-54e3ef4d0e43-kube-api-access-96ctg\") on node \"crc\" DevicePath \"\""
Feb 26 16:38:05 crc kubenswrapper[5033]: I0226 16:38:05.716616 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535398-dzgcx" event={"ID":"6a26ae6f-ff53-4605-817f-54e3ef4d0e43","Type":"ContainerDied","Data":"7db171036f74f95e6b0d95d688947a5a4425fdf8d5148bacda28fe63de9b19f9"}
Feb 26 16:38:05 crc kubenswrapper[5033]: I0226 16:38:05.716663 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535398-dzgcx"
Feb 26 16:38:05 crc kubenswrapper[5033]: I0226 16:38:05.716687 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7db171036f74f95e6b0d95d688947a5a4425fdf8d5148bacda28fe63de9b19f9"
Feb 26 16:38:06 crc kubenswrapper[5033]: I0226 16:38:06.125312 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535392-jrg97"]
Feb 26 16:38:06 crc kubenswrapper[5033]: I0226 16:38:06.135711 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535392-jrg97"]
Feb 26 16:38:07 crc kubenswrapper[5033]: I0226 16:38:07.643476 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4462c6ff-e148-40af-baa3-7aa2f7950218" path="/var/lib/kubelet/pods/4462c6ff-e148-40af-baa3-7aa2f7950218/volumes"
Feb 26 16:38:08 crc kubenswrapper[5033]: I0226 16:38:08.113123 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cb7hx"
Feb 26 16:38:08 crc kubenswrapper[5033]: I0226 16:38:08.113205 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy"
pod="openshift-marketplace/redhat-marketplace-cb7hx" Feb 26 16:38:08 crc kubenswrapper[5033]: I0226 16:38:08.181458 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cb7hx" Feb 26 16:38:08 crc kubenswrapper[5033]: I0226 16:38:08.289750 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kgmm6" Feb 26 16:38:08 crc kubenswrapper[5033]: I0226 16:38:08.289804 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kgmm6" Feb 26 16:38:08 crc kubenswrapper[5033]: I0226 16:38:08.355411 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kgmm6" Feb 26 16:38:08 crc kubenswrapper[5033]: I0226 16:38:08.802322 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cb7hx" Feb 26 16:38:08 crc kubenswrapper[5033]: I0226 16:38:08.812717 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kgmm6" Feb 26 16:38:10 crc kubenswrapper[5033]: I0226 16:38:10.163958 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cb7hx"] Feb 26 16:38:10 crc kubenswrapper[5033]: I0226 16:38:10.724194 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6sgwf" Feb 26 16:38:10 crc kubenswrapper[5033]: I0226 16:38:10.724269 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6sgwf" Feb 26 16:38:10 crc kubenswrapper[5033]: I0226 16:38:10.767933 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgmm6"] Feb 26 16:38:10 crc kubenswrapper[5033]: I0226 16:38:10.774648 5033 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kgmm6" podUID="c0b242dd-f37c-401c-8805-f0ba2b93d160" containerName="registry-server" containerID="cri-o://e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074" gracePeriod=2 Feb 26 16:38:10 crc kubenswrapper[5033]: I0226 16:38:10.774879 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cb7hx" podUID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" containerName="registry-server" containerID="cri-o://fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067" gracePeriod=2 Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.244434 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cb7hx" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.252176 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgmm6" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.432147 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-utilities\") pod \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.432212 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-utilities\") pod \"c0b242dd-f37c-401c-8805-f0ba2b93d160\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.432338 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-catalog-content\") pod \"c0b242dd-f37c-401c-8805-f0ba2b93d160\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.432375 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-catalog-content\") pod \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.432477 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzp7v\" (UniqueName: \"kubernetes.io/projected/113dcf2e-c0f7-40d9-8bcf-46469f586f39-kube-api-access-rzp7v\") pod \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\" (UID: \"113dcf2e-c0f7-40d9-8bcf-46469f586f39\") " Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.432571 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rv5r\" (UniqueName: \"kubernetes.io/projected/c0b242dd-f37c-401c-8805-f0ba2b93d160-kube-api-access-9rv5r\") pod \"c0b242dd-f37c-401c-8805-f0ba2b93d160\" (UID: \"c0b242dd-f37c-401c-8805-f0ba2b93d160\") " Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.434080 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-utilities" (OuterVolumeSpecName: "utilities") pod "113dcf2e-c0f7-40d9-8bcf-46469f586f39" (UID: "113dcf2e-c0f7-40d9-8bcf-46469f586f39"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.435714 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-utilities" (OuterVolumeSpecName: "utilities") pod "c0b242dd-f37c-401c-8805-f0ba2b93d160" (UID: "c0b242dd-f37c-401c-8805-f0ba2b93d160"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.442362 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b242dd-f37c-401c-8805-f0ba2b93d160-kube-api-access-9rv5r" (OuterVolumeSpecName: "kube-api-access-9rv5r") pod "c0b242dd-f37c-401c-8805-f0ba2b93d160" (UID: "c0b242dd-f37c-401c-8805-f0ba2b93d160"). InnerVolumeSpecName "kube-api-access-9rv5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.445203 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113dcf2e-c0f7-40d9-8bcf-46469f586f39-kube-api-access-rzp7v" (OuterVolumeSpecName: "kube-api-access-rzp7v") pod "113dcf2e-c0f7-40d9-8bcf-46469f586f39" (UID: "113dcf2e-c0f7-40d9-8bcf-46469f586f39"). InnerVolumeSpecName "kube-api-access-rzp7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.524241 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "113dcf2e-c0f7-40d9-8bcf-46469f586f39" (UID: "113dcf2e-c0f7-40d9-8bcf-46469f586f39"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.525889 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0b242dd-f37c-401c-8805-f0ba2b93d160" (UID: "c0b242dd-f37c-401c-8805-f0ba2b93d160"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.533937 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rv5r\" (UniqueName: \"kubernetes.io/projected/c0b242dd-f37c-401c-8805-f0ba2b93d160-kube-api-access-9rv5r\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.533980 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.533995 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.534006 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0b242dd-f37c-401c-8805-f0ba2b93d160-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.534017 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/113dcf2e-c0f7-40d9-8bcf-46469f586f39-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.534028 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzp7v\" (UniqueName: 
\"kubernetes.io/projected/113dcf2e-c0f7-40d9-8bcf-46469f586f39-kube-api-access-rzp7v\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.785288 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6sgwf" podUID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerName="registry-server" probeResult="failure" output=< Feb 26 16:38:11 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Feb 26 16:38:11 crc kubenswrapper[5033]: > Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.790936 5033 generic.go:334] "Generic (PLEG): container finished" podID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" containerID="fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067" exitCode=0 Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.791070 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cb7hx" event={"ID":"113dcf2e-c0f7-40d9-8bcf-46469f586f39","Type":"ContainerDied","Data":"fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067"} Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.791127 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cb7hx" event={"ID":"113dcf2e-c0f7-40d9-8bcf-46469f586f39","Type":"ContainerDied","Data":"efb09fc7d0352550efca49fa27b5d51c2122e1a2d8a2c1f4942c87380e5dc1c6"} Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.791160 5033 scope.go:117] "RemoveContainer" containerID="fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.791622 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cb7hx" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.798923 5033 generic.go:334] "Generic (PLEG): container finished" podID="c0b242dd-f37c-401c-8805-f0ba2b93d160" containerID="e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074" exitCode=0 Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.799071 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgmm6" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.799409 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmm6" event={"ID":"c0b242dd-f37c-401c-8805-f0ba2b93d160","Type":"ContainerDied","Data":"e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074"} Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.799683 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgmm6" event={"ID":"c0b242dd-f37c-401c-8805-f0ba2b93d160","Type":"ContainerDied","Data":"8a496e0b315a6b81417497da5fabafea27c882d52bcbdd6a0d7989225fdd763a"} Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.837723 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cb7hx"] Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.841014 5033 scope.go:117] "RemoveContainer" containerID="c979da9f0ffc7b002192541d45e3ccacae5454f1ef2c4b5d7e4071fb4c1de2be" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.849682 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cb7hx"] Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.863137 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgmm6"] Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.869469 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-kgmm6"] Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.887993 5033 scope.go:117] "RemoveContainer" containerID="8864e3aedbb3fd3ed86bcac7a73068e56fb9a74fbb886411c31ab15af0a5df1a" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.912795 5033 scope.go:117] "RemoveContainer" containerID="fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067" Feb 26 16:38:11 crc kubenswrapper[5033]: E0226 16:38:11.913284 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067\": container with ID starting with fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067 not found: ID does not exist" containerID="fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.913345 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067"} err="failed to get container status \"fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067\": rpc error: code = NotFound desc = could not find container \"fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067\": container with ID starting with fa11ea5a259506df05e6ffba22cc87c3eb09d8e6388f0694ee0851e9e9018067 not found: ID does not exist" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.913385 5033 scope.go:117] "RemoveContainer" containerID="c979da9f0ffc7b002192541d45e3ccacae5454f1ef2c4b5d7e4071fb4c1de2be" Feb 26 16:38:11 crc kubenswrapper[5033]: E0226 16:38:11.914390 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c979da9f0ffc7b002192541d45e3ccacae5454f1ef2c4b5d7e4071fb4c1de2be\": container with ID starting with 
c979da9f0ffc7b002192541d45e3ccacae5454f1ef2c4b5d7e4071fb4c1de2be not found: ID does not exist" containerID="c979da9f0ffc7b002192541d45e3ccacae5454f1ef2c4b5d7e4071fb4c1de2be" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.914420 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c979da9f0ffc7b002192541d45e3ccacae5454f1ef2c4b5d7e4071fb4c1de2be"} err="failed to get container status \"c979da9f0ffc7b002192541d45e3ccacae5454f1ef2c4b5d7e4071fb4c1de2be\": rpc error: code = NotFound desc = could not find container \"c979da9f0ffc7b002192541d45e3ccacae5454f1ef2c4b5d7e4071fb4c1de2be\": container with ID starting with c979da9f0ffc7b002192541d45e3ccacae5454f1ef2c4b5d7e4071fb4c1de2be not found: ID does not exist" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.914439 5033 scope.go:117] "RemoveContainer" containerID="8864e3aedbb3fd3ed86bcac7a73068e56fb9a74fbb886411c31ab15af0a5df1a" Feb 26 16:38:11 crc kubenswrapper[5033]: E0226 16:38:11.914728 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8864e3aedbb3fd3ed86bcac7a73068e56fb9a74fbb886411c31ab15af0a5df1a\": container with ID starting with 8864e3aedbb3fd3ed86bcac7a73068e56fb9a74fbb886411c31ab15af0a5df1a not found: ID does not exist" containerID="8864e3aedbb3fd3ed86bcac7a73068e56fb9a74fbb886411c31ab15af0a5df1a" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.914753 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8864e3aedbb3fd3ed86bcac7a73068e56fb9a74fbb886411c31ab15af0a5df1a"} err="failed to get container status \"8864e3aedbb3fd3ed86bcac7a73068e56fb9a74fbb886411c31ab15af0a5df1a\": rpc error: code = NotFound desc = could not find container \"8864e3aedbb3fd3ed86bcac7a73068e56fb9a74fbb886411c31ab15af0a5df1a\": container with ID starting with 8864e3aedbb3fd3ed86bcac7a73068e56fb9a74fbb886411c31ab15af0a5df1a not found: ID does not 
exist" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.914811 5033 scope.go:117] "RemoveContainer" containerID="e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.961314 5033 scope.go:117] "RemoveContainer" containerID="8319d82efbcfd033ace88aa5f07fdeab672354f0fa7acbe3cce73d8172baa0e6" Feb 26 16:38:11 crc kubenswrapper[5033]: I0226 16:38:11.981376 5033 scope.go:117] "RemoveContainer" containerID="b2bc9781fcfd33c4028a87ede2fe81db63d323c9fb5372cae3ebd18196e67585" Feb 26 16:38:12 crc kubenswrapper[5033]: I0226 16:38:12.004042 5033 scope.go:117] "RemoveContainer" containerID="e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074" Feb 26 16:38:12 crc kubenswrapper[5033]: E0226 16:38:12.004592 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074\": container with ID starting with e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074 not found: ID does not exist" containerID="e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074" Feb 26 16:38:12 crc kubenswrapper[5033]: I0226 16:38:12.004658 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074"} err="failed to get container status \"e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074\": rpc error: code = NotFound desc = could not find container \"e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074\": container with ID starting with e9c5a5a27a17cf1fad1abf105894484c545f3315db8f73124ed36e88f93a3074 not found: ID does not exist" Feb 26 16:38:12 crc kubenswrapper[5033]: I0226 16:38:12.004704 5033 scope.go:117] "RemoveContainer" containerID="8319d82efbcfd033ace88aa5f07fdeab672354f0fa7acbe3cce73d8172baa0e6" Feb 26 16:38:12 crc 
kubenswrapper[5033]: E0226 16:38:12.005365 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8319d82efbcfd033ace88aa5f07fdeab672354f0fa7acbe3cce73d8172baa0e6\": container with ID starting with 8319d82efbcfd033ace88aa5f07fdeab672354f0fa7acbe3cce73d8172baa0e6 not found: ID does not exist" containerID="8319d82efbcfd033ace88aa5f07fdeab672354f0fa7acbe3cce73d8172baa0e6" Feb 26 16:38:12 crc kubenswrapper[5033]: I0226 16:38:12.005433 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8319d82efbcfd033ace88aa5f07fdeab672354f0fa7acbe3cce73d8172baa0e6"} err="failed to get container status \"8319d82efbcfd033ace88aa5f07fdeab672354f0fa7acbe3cce73d8172baa0e6\": rpc error: code = NotFound desc = could not find container \"8319d82efbcfd033ace88aa5f07fdeab672354f0fa7acbe3cce73d8172baa0e6\": container with ID starting with 8319d82efbcfd033ace88aa5f07fdeab672354f0fa7acbe3cce73d8172baa0e6 not found: ID does not exist" Feb 26 16:38:12 crc kubenswrapper[5033]: I0226 16:38:12.005471 5033 scope.go:117] "RemoveContainer" containerID="b2bc9781fcfd33c4028a87ede2fe81db63d323c9fb5372cae3ebd18196e67585" Feb 26 16:38:12 crc kubenswrapper[5033]: E0226 16:38:12.005963 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2bc9781fcfd33c4028a87ede2fe81db63d323c9fb5372cae3ebd18196e67585\": container with ID starting with b2bc9781fcfd33c4028a87ede2fe81db63d323c9fb5372cae3ebd18196e67585 not found: ID does not exist" containerID="b2bc9781fcfd33c4028a87ede2fe81db63d323c9fb5372cae3ebd18196e67585" Feb 26 16:38:12 crc kubenswrapper[5033]: I0226 16:38:12.006006 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2bc9781fcfd33c4028a87ede2fe81db63d323c9fb5372cae3ebd18196e67585"} err="failed to get container status 
\"b2bc9781fcfd33c4028a87ede2fe81db63d323c9fb5372cae3ebd18196e67585\": rpc error: code = NotFound desc = could not find container \"b2bc9781fcfd33c4028a87ede2fe81db63d323c9fb5372cae3ebd18196e67585\": container with ID starting with b2bc9781fcfd33c4028a87ede2fe81db63d323c9fb5372cae3ebd18196e67585 not found: ID does not exist" Feb 26 16:38:13 crc kubenswrapper[5033]: I0226 16:38:13.648373 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" path="/var/lib/kubelet/pods/113dcf2e-c0f7-40d9-8bcf-46469f586f39/volumes" Feb 26 16:38:13 crc kubenswrapper[5033]: I0226 16:38:13.649841 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b242dd-f37c-401c-8805-f0ba2b93d160" path="/var/lib/kubelet/pods/c0b242dd-f37c-401c-8805-f0ba2b93d160/volumes" Feb 26 16:38:20 crc kubenswrapper[5033]: I0226 16:38:20.189068 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:38:20 crc kubenswrapper[5033]: I0226 16:38:20.189689 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:38:20 crc kubenswrapper[5033]: I0226 16:38:20.783541 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6sgwf" Feb 26 16:38:20 crc kubenswrapper[5033]: I0226 16:38:20.836767 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6sgwf" Feb 26 16:38:21 crc kubenswrapper[5033]: I0226 
16:38:21.021638 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6sgwf"] Feb 26 16:38:21 crc kubenswrapper[5033]: I0226 16:38:21.904809 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6sgwf" podUID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerName="registry-server" containerID="cri-o://c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6" gracePeriod=2 Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.298493 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6sgwf" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.403647 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-utilities\") pod \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.403706 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-catalog-content\") pod \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.403816 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kcvf\" (UniqueName: \"kubernetes.io/projected/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-kube-api-access-8kcvf\") pod \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\" (UID: \"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8\") " Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.404706 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-utilities" (OuterVolumeSpecName: 
"utilities") pod "d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" (UID: "d8aa85b5-1e7e-471a-9690-3e74c80dc3b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.411199 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-kube-api-access-8kcvf" (OuterVolumeSpecName: "kube-api-access-8kcvf") pod "d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" (UID: "d8aa85b5-1e7e-471a-9690-3e74c80dc3b8"). InnerVolumeSpecName "kube-api-access-8kcvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.504906 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.504939 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kcvf\" (UniqueName: \"kubernetes.io/projected/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-kube-api-access-8kcvf\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.531022 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" (UID: "d8aa85b5-1e7e-471a-9690-3e74c80dc3b8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.606552 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.914902 5033 generic.go:334] "Generic (PLEG): container finished" podID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerID="c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6" exitCode=0 Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.914952 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sgwf" event={"ID":"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8","Type":"ContainerDied","Data":"c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6"} Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.914970 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6sgwf" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.914983 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sgwf" event={"ID":"d8aa85b5-1e7e-471a-9690-3e74c80dc3b8","Type":"ContainerDied","Data":"e9aba6dc7b00feaddf9a63a8bc5c6d26d3401e152696e2601617ced15a6696b4"} Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.915006 5033 scope.go:117] "RemoveContainer" containerID="c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.936408 5033 scope.go:117] "RemoveContainer" containerID="f40869ccf6818908a9d537c69fb0bae1deb8016f8fb6b4ef48a145581d391780" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.950073 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6sgwf"] Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.957135 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6sgwf"] Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.974754 5033 scope.go:117] "RemoveContainer" containerID="e44be8f989813962970d5ddcbe5b8f5a942aac4a0787514bfe278c665ffe62d6" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.994782 5033 scope.go:117] "RemoveContainer" containerID="c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6" Feb 26 16:38:22 crc kubenswrapper[5033]: E0226 16:38:22.995351 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6\": container with ID starting with c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6 not found: ID does not exist" containerID="c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.995409 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6"} err="failed to get container status \"c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6\": rpc error: code = NotFound desc = could not find container \"c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6\": container with ID starting with c4590aa1c3f13cba61eb4a245fabc0400dbc3d9ababa915bed1c209a3d3d9fb6 not found: ID does not exist" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.995443 5033 scope.go:117] "RemoveContainer" containerID="f40869ccf6818908a9d537c69fb0bae1deb8016f8fb6b4ef48a145581d391780" Feb 26 16:38:22 crc kubenswrapper[5033]: E0226 16:38:22.995763 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f40869ccf6818908a9d537c69fb0bae1deb8016f8fb6b4ef48a145581d391780\": container with ID starting with f40869ccf6818908a9d537c69fb0bae1deb8016f8fb6b4ef48a145581d391780 not found: ID does not exist" containerID="f40869ccf6818908a9d537c69fb0bae1deb8016f8fb6b4ef48a145581d391780" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.995816 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40869ccf6818908a9d537c69fb0bae1deb8016f8fb6b4ef48a145581d391780"} err="failed to get container status \"f40869ccf6818908a9d537c69fb0bae1deb8016f8fb6b4ef48a145581d391780\": rpc error: code = NotFound desc = could not find container \"f40869ccf6818908a9d537c69fb0bae1deb8016f8fb6b4ef48a145581d391780\": container with ID starting with f40869ccf6818908a9d537c69fb0bae1deb8016f8fb6b4ef48a145581d391780 not found: ID does not exist" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.995856 5033 scope.go:117] "RemoveContainer" containerID="e44be8f989813962970d5ddcbe5b8f5a942aac4a0787514bfe278c665ffe62d6" Feb 26 16:38:22 crc kubenswrapper[5033]: E0226 
16:38:22.996133 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e44be8f989813962970d5ddcbe5b8f5a942aac4a0787514bfe278c665ffe62d6\": container with ID starting with e44be8f989813962970d5ddcbe5b8f5a942aac4a0787514bfe278c665ffe62d6 not found: ID does not exist" containerID="e44be8f989813962970d5ddcbe5b8f5a942aac4a0787514bfe278c665ffe62d6" Feb 26 16:38:22 crc kubenswrapper[5033]: I0226 16:38:22.996173 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e44be8f989813962970d5ddcbe5b8f5a942aac4a0787514bfe278c665ffe62d6"} err="failed to get container status \"e44be8f989813962970d5ddcbe5b8f5a942aac4a0787514bfe278c665ffe62d6\": rpc error: code = NotFound desc = could not find container \"e44be8f989813962970d5ddcbe5b8f5a942aac4a0787514bfe278c665ffe62d6\": container with ID starting with e44be8f989813962970d5ddcbe5b8f5a942aac4a0787514bfe278c665ffe62d6 not found: ID does not exist" Feb 26 16:38:23 crc kubenswrapper[5033]: I0226 16:38:23.641547 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" path="/var/lib/kubelet/pods/d8aa85b5-1e7e-471a-9690-3e74c80dc3b8/volumes" Feb 26 16:38:30 crc kubenswrapper[5033]: I0226 16:38:30.925520 5033 scope.go:117] "RemoveContainer" containerID="cd4f707e8a8f179cbcdb59053fa4ad8b59fe3317d17d2aa9e6a4c14781e27ef9" Feb 26 16:38:50 crc kubenswrapper[5033]: I0226 16:38:50.189076 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:38:50 crc kubenswrapper[5033]: I0226 16:38:50.189823 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:39:20 crc kubenswrapper[5033]: I0226 16:39:20.189536 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:39:20 crc kubenswrapper[5033]: I0226 16:39:20.190379 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:39:20 crc kubenswrapper[5033]: I0226 16:39:20.190541 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 16:39:20 crc kubenswrapper[5033]: I0226 16:39:20.191727 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d4991683170439fd91b891f66e6b5d338934da27938fbdc44f3499149cd3e54"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:39:20 crc kubenswrapper[5033]: I0226 16:39:20.191847 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://3d4991683170439fd91b891f66e6b5d338934da27938fbdc44f3499149cd3e54" gracePeriod=600 Feb 26 
16:39:20 crc kubenswrapper[5033]: I0226 16:39:20.427232 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="3d4991683170439fd91b891f66e6b5d338934da27938fbdc44f3499149cd3e54" exitCode=0 Feb 26 16:39:20 crc kubenswrapper[5033]: I0226 16:39:20.427290 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"3d4991683170439fd91b891f66e6b5d338934da27938fbdc44f3499149cd3e54"} Feb 26 16:39:20 crc kubenswrapper[5033]: I0226 16:39:20.427327 5033 scope.go:117] "RemoveContainer" containerID="0e03af7588317f9e59c12d3b4eef73af66e199e5842025f95a6218d6fdb4f129" Feb 26 16:39:21 crc kubenswrapper[5033]: I0226 16:39:21.440741 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"} Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.157721 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535400-9k4zj"] Feb 26 16:40:00 crc kubenswrapper[5033]: E0226 16:40:00.158852 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.158874 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[5033]: E0226 16:40:00.158895 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerName="extract-utilities" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.158905 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerName="extract-utilities" Feb 26 16:40:00 crc kubenswrapper[5033]: E0226 16:40:00.158920 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b242dd-f37c-401c-8805-f0ba2b93d160" containerName="extract-content" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.158930 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b242dd-f37c-401c-8805-f0ba2b93d160" containerName="extract-content" Feb 26 16:40:00 crc kubenswrapper[5033]: E0226 16:40:00.158945 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a26ae6f-ff53-4605-817f-54e3ef4d0e43" containerName="oc" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.158955 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a26ae6f-ff53-4605-817f-54e3ef4d0e43" containerName="oc" Feb 26 16:40:00 crc kubenswrapper[5033]: E0226 16:40:00.158981 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" containerName="extract-utilities" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.158991 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" containerName="extract-utilities" Feb 26 16:40:00 crc kubenswrapper[5033]: E0226 16:40:00.159009 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" containerName="extract-content" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.159019 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" containerName="extract-content" Feb 26 16:40:00 crc kubenswrapper[5033]: E0226 16:40:00.159038 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b242dd-f37c-401c-8805-f0ba2b93d160" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.159048 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c0b242dd-f37c-401c-8805-f0ba2b93d160" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[5033]: E0226 16:40:00.159070 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b242dd-f37c-401c-8805-f0ba2b93d160" containerName="extract-utilities" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.159082 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b242dd-f37c-401c-8805-f0ba2b93d160" containerName="extract-utilities" Feb 26 16:40:00 crc kubenswrapper[5033]: E0226 16:40:00.159100 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerName="extract-content" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.159112 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerName="extract-content" Feb 26 16:40:00 crc kubenswrapper[5033]: E0226 16:40:00.159130 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.159140 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.159402 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b242dd-f37c-401c-8805-f0ba2b93d160" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.159433 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="113dcf2e-c0f7-40d9-8bcf-46469f586f39" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.159453 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a26ae6f-ff53-4605-817f-54e3ef4d0e43" containerName="oc" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.159471 5033 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d8aa85b5-1e7e-471a-9690-3e74c80dc3b8" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.159992 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535400-9k4zj" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.163088 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.163470 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.163829 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.165686 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535400-9k4zj"] Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.354541 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8w7n\" (UniqueName: \"kubernetes.io/projected/0c5966b7-5f6b-4c57-ad01-abf09daf1aee-kube-api-access-n8w7n\") pod \"auto-csr-approver-29535400-9k4zj\" (UID: \"0c5966b7-5f6b-4c57-ad01-abf09daf1aee\") " pod="openshift-infra/auto-csr-approver-29535400-9k4zj" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.456959 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8w7n\" (UniqueName: \"kubernetes.io/projected/0c5966b7-5f6b-4c57-ad01-abf09daf1aee-kube-api-access-n8w7n\") pod \"auto-csr-approver-29535400-9k4zj\" (UID: \"0c5966b7-5f6b-4c57-ad01-abf09daf1aee\") " pod="openshift-infra/auto-csr-approver-29535400-9k4zj" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.478512 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8w7n\" 
(UniqueName: \"kubernetes.io/projected/0c5966b7-5f6b-4c57-ad01-abf09daf1aee-kube-api-access-n8w7n\") pod \"auto-csr-approver-29535400-9k4zj\" (UID: \"0c5966b7-5f6b-4c57-ad01-abf09daf1aee\") " pod="openshift-infra/auto-csr-approver-29535400-9k4zj" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.495678 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535400-9k4zj" Feb 26 16:40:00 crc kubenswrapper[5033]: I0226 16:40:00.985787 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535400-9k4zj"] Feb 26 16:40:01 crc kubenswrapper[5033]: I0226 16:40:01.783184 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535400-9k4zj" event={"ID":"0c5966b7-5f6b-4c57-ad01-abf09daf1aee","Type":"ContainerStarted","Data":"6e89c9ca125cd35de4ec5aa94290c1d95f97e8b4c71a6c64e1f798d582001b76"} Feb 26 16:40:02 crc kubenswrapper[5033]: I0226 16:40:02.791063 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535400-9k4zj" event={"ID":"0c5966b7-5f6b-4c57-ad01-abf09daf1aee","Type":"ContainerStarted","Data":"36eaaa0bea08e5536973c1a75a3013053319516982a4e97a097e57e3cbd13038"} Feb 26 16:40:02 crc kubenswrapper[5033]: I0226 16:40:02.808223 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535400-9k4zj" podStartSLOduration=1.453425517 podStartE2EDuration="2.808201128s" podCreationTimestamp="2026-02-26 16:40:00 +0000 UTC" firstStartedPulling="2026-02-26 16:40:00.991381063 +0000 UTC m=+3528.203979703" lastFinishedPulling="2026-02-26 16:40:02.346156654 +0000 UTC m=+3529.558755314" observedRunningTime="2026-02-26 16:40:02.80659782 +0000 UTC m=+3530.019196470" watchObservedRunningTime="2026-02-26 16:40:02.808201128 +0000 UTC m=+3530.020799778" Feb 26 16:40:03 crc kubenswrapper[5033]: I0226 16:40:03.803636 5033 generic.go:334] "Generic 
(PLEG): container finished" podID="0c5966b7-5f6b-4c57-ad01-abf09daf1aee" containerID="36eaaa0bea08e5536973c1a75a3013053319516982a4e97a097e57e3cbd13038" exitCode=0 Feb 26 16:40:03 crc kubenswrapper[5033]: I0226 16:40:03.803707 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535400-9k4zj" event={"ID":"0c5966b7-5f6b-4c57-ad01-abf09daf1aee","Type":"ContainerDied","Data":"36eaaa0bea08e5536973c1a75a3013053319516982a4e97a097e57e3cbd13038"} Feb 26 16:40:05 crc kubenswrapper[5033]: I0226 16:40:05.073818 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535400-9k4zj" Feb 26 16:40:05 crc kubenswrapper[5033]: I0226 16:40:05.227545 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8w7n\" (UniqueName: \"kubernetes.io/projected/0c5966b7-5f6b-4c57-ad01-abf09daf1aee-kube-api-access-n8w7n\") pod \"0c5966b7-5f6b-4c57-ad01-abf09daf1aee\" (UID: \"0c5966b7-5f6b-4c57-ad01-abf09daf1aee\") " Feb 26 16:40:05 crc kubenswrapper[5033]: I0226 16:40:05.235681 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c5966b7-5f6b-4c57-ad01-abf09daf1aee-kube-api-access-n8w7n" (OuterVolumeSpecName: "kube-api-access-n8w7n") pod "0c5966b7-5f6b-4c57-ad01-abf09daf1aee" (UID: "0c5966b7-5f6b-4c57-ad01-abf09daf1aee"). InnerVolumeSpecName "kube-api-access-n8w7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:40:05 crc kubenswrapper[5033]: I0226 16:40:05.328907 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8w7n\" (UniqueName: \"kubernetes.io/projected/0c5966b7-5f6b-4c57-ad01-abf09daf1aee-kube-api-access-n8w7n\") on node \"crc\" DevicePath \"\"" Feb 26 16:40:05 crc kubenswrapper[5033]: I0226 16:40:05.822034 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535400-9k4zj" event={"ID":"0c5966b7-5f6b-4c57-ad01-abf09daf1aee","Type":"ContainerDied","Data":"6e89c9ca125cd35de4ec5aa94290c1d95f97e8b4c71a6c64e1f798d582001b76"} Feb 26 16:40:05 crc kubenswrapper[5033]: I0226 16:40:05.822088 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535400-9k4zj" Feb 26 16:40:05 crc kubenswrapper[5033]: I0226 16:40:05.822089 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e89c9ca125cd35de4ec5aa94290c1d95f97e8b4c71a6c64e1f798d582001b76" Feb 26 16:40:05 crc kubenswrapper[5033]: I0226 16:40:05.894097 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535394-dmwmj"] Feb 26 16:40:05 crc kubenswrapper[5033]: I0226 16:40:05.901813 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535394-dmwmj"] Feb 26 16:40:07 crc kubenswrapper[5033]: I0226 16:40:07.645122 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d545ecd-2ba8-49aa-bbdd-84590b9c211a" path="/var/lib/kubelet/pods/0d545ecd-2ba8-49aa-bbdd-84590b9c211a/volumes" Feb 26 16:40:31 crc kubenswrapper[5033]: I0226 16:40:31.105816 5033 scope.go:117] "RemoveContainer" containerID="4da70d8fd4ff1db94edc3ed5d1673a81d0af04bf2995d1c88331e5048fcdfe1d" Feb 26 16:41:20 crc kubenswrapper[5033]: I0226 16:41:20.189022 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:41:20 crc kubenswrapper[5033]: I0226 16:41:20.189838 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:41:50 crc kubenswrapper[5033]: I0226 16:41:50.189015 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:41:50 crc kubenswrapper[5033]: I0226 16:41:50.189938 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.151358 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535402-ffz7r"] Feb 26 16:42:00 crc kubenswrapper[5033]: E0226 16:42:00.152301 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c5966b7-5f6b-4c57-ad01-abf09daf1aee" containerName="oc" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.152318 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5966b7-5f6b-4c57-ad01-abf09daf1aee" containerName="oc" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.152552 5033 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0c5966b7-5f6b-4c57-ad01-abf09daf1aee" containerName="oc" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.153119 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535402-ffz7r" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.154755 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.156850 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.156676 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.170251 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535402-ffz7r"] Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.247079 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffdc6\" (UniqueName: \"kubernetes.io/projected/2e304789-0679-4f67-b13d-2eec7a07331f-kube-api-access-ffdc6\") pod \"auto-csr-approver-29535402-ffz7r\" (UID: \"2e304789-0679-4f67-b13d-2eec7a07331f\") " pod="openshift-infra/auto-csr-approver-29535402-ffz7r" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.349519 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffdc6\" (UniqueName: \"kubernetes.io/projected/2e304789-0679-4f67-b13d-2eec7a07331f-kube-api-access-ffdc6\") pod \"auto-csr-approver-29535402-ffz7r\" (UID: \"2e304789-0679-4f67-b13d-2eec7a07331f\") " pod="openshift-infra/auto-csr-approver-29535402-ffz7r" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.394886 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ffdc6\" (UniqueName: \"kubernetes.io/projected/2e304789-0679-4f67-b13d-2eec7a07331f-kube-api-access-ffdc6\") pod \"auto-csr-approver-29535402-ffz7r\" (UID: \"2e304789-0679-4f67-b13d-2eec7a07331f\") " pod="openshift-infra/auto-csr-approver-29535402-ffz7r" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.477581 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535402-ffz7r" Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.884640 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535402-ffz7r"] Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.895806 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:42:00 crc kubenswrapper[5033]: I0226 16:42:00.904418 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535402-ffz7r" event={"ID":"2e304789-0679-4f67-b13d-2eec7a07331f","Type":"ContainerStarted","Data":"be19485f4e8c23ffe7304f5fe1b68105ba931dea24e142187882e994a3e7aa21"} Feb 26 16:42:02 crc kubenswrapper[5033]: I0226 16:42:02.923416 5033 generic.go:334] "Generic (PLEG): container finished" podID="2e304789-0679-4f67-b13d-2eec7a07331f" containerID="db424cca7fac63229e321b5ed3a0fceee677dd241e38b7fe08c5200830b4a16c" exitCode=0 Feb 26 16:42:02 crc kubenswrapper[5033]: I0226 16:42:02.923498 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535402-ffz7r" event={"ID":"2e304789-0679-4f67-b13d-2eec7a07331f","Type":"ContainerDied","Data":"db424cca7fac63229e321b5ed3a0fceee677dd241e38b7fe08c5200830b4a16c"} Feb 26 16:42:04 crc kubenswrapper[5033]: I0226 16:42:04.245934 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535402-ffz7r" Feb 26 16:42:04 crc kubenswrapper[5033]: I0226 16:42:04.312949 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffdc6\" (UniqueName: \"kubernetes.io/projected/2e304789-0679-4f67-b13d-2eec7a07331f-kube-api-access-ffdc6\") pod \"2e304789-0679-4f67-b13d-2eec7a07331f\" (UID: \"2e304789-0679-4f67-b13d-2eec7a07331f\") " Feb 26 16:42:04 crc kubenswrapper[5033]: I0226 16:42:04.318629 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e304789-0679-4f67-b13d-2eec7a07331f-kube-api-access-ffdc6" (OuterVolumeSpecName: "kube-api-access-ffdc6") pod "2e304789-0679-4f67-b13d-2eec7a07331f" (UID: "2e304789-0679-4f67-b13d-2eec7a07331f"). InnerVolumeSpecName "kube-api-access-ffdc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:42:04 crc kubenswrapper[5033]: I0226 16:42:04.414960 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffdc6\" (UniqueName: \"kubernetes.io/projected/2e304789-0679-4f67-b13d-2eec7a07331f-kube-api-access-ffdc6\") on node \"crc\" DevicePath \"\"" Feb 26 16:42:04 crc kubenswrapper[5033]: I0226 16:42:04.944973 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535402-ffz7r" event={"ID":"2e304789-0679-4f67-b13d-2eec7a07331f","Type":"ContainerDied","Data":"be19485f4e8c23ffe7304f5fe1b68105ba931dea24e142187882e994a3e7aa21"} Feb 26 16:42:04 crc kubenswrapper[5033]: I0226 16:42:04.945053 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be19485f4e8c23ffe7304f5fe1b68105ba931dea24e142187882e994a3e7aa21" Feb 26 16:42:04 crc kubenswrapper[5033]: I0226 16:42:04.945070 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535402-ffz7r" Feb 26 16:42:05 crc kubenswrapper[5033]: I0226 16:42:05.337294 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535396-sj7rb"] Feb 26 16:42:05 crc kubenswrapper[5033]: I0226 16:42:05.343704 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535396-sj7rb"] Feb 26 16:42:05 crc kubenswrapper[5033]: I0226 16:42:05.646149 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf37743b-0d6c-4977-b6c7-23000743139a" path="/var/lib/kubelet/pods/bf37743b-0d6c-4977-b6c7-23000743139a/volumes" Feb 26 16:42:20 crc kubenswrapper[5033]: I0226 16:42:20.189682 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:42:20 crc kubenswrapper[5033]: I0226 16:42:20.190517 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:42:20 crc kubenswrapper[5033]: I0226 16:42:20.190607 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 16:42:20 crc kubenswrapper[5033]: I0226 16:42:20.191655 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 16:42:20 crc kubenswrapper[5033]: I0226 16:42:20.191794 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe" gracePeriod=600
Feb 26 16:42:20 crc kubenswrapper[5033]: E0226 16:42:20.318912 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:42:21 crc kubenswrapper[5033]: I0226 16:42:21.105299 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe" exitCode=0
Feb 26 16:42:21 crc kubenswrapper[5033]: I0226 16:42:21.105361 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"}
Feb 26 16:42:21 crc kubenswrapper[5033]: I0226 16:42:21.105414 5033 scope.go:117] "RemoveContainer" containerID="3d4991683170439fd91b891f66e6b5d338934da27938fbdc44f3499149cd3e54"
Feb 26 16:42:21 crc kubenswrapper[5033]: I0226 16:42:21.106169 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:42:21 crc kubenswrapper[5033]: E0226 16:42:21.106722 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:42:31 crc kubenswrapper[5033]: I0226 16:42:31.202789 5033 scope.go:117] "RemoveContainer" containerID="c8a4e098ccef81f5c108cab5f42211f88303c9bf3e819cac8158e97e6422740a"
Feb 26 16:42:34 crc kubenswrapper[5033]: I0226 16:42:34.628225 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:42:34 crc kubenswrapper[5033]: E0226 16:42:34.628885 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:42:48 crc kubenswrapper[5033]: I0226 16:42:48.627990 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:42:48 crc kubenswrapper[5033]: E0226 16:42:48.628708 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:42:59 crc kubenswrapper[5033]: I0226 16:42:59.629461 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:42:59 crc kubenswrapper[5033]: E0226 16:42:59.630559 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:43:12 crc kubenswrapper[5033]: I0226 16:43:12.629032 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:43:12 crc kubenswrapper[5033]: E0226 16:43:12.629908 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:43:25 crc kubenswrapper[5033]: I0226 16:43:25.628151 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:43:25 crc kubenswrapper[5033]: E0226 16:43:25.629060 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:43:31 crc kubenswrapper[5033]: I0226 16:43:31.274489 5033 scope.go:117] "RemoveContainer" containerID="73a2b43d346ef902aa521387094ce07560c9d1212b0422a14d3a13ffc748e0ea"
Feb 26 16:43:31 crc kubenswrapper[5033]: I0226 16:43:31.305667 5033 scope.go:117] "RemoveContainer" containerID="a3a9ff35f2ccd99dfd6fdea414d8ed50a3c60d4ad05a4049fe830d53532bc7af"
Feb 26 16:43:31 crc kubenswrapper[5033]: I0226 16:43:31.328330 5033 scope.go:117] "RemoveContainer" containerID="4cfedeed38b2cdc1a0dfe97eadba4856d1508fe0cd4d44279fc949532b717f3f"
Feb 26 16:43:40 crc kubenswrapper[5033]: I0226 16:43:40.628039 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:43:40 crc kubenswrapper[5033]: E0226 16:43:40.629082 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:43:54 crc kubenswrapper[5033]: I0226 16:43:54.627642 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:43:54 crc kubenswrapper[5033]: E0226 16:43:54.629704 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.156922 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535404-8bcl2"]
Feb 26 16:44:00 crc kubenswrapper[5033]: E0226 16:44:00.157772 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e304789-0679-4f67-b13d-2eec7a07331f" containerName="oc"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.157794 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e304789-0679-4f67-b13d-2eec7a07331f" containerName="oc"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.158059 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e304789-0679-4f67-b13d-2eec7a07331f" containerName="oc"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.158782 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535404-8bcl2"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.162806 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.162886 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.163341 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.174391 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535404-8bcl2"]
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.324765 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnjfj\" (UniqueName: \"kubernetes.io/projected/cb078472-5501-419f-b276-7a91c83ffdc5-kube-api-access-cnjfj\") pod \"auto-csr-approver-29535404-8bcl2\" (UID: \"cb078472-5501-419f-b276-7a91c83ffdc5\") " pod="openshift-infra/auto-csr-approver-29535404-8bcl2"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.426365 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnjfj\" (UniqueName: \"kubernetes.io/projected/cb078472-5501-419f-b276-7a91c83ffdc5-kube-api-access-cnjfj\") pod \"auto-csr-approver-29535404-8bcl2\" (UID: \"cb078472-5501-419f-b276-7a91c83ffdc5\") " pod="openshift-infra/auto-csr-approver-29535404-8bcl2"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.446606 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnjfj\" (UniqueName: \"kubernetes.io/projected/cb078472-5501-419f-b276-7a91c83ffdc5-kube-api-access-cnjfj\") pod \"auto-csr-approver-29535404-8bcl2\" (UID: \"cb078472-5501-419f-b276-7a91c83ffdc5\") " pod="openshift-infra/auto-csr-approver-29535404-8bcl2"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.483048 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535404-8bcl2"
Feb 26 16:44:00 crc kubenswrapper[5033]: I0226 16:44:00.731487 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535404-8bcl2"]
Feb 26 16:44:01 crc kubenswrapper[5033]: I0226 16:44:01.031605 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535404-8bcl2" event={"ID":"cb078472-5501-419f-b276-7a91c83ffdc5","Type":"ContainerStarted","Data":"2baa9169cdd4c8f5bd15f96bc82fd9e566f5c9cd6b4d473877eb133b6498fdc2"}
Feb 26 16:44:03 crc kubenswrapper[5033]: I0226 16:44:03.050824 5033 generic.go:334] "Generic (PLEG): container finished" podID="cb078472-5501-419f-b276-7a91c83ffdc5" containerID="7229e103d78a6c4fa54277770509de5aaeadb7d3c0d424de86a8c5a232134397" exitCode=0
Feb 26 16:44:03 crc kubenswrapper[5033]: I0226 16:44:03.051123 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535404-8bcl2" event={"ID":"cb078472-5501-419f-b276-7a91c83ffdc5","Type":"ContainerDied","Data":"7229e103d78a6c4fa54277770509de5aaeadb7d3c0d424de86a8c5a232134397"}
Feb 26 16:44:04 crc kubenswrapper[5033]: I0226 16:44:04.377724 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535404-8bcl2"
Feb 26 16:44:04 crc kubenswrapper[5033]: I0226 16:44:04.483672 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnjfj\" (UniqueName: \"kubernetes.io/projected/cb078472-5501-419f-b276-7a91c83ffdc5-kube-api-access-cnjfj\") pod \"cb078472-5501-419f-b276-7a91c83ffdc5\" (UID: \"cb078472-5501-419f-b276-7a91c83ffdc5\") "
Feb 26 16:44:04 crc kubenswrapper[5033]: I0226 16:44:04.491314 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb078472-5501-419f-b276-7a91c83ffdc5-kube-api-access-cnjfj" (OuterVolumeSpecName: "kube-api-access-cnjfj") pod "cb078472-5501-419f-b276-7a91c83ffdc5" (UID: "cb078472-5501-419f-b276-7a91c83ffdc5"). InnerVolumeSpecName "kube-api-access-cnjfj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:44:04 crc kubenswrapper[5033]: I0226 16:44:04.586054 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnjfj\" (UniqueName: \"kubernetes.io/projected/cb078472-5501-419f-b276-7a91c83ffdc5-kube-api-access-cnjfj\") on node \"crc\" DevicePath \"\""
Feb 26 16:44:05 crc kubenswrapper[5033]: I0226 16:44:05.069851 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535404-8bcl2" event={"ID":"cb078472-5501-419f-b276-7a91c83ffdc5","Type":"ContainerDied","Data":"2baa9169cdd4c8f5bd15f96bc82fd9e566f5c9cd6b4d473877eb133b6498fdc2"}
Feb 26 16:44:05 crc kubenswrapper[5033]: I0226 16:44:05.069904 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2baa9169cdd4c8f5bd15f96bc82fd9e566f5c9cd6b4d473877eb133b6498fdc2"
Feb 26 16:44:05 crc kubenswrapper[5033]: I0226 16:44:05.069928 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535404-8bcl2"
Feb 26 16:44:05 crc kubenswrapper[5033]: I0226 16:44:05.458336 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535398-dzgcx"]
Feb 26 16:44:05 crc kubenswrapper[5033]: I0226 16:44:05.462693 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535398-dzgcx"]
Feb 26 16:44:05 crc kubenswrapper[5033]: I0226 16:44:05.628066 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:44:05 crc kubenswrapper[5033]: E0226 16:44:05.628573 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:44:05 crc kubenswrapper[5033]: I0226 16:44:05.638134 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a26ae6f-ff53-4605-817f-54e3ef4d0e43" path="/var/lib/kubelet/pods/6a26ae6f-ff53-4605-817f-54e3ef4d0e43/volumes"
Feb 26 16:44:19 crc kubenswrapper[5033]: I0226 16:44:19.628717 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:44:19 crc kubenswrapper[5033]: E0226 16:44:19.629436 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:44:30 crc kubenswrapper[5033]: I0226 16:44:30.628451 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:44:30 crc kubenswrapper[5033]: E0226 16:44:30.629772 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:44:31 crc kubenswrapper[5033]: I0226 16:44:31.377386 5033 scope.go:117] "RemoveContainer" containerID="fac1768a6ce14e05d8d06bfd03cf3812390a8cfd98fc84a22a8defc55269bb29"
Feb 26 16:44:42 crc kubenswrapper[5033]: I0226 16:44:42.629328 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:44:42 crc kubenswrapper[5033]: E0226 16:44:42.630456 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:44:55 crc kubenswrapper[5033]: I0226 16:44:55.629061 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:44:55 crc kubenswrapper[5033]: E0226 16:44:55.631186 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.149160 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"]
Feb 26 16:45:00 crc kubenswrapper[5033]: E0226 16:45:00.149913 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb078472-5501-419f-b276-7a91c83ffdc5" containerName="oc"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.149929 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb078472-5501-419f-b276-7a91c83ffdc5" containerName="oc"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.150150 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb078472-5501-419f-b276-7a91c83ffdc5" containerName="oc"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.150738 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.152770 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.154126 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.177483 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"]
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.262642 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24ce89a2-cf67-4dee-8c2e-a4015b437452-secret-volume\") pod \"collect-profiles-29535405-spslh\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.262709 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24ce89a2-cf67-4dee-8c2e-a4015b437452-config-volume\") pod \"collect-profiles-29535405-spslh\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.262762 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwhgd\" (UniqueName: \"kubernetes.io/projected/24ce89a2-cf67-4dee-8c2e-a4015b437452-kube-api-access-rwhgd\") pod \"collect-profiles-29535405-spslh\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.364725 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24ce89a2-cf67-4dee-8c2e-a4015b437452-secret-volume\") pod \"collect-profiles-29535405-spslh\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.364795 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24ce89a2-cf67-4dee-8c2e-a4015b437452-config-volume\") pod \"collect-profiles-29535405-spslh\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.365020 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwhgd\" (UniqueName: \"kubernetes.io/projected/24ce89a2-cf67-4dee-8c2e-a4015b437452-kube-api-access-rwhgd\") pod \"collect-profiles-29535405-spslh\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.366702 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24ce89a2-cf67-4dee-8c2e-a4015b437452-config-volume\") pod \"collect-profiles-29535405-spslh\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.372302 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24ce89a2-cf67-4dee-8c2e-a4015b437452-secret-volume\") pod \"collect-profiles-29535405-spslh\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.393535 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwhgd\" (UniqueName: \"kubernetes.io/projected/24ce89a2-cf67-4dee-8c2e-a4015b437452-kube-api-access-rwhgd\") pod \"collect-profiles-29535405-spslh\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.473087 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:00 crc kubenswrapper[5033]: I0226 16:45:00.978807 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"]
Feb 26 16:45:01 crc kubenswrapper[5033]: I0226 16:45:01.564040 5033 generic.go:334] "Generic (PLEG): container finished" podID="24ce89a2-cf67-4dee-8c2e-a4015b437452" containerID="f4a06bbf7a03ee5e396b5129a026ebd0effb30f02ba853e5174e29e06dfcdf75" exitCode=0
Feb 26 16:45:01 crc kubenswrapper[5033]: I0226 16:45:01.564358 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh" event={"ID":"24ce89a2-cf67-4dee-8c2e-a4015b437452","Type":"ContainerDied","Data":"f4a06bbf7a03ee5e396b5129a026ebd0effb30f02ba853e5174e29e06dfcdf75"}
Feb 26 16:45:01 crc kubenswrapper[5033]: I0226 16:45:01.564386 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh" event={"ID":"24ce89a2-cf67-4dee-8c2e-a4015b437452","Type":"ContainerStarted","Data":"1bc27d23c803cb773c795569ecf33bddc9db6221ffa430c393be278c9673088b"}
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.039563 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.109333 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24ce89a2-cf67-4dee-8c2e-a4015b437452-config-volume\") pod \"24ce89a2-cf67-4dee-8c2e-a4015b437452\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") "
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.109547 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwhgd\" (UniqueName: \"kubernetes.io/projected/24ce89a2-cf67-4dee-8c2e-a4015b437452-kube-api-access-rwhgd\") pod \"24ce89a2-cf67-4dee-8c2e-a4015b437452\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") "
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.109693 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24ce89a2-cf67-4dee-8c2e-a4015b437452-secret-volume\") pod \"24ce89a2-cf67-4dee-8c2e-a4015b437452\" (UID: \"24ce89a2-cf67-4dee-8c2e-a4015b437452\") "
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.110131 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24ce89a2-cf67-4dee-8c2e-a4015b437452-config-volume" (OuterVolumeSpecName: "config-volume") pod "24ce89a2-cf67-4dee-8c2e-a4015b437452" (UID: "24ce89a2-cf67-4dee-8c2e-a4015b437452"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.110388 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24ce89a2-cf67-4dee-8c2e-a4015b437452-config-volume\") on node \"crc\" DevicePath \"\""
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.115797 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ce89a2-cf67-4dee-8c2e-a4015b437452-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "24ce89a2-cf67-4dee-8c2e-a4015b437452" (UID: "24ce89a2-cf67-4dee-8c2e-a4015b437452"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.116455 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24ce89a2-cf67-4dee-8c2e-a4015b437452-kube-api-access-rwhgd" (OuterVolumeSpecName: "kube-api-access-rwhgd") pod "24ce89a2-cf67-4dee-8c2e-a4015b437452" (UID: "24ce89a2-cf67-4dee-8c2e-a4015b437452"). InnerVolumeSpecName "kube-api-access-rwhgd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.211172 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwhgd\" (UniqueName: \"kubernetes.io/projected/24ce89a2-cf67-4dee-8c2e-a4015b437452-kube-api-access-rwhgd\") on node \"crc\" DevicePath \"\""
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.211203 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24ce89a2-cf67-4dee-8c2e-a4015b437452-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.583203 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh" event={"ID":"24ce89a2-cf67-4dee-8c2e-a4015b437452","Type":"ContainerDied","Data":"1bc27d23c803cb773c795569ecf33bddc9db6221ffa430c393be278c9673088b"}
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.583297 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bc27d23c803cb773c795569ecf33bddc9db6221ffa430c393be278c9673088b"
Feb 26 16:45:03 crc kubenswrapper[5033]: I0226 16:45:03.583373 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"
Feb 26 16:45:04 crc kubenswrapper[5033]: I0226 16:45:04.138235 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p"]
Feb 26 16:45:04 crc kubenswrapper[5033]: I0226 16:45:04.147420 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535360-zj98p"]
Feb 26 16:45:05 crc kubenswrapper[5033]: I0226 16:45:05.642721 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cc53151-c374-44bc-8857-c72bd11200ee" path="/var/lib/kubelet/pods/4cc53151-c374-44bc-8857-c72bd11200ee/volumes"
Feb 26 16:45:08 crc kubenswrapper[5033]: I0226 16:45:08.627988 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:45:08 crc kubenswrapper[5033]: E0226 16:45:08.628850 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:45:23 crc kubenswrapper[5033]: I0226 16:45:23.633975 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:45:23 crc kubenswrapper[5033]: E0226 16:45:23.634695 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:45:31 crc kubenswrapper[5033]: I0226 16:45:31.461904 5033 scope.go:117] "RemoveContainer" containerID="484e05841cbcd7671734fcc38f847bb472a86ed47f838f6feddd98f39ca3d553"
Feb 26 16:45:36 crc kubenswrapper[5033]: I0226 16:45:36.628663 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:45:36 crc kubenswrapper[5033]: E0226 16:45:36.629177 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:45:49 crc kubenswrapper[5033]: I0226 16:45:49.628303 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:45:49 crc kubenswrapper[5033]: E0226 16:45:49.630970 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.160018 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535406-szgkv"]
Feb 26 16:46:00 crc kubenswrapper[5033]: E0226 16:46:00.160973 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24ce89a2-cf67-4dee-8c2e-a4015b437452" containerName="collect-profiles"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.160994 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="24ce89a2-cf67-4dee-8c2e-a4015b437452" containerName="collect-profiles"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.161178 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="24ce89a2-cf67-4dee-8c2e-a4015b437452" containerName="collect-profiles"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.161739 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535406-szgkv"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.164662 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.166158 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.167238 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.176276 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535406-szgkv"]
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.260073 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf7nm\" (UniqueName: \"kubernetes.io/projected/d450c041-0cba-498d-94f9-c0b97047a70d-kube-api-access-bf7nm\") pod \"auto-csr-approver-29535406-szgkv\" (UID: \"d450c041-0cba-498d-94f9-c0b97047a70d\") " pod="openshift-infra/auto-csr-approver-29535406-szgkv"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.361354 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf7nm\" (UniqueName: \"kubernetes.io/projected/d450c041-0cba-498d-94f9-c0b97047a70d-kube-api-access-bf7nm\") pod \"auto-csr-approver-29535406-szgkv\" (UID: \"d450c041-0cba-498d-94f9-c0b97047a70d\") " pod="openshift-infra/auto-csr-approver-29535406-szgkv"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.379287 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf7nm\" (UniqueName: \"kubernetes.io/projected/d450c041-0cba-498d-94f9-c0b97047a70d-kube-api-access-bf7nm\") pod \"auto-csr-approver-29535406-szgkv\" (UID: \"d450c041-0cba-498d-94f9-c0b97047a70d\") " pod="openshift-infra/auto-csr-approver-29535406-szgkv"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.490935 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535406-szgkv"
Feb 26 16:46:00 crc kubenswrapper[5033]: I0226 16:46:00.960420 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535406-szgkv"]
Feb 26 16:46:01 crc kubenswrapper[5033]: I0226 16:46:01.118188 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535406-szgkv" event={"ID":"d450c041-0cba-498d-94f9-c0b97047a70d","Type":"ContainerStarted","Data":"bce03f807bebf5421f2cccaf20c5719f70caacdc40bd94001745fcb5d00ede01"}
Feb 26 16:46:03 crc kubenswrapper[5033]: I0226 16:46:03.148965 5033 generic.go:334] "Generic (PLEG): container finished" podID="d450c041-0cba-498d-94f9-c0b97047a70d" containerID="6f0d45e6a18ba5068d285a4573772eee8e5c591f9938c207a5fbc90ca4d2c622" exitCode=0
Feb 26 16:46:03 crc kubenswrapper[5033]: I0226 16:46:03.149073 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535406-szgkv" event={"ID":"d450c041-0cba-498d-94f9-c0b97047a70d","Type":"ContainerDied","Data":"6f0d45e6a18ba5068d285a4573772eee8e5c591f9938c207a5fbc90ca4d2c622"}
Feb 26 16:46:03 crc kubenswrapper[5033]: I0226 16:46:03.638289 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:46:03 crc kubenswrapper[5033]: E0226 16:46:03.638659 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:46:04 crc kubenswrapper[5033]: I0226 16:46:04.518083 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535406-szgkv"
Feb 26 16:46:04 crc kubenswrapper[5033]: I0226 16:46:04.636098 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf7nm\" (UniqueName: \"kubernetes.io/projected/d450c041-0cba-498d-94f9-c0b97047a70d-kube-api-access-bf7nm\") pod \"d450c041-0cba-498d-94f9-c0b97047a70d\" (UID: \"d450c041-0cba-498d-94f9-c0b97047a70d\") "
Feb 26 16:46:04 crc kubenswrapper[5033]: I0226 16:46:04.643018 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d450c041-0cba-498d-94f9-c0b97047a70d-kube-api-access-bf7nm" (OuterVolumeSpecName: "kube-api-access-bf7nm") pod "d450c041-0cba-498d-94f9-c0b97047a70d" (UID: "d450c041-0cba-498d-94f9-c0b97047a70d"). InnerVolumeSpecName "kube-api-access-bf7nm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:46:04 crc kubenswrapper[5033]: I0226 16:46:04.738015 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf7nm\" (UniqueName: \"kubernetes.io/projected/d450c041-0cba-498d-94f9-c0b97047a70d-kube-api-access-bf7nm\") on node \"crc\" DevicePath \"\""
Feb 26 16:46:05 crc kubenswrapper[5033]: I0226 16:46:05.166948 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535406-szgkv" event={"ID":"d450c041-0cba-498d-94f9-c0b97047a70d","Type":"ContainerDied","Data":"bce03f807bebf5421f2cccaf20c5719f70caacdc40bd94001745fcb5d00ede01"}
Feb 26 16:46:05 crc kubenswrapper[5033]: I0226 16:46:05.167365 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bce03f807bebf5421f2cccaf20c5719f70caacdc40bd94001745fcb5d00ede01"
Feb 26 16:46:05 crc kubenswrapper[5033]: I0226 16:46:05.167016 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535406-szgkv"
Feb 26 16:46:05 crc kubenswrapper[5033]: I0226 16:46:05.587454 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535400-9k4zj"]
Feb 26 16:46:05 crc kubenswrapper[5033]: I0226 16:46:05.597076 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535400-9k4zj"]
Feb 26 16:46:05 crc kubenswrapper[5033]: I0226 16:46:05.638483 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c5966b7-5f6b-4c57-ad01-abf09daf1aee" path="/var/lib/kubelet/pods/0c5966b7-5f6b-4c57-ad01-abf09daf1aee/volumes"
Feb 26 16:46:14 crc kubenswrapper[5033]: I0226 16:46:14.628307 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:46:14 crc kubenswrapper[5033]: E0226 16:46:14.629388 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:46:26 crc kubenswrapper[5033]: I0226 16:46:26.628358 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe" Feb 26 16:46:26 crc kubenswrapper[5033]: E0226 16:46:26.629198 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:46:31 crc kubenswrapper[5033]: I0226 16:46:31.534397 5033 scope.go:117] "RemoveContainer" containerID="36eaaa0bea08e5536973c1a75a3013053319516982a4e97a097e57e3cbd13038" Feb 26 16:46:41 crc kubenswrapper[5033]: I0226 16:46:41.629360 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe" Feb 26 16:46:41 crc kubenswrapper[5033]: E0226 16:46:41.630562 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:46:55 crc kubenswrapper[5033]: I0226 16:46:55.628363 5033 scope.go:117] "RemoveContainer" 
containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe" Feb 26 16:46:55 crc kubenswrapper[5033]: E0226 16:46:55.629464 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:47:09 crc kubenswrapper[5033]: I0226 16:47:09.629029 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe" Feb 26 16:47:09 crc kubenswrapper[5033]: E0226 16:47:09.630422 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:47:22 crc kubenswrapper[5033]: I0226 16:47:22.628894 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe" Feb 26 16:47:22 crc kubenswrapper[5033]: I0226 16:47:22.850130 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"8061a1ba29aacb31e5d56280c8199bb84d1098f36b51676717d39e2448413150"} Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.493206 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mwf6x"] Feb 26 16:47:59 crc kubenswrapper[5033]: E0226 16:47:59.494102 5033 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d450c041-0cba-498d-94f9-c0b97047a70d" containerName="oc" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.494116 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d450c041-0cba-498d-94f9-c0b97047a70d" containerName="oc" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.494248 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d450c041-0cba-498d-94f9-c0b97047a70d" containerName="oc" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.495190 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.519446 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mwf6x"] Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.591506 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhvlq\" (UniqueName: \"kubernetes.io/projected/1799f267-5d16-49d2-81d6-1cf0836f0776-kube-api-access-nhvlq\") pod \"community-operators-mwf6x\" (UID: \"1799f267-5d16-49d2-81d6-1cf0836f0776\") " pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.591641 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1799f267-5d16-49d2-81d6-1cf0836f0776-catalog-content\") pod \"community-operators-mwf6x\" (UID: \"1799f267-5d16-49d2-81d6-1cf0836f0776\") " pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.591693 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1799f267-5d16-49d2-81d6-1cf0836f0776-utilities\") pod 
\"community-operators-mwf6x\" (UID: \"1799f267-5d16-49d2-81d6-1cf0836f0776\") " pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.692827 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1799f267-5d16-49d2-81d6-1cf0836f0776-utilities\") pod \"community-operators-mwf6x\" (UID: \"1799f267-5d16-49d2-81d6-1cf0836f0776\") " pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.692957 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhvlq\" (UniqueName: \"kubernetes.io/projected/1799f267-5d16-49d2-81d6-1cf0836f0776-kube-api-access-nhvlq\") pod \"community-operators-mwf6x\" (UID: \"1799f267-5d16-49d2-81d6-1cf0836f0776\") " pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.693034 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1799f267-5d16-49d2-81d6-1cf0836f0776-catalog-content\") pod \"community-operators-mwf6x\" (UID: \"1799f267-5d16-49d2-81d6-1cf0836f0776\") " pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.693631 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1799f267-5d16-49d2-81d6-1cf0836f0776-catalog-content\") pod \"community-operators-mwf6x\" (UID: \"1799f267-5d16-49d2-81d6-1cf0836f0776\") " pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.693688 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1799f267-5d16-49d2-81d6-1cf0836f0776-utilities\") pod \"community-operators-mwf6x\" (UID: 
\"1799f267-5d16-49d2-81d6-1cf0836f0776\") " pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.714998 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhvlq\" (UniqueName: \"kubernetes.io/projected/1799f267-5d16-49d2-81d6-1cf0836f0776-kube-api-access-nhvlq\") pod \"community-operators-mwf6x\" (UID: \"1799f267-5d16-49d2-81d6-1cf0836f0776\") " pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:47:59 crc kubenswrapper[5033]: I0226 16:47:59.815241 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.145743 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535408-gf6zl"] Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.146673 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535408-gf6zl" Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.150089 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.150267 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.150465 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.154837 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535408-gf6zl"] Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.301090 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2s9k\" (UniqueName: 
\"kubernetes.io/projected/4702edf6-d380-47e9-943a-b58d44359aa6-kube-api-access-c2s9k\") pod \"auto-csr-approver-29535408-gf6zl\" (UID: \"4702edf6-d380-47e9-943a-b58d44359aa6\") " pod="openshift-infra/auto-csr-approver-29535408-gf6zl" Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.313336 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mwf6x"] Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.402491 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2s9k\" (UniqueName: \"kubernetes.io/projected/4702edf6-d380-47e9-943a-b58d44359aa6-kube-api-access-c2s9k\") pod \"auto-csr-approver-29535408-gf6zl\" (UID: \"4702edf6-d380-47e9-943a-b58d44359aa6\") " pod="openshift-infra/auto-csr-approver-29535408-gf6zl" Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.434074 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2s9k\" (UniqueName: \"kubernetes.io/projected/4702edf6-d380-47e9-943a-b58d44359aa6-kube-api-access-c2s9k\") pod \"auto-csr-approver-29535408-gf6zl\" (UID: \"4702edf6-d380-47e9-943a-b58d44359aa6\") " pod="openshift-infra/auto-csr-approver-29535408-gf6zl" Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.469602 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535408-gf6zl" Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.745846 5033 generic.go:334] "Generic (PLEG): container finished" podID="1799f267-5d16-49d2-81d6-1cf0836f0776" containerID="0f2ad243b5a8ef02aecb8fab4d7736870e7cf272d20c8fd4ae3c32918cb19197" exitCode=0 Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.746037 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwf6x" event={"ID":"1799f267-5d16-49d2-81d6-1cf0836f0776","Type":"ContainerDied","Data":"0f2ad243b5a8ef02aecb8fab4d7736870e7cf272d20c8fd4ae3c32918cb19197"} Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.746330 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwf6x" event={"ID":"1799f267-5d16-49d2-81d6-1cf0836f0776","Type":"ContainerStarted","Data":"6bc495c7f431b9a3c974fcb2156bef5aed0d7d9f3b2017a3995ebdaa744a9a25"} Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.750052 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:48:00 crc kubenswrapper[5033]: I0226 16:48:00.841941 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535408-gf6zl"] Feb 26 16:48:01 crc kubenswrapper[5033]: I0226 16:48:01.753559 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535408-gf6zl" event={"ID":"4702edf6-d380-47e9-943a-b58d44359aa6","Type":"ContainerStarted","Data":"a7e908dd5659f27ee9e964364d04916fa3edf92d9764e2a7000fd536b8804a01"} Feb 26 16:48:02 crc kubenswrapper[5033]: I0226 16:48:02.761452 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535408-gf6zl" event={"ID":"4702edf6-d380-47e9-943a-b58d44359aa6","Type":"ContainerStarted","Data":"fb594d0a1c280c8cee73c2db7ed87922d38c1eb67709021f8f731a4d9b19f7ef"} Feb 26 16:48:02 crc 
kubenswrapper[5033]: I0226 16:48:02.774041 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535408-gf6zl" podStartSLOduration=1.507444655 podStartE2EDuration="2.774023075s" podCreationTimestamp="2026-02-26 16:48:00 +0000 UTC" firstStartedPulling="2026-02-26 16:48:00.847714021 +0000 UTC m=+4008.060312671" lastFinishedPulling="2026-02-26 16:48:02.114292441 +0000 UTC m=+4009.326891091" observedRunningTime="2026-02-26 16:48:02.773752789 +0000 UTC m=+4009.986351439" watchObservedRunningTime="2026-02-26 16:48:02.774023075 +0000 UTC m=+4009.986621725" Feb 26 16:48:03 crc kubenswrapper[5033]: I0226 16:48:03.777982 5033 generic.go:334] "Generic (PLEG): container finished" podID="4702edf6-d380-47e9-943a-b58d44359aa6" containerID="fb594d0a1c280c8cee73c2db7ed87922d38c1eb67709021f8f731a4d9b19f7ef" exitCode=0 Feb 26 16:48:03 crc kubenswrapper[5033]: I0226 16:48:03.778034 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535408-gf6zl" event={"ID":"4702edf6-d380-47e9-943a-b58d44359aa6","Type":"ContainerDied","Data":"fb594d0a1c280c8cee73c2db7ed87922d38c1eb67709021f8f731a4d9b19f7ef"} Feb 26 16:48:05 crc kubenswrapper[5033]: I0226 16:48:05.152468 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535408-gf6zl" Feb 26 16:48:05 crc kubenswrapper[5033]: I0226 16:48:05.273975 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2s9k\" (UniqueName: \"kubernetes.io/projected/4702edf6-d380-47e9-943a-b58d44359aa6-kube-api-access-c2s9k\") pod \"4702edf6-d380-47e9-943a-b58d44359aa6\" (UID: \"4702edf6-d380-47e9-943a-b58d44359aa6\") " Feb 26 16:48:05 crc kubenswrapper[5033]: I0226 16:48:05.279067 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4702edf6-d380-47e9-943a-b58d44359aa6-kube-api-access-c2s9k" (OuterVolumeSpecName: "kube-api-access-c2s9k") pod "4702edf6-d380-47e9-943a-b58d44359aa6" (UID: "4702edf6-d380-47e9-943a-b58d44359aa6"). InnerVolumeSpecName "kube-api-access-c2s9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:48:05 crc kubenswrapper[5033]: I0226 16:48:05.375886 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2s9k\" (UniqueName: \"kubernetes.io/projected/4702edf6-d380-47e9-943a-b58d44359aa6-kube-api-access-c2s9k\") on node \"crc\" DevicePath \"\"" Feb 26 16:48:05 crc kubenswrapper[5033]: I0226 16:48:05.799327 5033 generic.go:334] "Generic (PLEG): container finished" podID="1799f267-5d16-49d2-81d6-1cf0836f0776" containerID="5bca1e2ca4f510684950346555379bcaa85c56c30ebfba323befa8a0de3bc81d" exitCode=0 Feb 26 16:48:05 crc kubenswrapper[5033]: I0226 16:48:05.799415 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwf6x" event={"ID":"1799f267-5d16-49d2-81d6-1cf0836f0776","Type":"ContainerDied","Data":"5bca1e2ca4f510684950346555379bcaa85c56c30ebfba323befa8a0de3bc81d"} Feb 26 16:48:05 crc kubenswrapper[5033]: I0226 16:48:05.803064 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535408-gf6zl" 
event={"ID":"4702edf6-d380-47e9-943a-b58d44359aa6","Type":"ContainerDied","Data":"a7e908dd5659f27ee9e964364d04916fa3edf92d9764e2a7000fd536b8804a01"} Feb 26 16:48:05 crc kubenswrapper[5033]: I0226 16:48:05.803116 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7e908dd5659f27ee9e964364d04916fa3edf92d9764e2a7000fd536b8804a01" Feb 26 16:48:05 crc kubenswrapper[5033]: I0226 16:48:05.803163 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535408-gf6zl" Feb 26 16:48:05 crc kubenswrapper[5033]: I0226 16:48:05.865068 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535402-ffz7r"] Feb 26 16:48:05 crc kubenswrapper[5033]: I0226 16:48:05.871461 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535402-ffz7r"] Feb 26 16:48:06 crc kubenswrapper[5033]: I0226 16:48:06.813526 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwf6x" event={"ID":"1799f267-5d16-49d2-81d6-1cf0836f0776","Type":"ContainerStarted","Data":"80ab869514b525a755f43085b72bc9145dbb0073c5448487ac5e84ec39e720e9"} Feb 26 16:48:06 crc kubenswrapper[5033]: I0226 16:48:06.843688 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mwf6x" podStartSLOduration=2.388228743 podStartE2EDuration="7.843658357s" podCreationTimestamp="2026-02-26 16:47:59 +0000 UTC" firstStartedPulling="2026-02-26 16:48:00.749377777 +0000 UTC m=+4007.961976457" lastFinishedPulling="2026-02-26 16:48:06.204807411 +0000 UTC m=+4013.417406071" observedRunningTime="2026-02-26 16:48:06.834912604 +0000 UTC m=+4014.047511254" watchObservedRunningTime="2026-02-26 16:48:06.843658357 +0000 UTC m=+4014.056257047" Feb 26 16:48:07 crc kubenswrapper[5033]: I0226 16:48:07.644571 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="2e304789-0679-4f67-b13d-2eec7a07331f" path="/var/lib/kubelet/pods/2e304789-0679-4f67-b13d-2eec7a07331f/volumes" Feb 26 16:48:09 crc kubenswrapper[5033]: I0226 16:48:09.816371 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:48:09 crc kubenswrapper[5033]: I0226 16:48:09.816923 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:48:09 crc kubenswrapper[5033]: I0226 16:48:09.872540 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:48:19 crc kubenswrapper[5033]: I0226 16:48:19.897480 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mwf6x" Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.011037 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mwf6x"] Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.067036 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdwq8"] Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.067395 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qdwq8" podUID="bd27068a-bd2d-482e-90ee-144ad9348bbe" containerName="registry-server" containerID="cri-o://33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431" gracePeriod=2 Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.483609 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdwq8" Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.520354 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-utilities\") pod \"bd27068a-bd2d-482e-90ee-144ad9348bbe\" (UID: \"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.520490 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkfln\" (UniqueName: \"kubernetes.io/projected/bd27068a-bd2d-482e-90ee-144ad9348bbe-kube-api-access-fkfln\") pod \"bd27068a-bd2d-482e-90ee-144ad9348bbe\" (UID: \"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.520554 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-catalog-content\") pod \"bd27068a-bd2d-482e-90ee-144ad9348bbe\" (UID: \"bd27068a-bd2d-482e-90ee-144ad9348bbe\") " Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.520898 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-utilities" (OuterVolumeSpecName: "utilities") pod "bd27068a-bd2d-482e-90ee-144ad9348bbe" (UID: "bd27068a-bd2d-482e-90ee-144ad9348bbe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.525651 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd27068a-bd2d-482e-90ee-144ad9348bbe-kube-api-access-fkfln" (OuterVolumeSpecName: "kube-api-access-fkfln") pod "bd27068a-bd2d-482e-90ee-144ad9348bbe" (UID: "bd27068a-bd2d-482e-90ee-144ad9348bbe"). InnerVolumeSpecName "kube-api-access-fkfln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.594959 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd27068a-bd2d-482e-90ee-144ad9348bbe" (UID: "bd27068a-bd2d-482e-90ee-144ad9348bbe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.622055 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.622091 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkfln\" (UniqueName: \"kubernetes.io/projected/bd27068a-bd2d-482e-90ee-144ad9348bbe-kube-api-access-fkfln\") on node \"crc\" DevicePath \"\"" Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.622102 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd27068a-bd2d-482e-90ee-144ad9348bbe-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.937613 5033 generic.go:334] "Generic (PLEG): container finished" podID="bd27068a-bd2d-482e-90ee-144ad9348bbe" containerID="33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431" exitCode=0 Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.937649 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdwq8" event={"ID":"bd27068a-bd2d-482e-90ee-144ad9348bbe","Type":"ContainerDied","Data":"33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431"} Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.937685 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-qdwq8" event={"ID":"bd27068a-bd2d-482e-90ee-144ad9348bbe","Type":"ContainerDied","Data":"097ac23ab0d6ca9a33369ffc39f056ea9d6f78d5a12dfa3e0a41bc10b7ac19e5"} Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.937684 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdwq8" Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.937699 5033 scope.go:117] "RemoveContainer" containerID="33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431" Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.954061 5033 scope.go:117] "RemoveContainer" containerID="46349aacb4bca15f1734489f5f1b6253d0f3c35321ff5f28502424435d8c1145" Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.971201 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdwq8"] Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.976985 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qdwq8"] Feb 26 16:48:20 crc kubenswrapper[5033]: I0226 16:48:20.997732 5033 scope.go:117] "RemoveContainer" containerID="a6820d7de8be1d492d65feb22e083387bf8b9cbae9c778e9826bef5c3b92b148" Feb 26 16:48:21 crc kubenswrapper[5033]: I0226 16:48:21.012387 5033 scope.go:117] "RemoveContainer" containerID="33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431" Feb 26 16:48:21 crc kubenswrapper[5033]: E0226 16:48:21.012808 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431\": container with ID starting with 33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431 not found: ID does not exist" containerID="33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431" Feb 26 16:48:21 crc kubenswrapper[5033]: I0226 
16:48:21.012849 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431"} err="failed to get container status \"33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431\": rpc error: code = NotFound desc = could not find container \"33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431\": container with ID starting with 33740b98860da82b2238113a34414bfd5075ab391680072ee0453862d1e93431 not found: ID does not exist" Feb 26 16:48:21 crc kubenswrapper[5033]: I0226 16:48:21.012874 5033 scope.go:117] "RemoveContainer" containerID="46349aacb4bca15f1734489f5f1b6253d0f3c35321ff5f28502424435d8c1145" Feb 26 16:48:21 crc kubenswrapper[5033]: E0226 16:48:21.013077 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46349aacb4bca15f1734489f5f1b6253d0f3c35321ff5f28502424435d8c1145\": container with ID starting with 46349aacb4bca15f1734489f5f1b6253d0f3c35321ff5f28502424435d8c1145 not found: ID does not exist" containerID="46349aacb4bca15f1734489f5f1b6253d0f3c35321ff5f28502424435d8c1145" Feb 26 16:48:21 crc kubenswrapper[5033]: I0226 16:48:21.013099 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46349aacb4bca15f1734489f5f1b6253d0f3c35321ff5f28502424435d8c1145"} err="failed to get container status \"46349aacb4bca15f1734489f5f1b6253d0f3c35321ff5f28502424435d8c1145\": rpc error: code = NotFound desc = could not find container \"46349aacb4bca15f1734489f5f1b6253d0f3c35321ff5f28502424435d8c1145\": container with ID starting with 46349aacb4bca15f1734489f5f1b6253d0f3c35321ff5f28502424435d8c1145 not found: ID does not exist" Feb 26 16:48:21 crc kubenswrapper[5033]: I0226 16:48:21.013113 5033 scope.go:117] "RemoveContainer" containerID="a6820d7de8be1d492d65feb22e083387bf8b9cbae9c778e9826bef5c3b92b148" Feb 26 16:48:21 crc 
kubenswrapper[5033]: E0226 16:48:21.013272 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6820d7de8be1d492d65feb22e083387bf8b9cbae9c778e9826bef5c3b92b148\": container with ID starting with a6820d7de8be1d492d65feb22e083387bf8b9cbae9c778e9826bef5c3b92b148 not found: ID does not exist" containerID="a6820d7de8be1d492d65feb22e083387bf8b9cbae9c778e9826bef5c3b92b148"
Feb 26 16:48:21 crc kubenswrapper[5033]: I0226 16:48:21.013295 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6820d7de8be1d492d65feb22e083387bf8b9cbae9c778e9826bef5c3b92b148"} err="failed to get container status \"a6820d7de8be1d492d65feb22e083387bf8b9cbae9c778e9826bef5c3b92b148\": rpc error: code = NotFound desc = could not find container \"a6820d7de8be1d492d65feb22e083387bf8b9cbae9c778e9826bef5c3b92b148\": container with ID starting with a6820d7de8be1d492d65feb22e083387bf8b9cbae9c778e9826bef5c3b92b148 not found: ID does not exist"
Feb 26 16:48:21 crc kubenswrapper[5033]: I0226 16:48:21.647951 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd27068a-bd2d-482e-90ee-144ad9348bbe" path="/var/lib/kubelet/pods/bd27068a-bd2d-482e-90ee-144ad9348bbe/volumes"
Feb 26 16:48:31 crc kubenswrapper[5033]: I0226 16:48:31.664425 5033 scope.go:117] "RemoveContainer" containerID="db424cca7fac63229e321b5ed3a0fceee677dd241e38b7fe08c5200830b4a16c"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.407553 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vrwpb"]
Feb 26 16:49:05 crc kubenswrapper[5033]: E0226 16:49:05.408689 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27068a-bd2d-482e-90ee-144ad9348bbe" containerName="registry-server"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.408712 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27068a-bd2d-482e-90ee-144ad9348bbe" containerName="registry-server"
Feb 26 16:49:05 crc kubenswrapper[5033]: E0226 16:49:05.408727 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27068a-bd2d-482e-90ee-144ad9348bbe" containerName="extract-utilities"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.408738 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27068a-bd2d-482e-90ee-144ad9348bbe" containerName="extract-utilities"
Feb 26 16:49:05 crc kubenswrapper[5033]: E0226 16:49:05.408772 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4702edf6-d380-47e9-943a-b58d44359aa6" containerName="oc"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.408784 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4702edf6-d380-47e9-943a-b58d44359aa6" containerName="oc"
Feb 26 16:49:05 crc kubenswrapper[5033]: E0226 16:49:05.408806 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27068a-bd2d-482e-90ee-144ad9348bbe" containerName="extract-content"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.408817 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27068a-bd2d-482e-90ee-144ad9348bbe" containerName="extract-content"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.409066 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4702edf6-d380-47e9-943a-b58d44359aa6" containerName="oc"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.409090 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd27068a-bd2d-482e-90ee-144ad9348bbe" containerName="registry-server"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.410637 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.416383 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vrwpb"]
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.542500 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-utilities\") pod \"redhat-operators-vrwpb\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") " pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.543029 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-catalog-content\") pod \"redhat-operators-vrwpb\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") " pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.543527 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg6wd\" (UniqueName: \"kubernetes.io/projected/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-kube-api-access-cg6wd\") pod \"redhat-operators-vrwpb\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") " pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.644811 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-utilities\") pod \"redhat-operators-vrwpb\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") " pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.644958 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-catalog-content\") pod \"redhat-operators-vrwpb\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") " pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.644980 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg6wd\" (UniqueName: \"kubernetes.io/projected/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-kube-api-access-cg6wd\") pod \"redhat-operators-vrwpb\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") " pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.645284 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-utilities\") pod \"redhat-operators-vrwpb\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") " pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.645557 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-catalog-content\") pod \"redhat-operators-vrwpb\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") " pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.667582 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg6wd\" (UniqueName: \"kubernetes.io/projected/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-kube-api-access-cg6wd\") pod \"redhat-operators-vrwpb\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") " pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:05 crc kubenswrapper[5033]: I0226 16:49:05.744776 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:06 crc kubenswrapper[5033]: I0226 16:49:06.194126 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vrwpb"]
Feb 26 16:49:06 crc kubenswrapper[5033]: I0226 16:49:06.409301 5033 generic.go:334] "Generic (PLEG): container finished" podID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerID="e4c6001ffb9be3153a2025f3e494aa52247101508ffce7a5a88fc83455b4b16b" exitCode=0
Feb 26 16:49:06 crc kubenswrapper[5033]: I0226 16:49:06.409368 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vrwpb" event={"ID":"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e","Type":"ContainerDied","Data":"e4c6001ffb9be3153a2025f3e494aa52247101508ffce7a5a88fc83455b4b16b"}
Feb 26 16:49:06 crc kubenswrapper[5033]: I0226 16:49:06.409443 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vrwpb" event={"ID":"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e","Type":"ContainerStarted","Data":"3b45c9285e510dc022f4a1cf6058616bfaed48e5207054eebfbbf346cb8007dc"}
Feb 26 16:49:07 crc kubenswrapper[5033]: I0226 16:49:07.419967 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vrwpb" event={"ID":"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e","Type":"ContainerStarted","Data":"c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d"}
Feb 26 16:49:08 crc kubenswrapper[5033]: I0226 16:49:08.436607 5033 generic.go:334] "Generic (PLEG): container finished" podID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerID="c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d" exitCode=0
Feb 26 16:49:08 crc kubenswrapper[5033]: I0226 16:49:08.436712 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vrwpb" event={"ID":"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e","Type":"ContainerDied","Data":"c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d"}
Feb 26 16:49:09 crc kubenswrapper[5033]: I0226 16:49:09.446764 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vrwpb" event={"ID":"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e","Type":"ContainerStarted","Data":"9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3"}
Feb 26 16:49:09 crc kubenswrapper[5033]: I0226 16:49:09.476731 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vrwpb" podStartSLOduration=2.012770273 podStartE2EDuration="4.476703488s" podCreationTimestamp="2026-02-26 16:49:05 +0000 UTC" firstStartedPulling="2026-02-26 16:49:06.410981101 +0000 UTC m=+4073.623579751" lastFinishedPulling="2026-02-26 16:49:08.874914276 +0000 UTC m=+4076.087512966" observedRunningTime="2026-02-26 16:49:09.474084165 +0000 UTC m=+4076.686682845" watchObservedRunningTime="2026-02-26 16:49:09.476703488 +0000 UTC m=+4076.689302138"
Feb 26 16:49:15 crc kubenswrapper[5033]: I0226 16:49:15.745822 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:15 crc kubenswrapper[5033]: I0226 16:49:15.746539 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:16 crc kubenswrapper[5033]: I0226 16:49:16.817541 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vrwpb" podUID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerName="registry-server" probeResult="failure" output=<
Feb 26 16:49:16 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s
Feb 26 16:49:16 crc kubenswrapper[5033]: >
Feb 26 16:49:25 crc kubenswrapper[5033]: I0226 16:49:25.807450 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:25 crc kubenswrapper[5033]: I0226 16:49:25.876098 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:26 crc kubenswrapper[5033]: I0226 16:49:26.050527 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vrwpb"]
Feb 26 16:49:27 crc kubenswrapper[5033]: I0226 16:49:27.593400 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vrwpb" podUID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerName="registry-server" containerID="cri-o://9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3" gracePeriod=2
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.046518 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.087572 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-utilities\") pod \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") "
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.087769 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg6wd\" (UniqueName: \"kubernetes.io/projected/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-kube-api-access-cg6wd\") pod \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") "
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.087860 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-catalog-content\") pod \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\" (UID: \"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e\") "
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.088965 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-utilities" (OuterVolumeSpecName: "utilities") pod "f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" (UID: "f9ea89e3-5f75-44b9-b59c-4fcb23164a3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.115643 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-kube-api-access-cg6wd" (OuterVolumeSpecName: "kube-api-access-cg6wd") pod "f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" (UID: "f9ea89e3-5f75-44b9-b59c-4fcb23164a3e"). InnerVolumeSpecName "kube-api-access-cg6wd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.189560 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg6wd\" (UniqueName: \"kubernetes.io/projected/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-kube-api-access-cg6wd\") on node \"crc\" DevicePath \"\""
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.189601 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.299123 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" (UID: "f9ea89e3-5f75-44b9-b59c-4fcb23164a3e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.392240 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.604954 5033 generic.go:334] "Generic (PLEG): container finished" podID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerID="9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3" exitCode=0
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.605054 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vrwpb" event={"ID":"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e","Type":"ContainerDied","Data":"9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3"}
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.605404 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vrwpb" event={"ID":"f9ea89e3-5f75-44b9-b59c-4fcb23164a3e","Type":"ContainerDied","Data":"3b45c9285e510dc022f4a1cf6058616bfaed48e5207054eebfbbf346cb8007dc"}
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.605446 5033 scope.go:117] "RemoveContainer" containerID="9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3"
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.605095 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vrwpb"
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.638545 5033 scope.go:117] "RemoveContainer" containerID="c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d"
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.640135 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vrwpb"]
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.645631 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vrwpb"]
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.659638 5033 scope.go:117] "RemoveContainer" containerID="e4c6001ffb9be3153a2025f3e494aa52247101508ffce7a5a88fc83455b4b16b"
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.702291 5033 scope.go:117] "RemoveContainer" containerID="9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3"
Feb 26 16:49:28 crc kubenswrapper[5033]: E0226 16:49:28.702782 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3\": container with ID starting with 9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3 not found: ID does not exist" containerID="9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3"
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.702825 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3"} err="failed to get container status \"9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3\": rpc error: code = NotFound desc = could not find container \"9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3\": container with ID starting with 9e93a772e74e50a86be931cceb4d27a64a014b4f2d82874b4b45741b1d7284e3 not found: ID does not exist"
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.702851 5033 scope.go:117] "RemoveContainer" containerID="c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d"
Feb 26 16:49:28 crc kubenswrapper[5033]: E0226 16:49:28.703169 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d\": container with ID starting with c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d not found: ID does not exist" containerID="c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d"
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.703279 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d"} err="failed to get container status \"c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d\": rpc error: code = NotFound desc = could not find container \"c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d\": container with ID starting with c1c56ff8c86965fd01a42dfaf7f12cb29fde51b7972539119126c7c72314db6d not found: ID does not exist"
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.703393 5033 scope.go:117] "RemoveContainer" containerID="e4c6001ffb9be3153a2025f3e494aa52247101508ffce7a5a88fc83455b4b16b"
Feb 26 16:49:28 crc kubenswrapper[5033]: E0226 16:49:28.703692 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c6001ffb9be3153a2025f3e494aa52247101508ffce7a5a88fc83455b4b16b\": container with ID starting with e4c6001ffb9be3153a2025f3e494aa52247101508ffce7a5a88fc83455b4b16b not found: ID does not exist" containerID="e4c6001ffb9be3153a2025f3e494aa52247101508ffce7a5a88fc83455b4b16b"
Feb 26 16:49:28 crc kubenswrapper[5033]: I0226 16:49:28.703717 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c6001ffb9be3153a2025f3e494aa52247101508ffce7a5a88fc83455b4b16b"} err="failed to get container status \"e4c6001ffb9be3153a2025f3e494aa52247101508ffce7a5a88fc83455b4b16b\": rpc error: code = NotFound desc = could not find container \"e4c6001ffb9be3153a2025f3e494aa52247101508ffce7a5a88fc83455b4b16b\": container with ID starting with e4c6001ffb9be3153a2025f3e494aa52247101508ffce7a5a88fc83455b4b16b not found: ID does not exist"
Feb 26 16:49:29 crc kubenswrapper[5033]: I0226 16:49:29.635410 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" path="/var/lib/kubelet/pods/f9ea89e3-5f75-44b9-b59c-4fcb23164a3e/volumes"
Feb 26 16:49:50 crc kubenswrapper[5033]: I0226 16:49:50.189659 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:49:50 crc kubenswrapper[5033]: I0226 16:49:50.190359 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.144520 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535410-g7ttl"]
Feb 26 16:50:00 crc kubenswrapper[5033]: E0226 16:50:00.145347 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerName="extract-content"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.145361 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerName="extract-content"
Feb 26 16:50:00 crc kubenswrapper[5033]: E0226 16:50:00.145380 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerName="extract-utilities"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.145404 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerName="extract-utilities"
Feb 26 16:50:00 crc kubenswrapper[5033]: E0226 16:50:00.145420 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerName="registry-server"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.145428 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerName="registry-server"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.145552 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ea89e3-5f75-44b9-b59c-4fcb23164a3e" containerName="registry-server"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.145966 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535410-g7ttl"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.149370 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.149906 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.150348 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.156078 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535410-g7ttl"]
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.288847 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs69t\" (UniqueName: \"kubernetes.io/projected/7d1ed0d8-c83f-4717-a408-6192539ff0ce-kube-api-access-gs69t\") pod \"auto-csr-approver-29535410-g7ttl\" (UID: \"7d1ed0d8-c83f-4717-a408-6192539ff0ce\") " pod="openshift-infra/auto-csr-approver-29535410-g7ttl"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.389875 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs69t\" (UniqueName: \"kubernetes.io/projected/7d1ed0d8-c83f-4717-a408-6192539ff0ce-kube-api-access-gs69t\") pod \"auto-csr-approver-29535410-g7ttl\" (UID: \"7d1ed0d8-c83f-4717-a408-6192539ff0ce\") " pod="openshift-infra/auto-csr-approver-29535410-g7ttl"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.421365 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs69t\" (UniqueName: \"kubernetes.io/projected/7d1ed0d8-c83f-4717-a408-6192539ff0ce-kube-api-access-gs69t\") pod \"auto-csr-approver-29535410-g7ttl\" (UID: \"7d1ed0d8-c83f-4717-a408-6192539ff0ce\") " pod="openshift-infra/auto-csr-approver-29535410-g7ttl"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.480635 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535410-g7ttl"
Feb 26 16:50:00 crc kubenswrapper[5033]: I0226 16:50:00.772697 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535410-g7ttl"]
Feb 26 16:50:00 crc kubenswrapper[5033]: W0226 16:50:00.783393 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d1ed0d8_c83f_4717_a408_6192539ff0ce.slice/crio-9eaaef972594e55cfc813a1e4b8c4ad9f68cfd62485220990b7a07a9eb6576c4 WatchSource:0}: Error finding container 9eaaef972594e55cfc813a1e4b8c4ad9f68cfd62485220990b7a07a9eb6576c4: Status 404 returned error can't find the container with id 9eaaef972594e55cfc813a1e4b8c4ad9f68cfd62485220990b7a07a9eb6576c4
Feb 26 16:50:01 crc kubenswrapper[5033]: I0226 16:50:01.241958 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535410-g7ttl" event={"ID":"7d1ed0d8-c83f-4717-a408-6192539ff0ce","Type":"ContainerStarted","Data":"9eaaef972594e55cfc813a1e4b8c4ad9f68cfd62485220990b7a07a9eb6576c4"}
Feb 26 16:50:03 crc kubenswrapper[5033]: I0226 16:50:03.261849 5033 generic.go:334] "Generic (PLEG): container finished" podID="7d1ed0d8-c83f-4717-a408-6192539ff0ce" containerID="dc42205538cf81754a2ab25a6e8da207ef40e7050402b726e0591b000d9de73d" exitCode=0
Feb 26 16:50:03 crc kubenswrapper[5033]: I0226 16:50:03.261951 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535410-g7ttl" event={"ID":"7d1ed0d8-c83f-4717-a408-6192539ff0ce","Type":"ContainerDied","Data":"dc42205538cf81754a2ab25a6e8da207ef40e7050402b726e0591b000d9de73d"}
Feb 26 16:50:05 crc kubenswrapper[5033]: I0226 16:50:05.016547 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535410-g7ttl"
Feb 26 16:50:05 crc kubenswrapper[5033]: I0226 16:50:05.168397 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs69t\" (UniqueName: \"kubernetes.io/projected/7d1ed0d8-c83f-4717-a408-6192539ff0ce-kube-api-access-gs69t\") pod \"7d1ed0d8-c83f-4717-a408-6192539ff0ce\" (UID: \"7d1ed0d8-c83f-4717-a408-6192539ff0ce\") "
Feb 26 16:50:05 crc kubenswrapper[5033]: I0226 16:50:05.178801 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1ed0d8-c83f-4717-a408-6192539ff0ce-kube-api-access-gs69t" (OuterVolumeSpecName: "kube-api-access-gs69t") pod "7d1ed0d8-c83f-4717-a408-6192539ff0ce" (UID: "7d1ed0d8-c83f-4717-a408-6192539ff0ce"). InnerVolumeSpecName "kube-api-access-gs69t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:50:05 crc kubenswrapper[5033]: I0226 16:50:05.270309 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs69t\" (UniqueName: \"kubernetes.io/projected/7d1ed0d8-c83f-4717-a408-6192539ff0ce-kube-api-access-gs69t\") on node \"crc\" DevicePath \"\""
Feb 26 16:50:05 crc kubenswrapper[5033]: I0226 16:50:05.279286 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535410-g7ttl" event={"ID":"7d1ed0d8-c83f-4717-a408-6192539ff0ce","Type":"ContainerDied","Data":"9eaaef972594e55cfc813a1e4b8c4ad9f68cfd62485220990b7a07a9eb6576c4"}
Feb 26 16:50:05 crc kubenswrapper[5033]: I0226 16:50:05.279328 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eaaef972594e55cfc813a1e4b8c4ad9f68cfd62485220990b7a07a9eb6576c4"
Feb 26 16:50:05 crc kubenswrapper[5033]: I0226 16:50:05.279344 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535410-g7ttl"
Feb 26 16:50:06 crc kubenswrapper[5033]: I0226 16:50:06.092601 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535404-8bcl2"]
Feb 26 16:50:06 crc kubenswrapper[5033]: I0226 16:50:06.100772 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535404-8bcl2"]
Feb 26 16:50:07 crc kubenswrapper[5033]: I0226 16:50:07.640518 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb078472-5501-419f-b276-7a91c83ffdc5" path="/var/lib/kubelet/pods/cb078472-5501-419f-b276-7a91c83ffdc5/volumes"
Feb 26 16:50:20 crc kubenswrapper[5033]: I0226 16:50:20.211201 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:50:20 crc kubenswrapper[5033]: I0226 16:50:20.211904 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:50:31 crc kubenswrapper[5033]: I0226 16:50:31.792535 5033 scope.go:117] "RemoveContainer" containerID="7229e103d78a6c4fa54277770509de5aaeadb7d3c0d424de86a8c5a232134397"
Feb 26 16:50:50 crc kubenswrapper[5033]: I0226 16:50:50.189969 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:50:50 crc kubenswrapper[5033]: I0226 16:50:50.190674 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:50:50 crc kubenswrapper[5033]: I0226 16:50:50.190740 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5"
Feb 26 16:50:50 crc kubenswrapper[5033]: I0226 16:50:50.191706 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8061a1ba29aacb31e5d56280c8199bb84d1098f36b51676717d39e2448413150"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 16:50:50 crc kubenswrapper[5033]: I0226 16:50:50.191806 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://8061a1ba29aacb31e5d56280c8199bb84d1098f36b51676717d39e2448413150" gracePeriod=600
Feb 26 16:50:50 crc kubenswrapper[5033]: I0226 16:50:50.692232 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="8061a1ba29aacb31e5d56280c8199bb84d1098f36b51676717d39e2448413150" exitCode=0
Feb 26 16:50:50 crc kubenswrapper[5033]: I0226 16:50:50.692394 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"8061a1ba29aacb31e5d56280c8199bb84d1098f36b51676717d39e2448413150"}
Feb 26 16:50:50 crc kubenswrapper[5033]: I0226 16:50:50.692607 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"}
Feb 26 16:50:50 crc kubenswrapper[5033]: I0226 16:50:50.692634 5033 scope.go:117] "RemoveContainer" containerID="a5fd6454ca0d61f74bb5d0c35e21cd133ea1ec85f7520bf7c960a981c57fdbbe"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.164917 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535412-4vqtd"]
Feb 26 16:52:00 crc kubenswrapper[5033]: E0226 16:52:00.165994 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1ed0d8-c83f-4717-a408-6192539ff0ce" containerName="oc"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.166014 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1ed0d8-c83f-4717-a408-6192539ff0ce" containerName="oc"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.166302 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1ed0d8-c83f-4717-a408-6192539ff0ce" containerName="oc"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.167044 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535412-4vqtd"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.169834 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.170125 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.170327 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.178811 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535412-4vqtd"]
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.207602 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpr4x\" (UniqueName: \"kubernetes.io/projected/be1e8e1b-0b39-4159-9fc5-6574e9030c79-kube-api-access-bpr4x\") pod \"auto-csr-approver-29535412-4vqtd\" (UID: \"be1e8e1b-0b39-4159-9fc5-6574e9030c79\") " pod="openshift-infra/auto-csr-approver-29535412-4vqtd"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.309575 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpr4x\" (UniqueName: \"kubernetes.io/projected/be1e8e1b-0b39-4159-9fc5-6574e9030c79-kube-api-access-bpr4x\") pod \"auto-csr-approver-29535412-4vqtd\" (UID: \"be1e8e1b-0b39-4159-9fc5-6574e9030c79\") " pod="openshift-infra/auto-csr-approver-29535412-4vqtd"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.337496 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpr4x\" (UniqueName: \"kubernetes.io/projected/be1e8e1b-0b39-4159-9fc5-6574e9030c79-kube-api-access-bpr4x\") pod \"auto-csr-approver-29535412-4vqtd\" (UID: \"be1e8e1b-0b39-4159-9fc5-6574e9030c79\") " pod="openshift-infra/auto-csr-approver-29535412-4vqtd"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.499846 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535412-4vqtd"
Feb 26 16:52:00 crc kubenswrapper[5033]: I0226 16:52:00.970025 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535412-4vqtd"]
Feb 26 16:52:00 crc kubenswrapper[5033]: W0226 16:52:00.980905 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe1e8e1b_0b39_4159_9fc5_6574e9030c79.slice/crio-789b906947c869d39fb8462224bbb03514b3b80352de606c83b1e633ac2ab3a9 WatchSource:0}: Error finding container 789b906947c869d39fb8462224bbb03514b3b80352de606c83b1e633ac2ab3a9: Status 404 returned error can't find the container with id 789b906947c869d39fb8462224bbb03514b3b80352de606c83b1e633ac2ab3a9
Feb 26 16:52:01 crc kubenswrapper[5033]: I0226 16:52:01.335464 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535412-4vqtd" event={"ID":"be1e8e1b-0b39-4159-9fc5-6574e9030c79","Type":"ContainerStarted","Data":"789b906947c869d39fb8462224bbb03514b3b80352de606c83b1e633ac2ab3a9"}
Feb 26 16:52:02 crc kubenswrapper[5033]: I0226 16:52:02.349345 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535412-4vqtd" event={"ID":"be1e8e1b-0b39-4159-9fc5-6574e9030c79","Type":"ContainerStarted","Data":"82652c0f3026154d45b5e1d3c157b853a7545069842c76c57649a303dfc24f56"}
Feb 26 16:52:02 crc kubenswrapper[5033]: I0226 16:52:02.381649 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535412-4vqtd" podStartSLOduration=1.308771652 podStartE2EDuration="2.381614651s" podCreationTimestamp="2026-02-26 16:52:00 +0000 UTC" firstStartedPulling="2026-02-26 16:52:00.984759235 +0000 UTC
m=+4248.197357885" lastFinishedPulling="2026-02-26 16:52:02.057602234 +0000 UTC m=+4249.270200884" observedRunningTime="2026-02-26 16:52:02.369005915 +0000 UTC m=+4249.581604595" watchObservedRunningTime="2026-02-26 16:52:02.381614651 +0000 UTC m=+4249.594213341" Feb 26 16:52:03 crc kubenswrapper[5033]: I0226 16:52:03.362558 5033 generic.go:334] "Generic (PLEG): container finished" podID="be1e8e1b-0b39-4159-9fc5-6574e9030c79" containerID="82652c0f3026154d45b5e1d3c157b853a7545069842c76c57649a303dfc24f56" exitCode=0 Feb 26 16:52:03 crc kubenswrapper[5033]: I0226 16:52:03.362652 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535412-4vqtd" event={"ID":"be1e8e1b-0b39-4159-9fc5-6574e9030c79","Type":"ContainerDied","Data":"82652c0f3026154d45b5e1d3c157b853a7545069842c76c57649a303dfc24f56"} Feb 26 16:52:04 crc kubenswrapper[5033]: I0226 16:52:04.748960 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535412-4vqtd" Feb 26 16:52:04 crc kubenswrapper[5033]: I0226 16:52:04.779710 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpr4x\" (UniqueName: \"kubernetes.io/projected/be1e8e1b-0b39-4159-9fc5-6574e9030c79-kube-api-access-bpr4x\") pod \"be1e8e1b-0b39-4159-9fc5-6574e9030c79\" (UID: \"be1e8e1b-0b39-4159-9fc5-6574e9030c79\") " Feb 26 16:52:04 crc kubenswrapper[5033]: I0226 16:52:04.789692 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be1e8e1b-0b39-4159-9fc5-6574e9030c79-kube-api-access-bpr4x" (OuterVolumeSpecName: "kube-api-access-bpr4x") pod "be1e8e1b-0b39-4159-9fc5-6574e9030c79" (UID: "be1e8e1b-0b39-4159-9fc5-6574e9030c79"). InnerVolumeSpecName "kube-api-access-bpr4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:52:04 crc kubenswrapper[5033]: I0226 16:52:04.881356 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpr4x\" (UniqueName: \"kubernetes.io/projected/be1e8e1b-0b39-4159-9fc5-6574e9030c79-kube-api-access-bpr4x\") on node \"crc\" DevicePath \"\"" Feb 26 16:52:05 crc kubenswrapper[5033]: I0226 16:52:05.384412 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535412-4vqtd" event={"ID":"be1e8e1b-0b39-4159-9fc5-6574e9030c79","Type":"ContainerDied","Data":"789b906947c869d39fb8462224bbb03514b3b80352de606c83b1e633ac2ab3a9"} Feb 26 16:52:05 crc kubenswrapper[5033]: I0226 16:52:05.384473 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="789b906947c869d39fb8462224bbb03514b3b80352de606c83b1e633ac2ab3a9" Feb 26 16:52:05 crc kubenswrapper[5033]: I0226 16:52:05.384479 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535412-4vqtd" Feb 26 16:52:05 crc kubenswrapper[5033]: I0226 16:52:05.471664 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535406-szgkv"] Feb 26 16:52:05 crc kubenswrapper[5033]: I0226 16:52:05.480669 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535406-szgkv"] Feb 26 16:52:05 crc kubenswrapper[5033]: I0226 16:52:05.639158 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d450c041-0cba-498d-94f9-c0b97047a70d" path="/var/lib/kubelet/pods/d450c041-0cba-498d-94f9-c0b97047a70d/volumes" Feb 26 16:52:31 crc kubenswrapper[5033]: I0226 16:52:31.896630 5033 scope.go:117] "RemoveContainer" containerID="6f0d45e6a18ba5068d285a4573772eee8e5c591f9938c207a5fbc90ca4d2c622" Feb 26 16:52:50 crc kubenswrapper[5033]: I0226 16:52:50.189734 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:52:50 crc kubenswrapper[5033]: I0226 16:52:50.190725 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:53:20 crc kubenswrapper[5033]: I0226 16:53:20.189224 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:53:20 crc kubenswrapper[5033]: I0226 16:53:20.191917 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:53:50 crc kubenswrapper[5033]: I0226 16:53:50.189568 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:53:50 crc kubenswrapper[5033]: I0226 16:53:50.190202 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:53:50 crc kubenswrapper[5033]: I0226 16:53:50.190301 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 16:53:50 crc kubenswrapper[5033]: I0226 16:53:50.191105 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:53:50 crc kubenswrapper[5033]: I0226 16:53:50.191209 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" gracePeriod=600 Feb 26 16:53:50 crc kubenswrapper[5033]: E0226 16:53:50.310842 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:53:50 crc kubenswrapper[5033]: I0226 16:53:50.380163 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" exitCode=0 Feb 26 16:53:50 crc kubenswrapper[5033]: I0226 16:53:50.380235 5033 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"} Feb 26 16:53:50 crc kubenswrapper[5033]: I0226 16:53:50.380346 5033 scope.go:117] "RemoveContainer" containerID="8061a1ba29aacb31e5d56280c8199bb84d1098f36b51676717d39e2448413150" Feb 26 16:53:50 crc kubenswrapper[5033]: I0226 16:53:50.381195 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" Feb 26 16:53:50 crc kubenswrapper[5033]: E0226 16:53:50.381695 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.155041 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535414-rpddx"] Feb 26 16:54:00 crc kubenswrapper[5033]: E0226 16:54:00.155887 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1e8e1b-0b39-4159-9fc5-6574e9030c79" containerName="oc" Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.155899 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1e8e1b-0b39-4159-9fc5-6574e9030c79" containerName="oc" Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.156029 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="be1e8e1b-0b39-4159-9fc5-6574e9030c79" containerName="oc" Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.156486 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535414-rpddx" Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.159616 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.162350 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.162959 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.173676 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535414-rpddx"] Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.324042 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krtsk\" (UniqueName: \"kubernetes.io/projected/4755fe0e-1649-43dd-94b1-770898fdfdf4-kube-api-access-krtsk\") pod \"auto-csr-approver-29535414-rpddx\" (UID: \"4755fe0e-1649-43dd-94b1-770898fdfdf4\") " pod="openshift-infra/auto-csr-approver-29535414-rpddx" Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.425527 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krtsk\" (UniqueName: \"kubernetes.io/projected/4755fe0e-1649-43dd-94b1-770898fdfdf4-kube-api-access-krtsk\") pod \"auto-csr-approver-29535414-rpddx\" (UID: \"4755fe0e-1649-43dd-94b1-770898fdfdf4\") " pod="openshift-infra/auto-csr-approver-29535414-rpddx" Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.460541 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krtsk\" (UniqueName: \"kubernetes.io/projected/4755fe0e-1649-43dd-94b1-770898fdfdf4-kube-api-access-krtsk\") pod \"auto-csr-approver-29535414-rpddx\" (UID: \"4755fe0e-1649-43dd-94b1-770898fdfdf4\") " 
pod="openshift-infra/auto-csr-approver-29535414-rpddx" Feb 26 16:54:00 crc kubenswrapper[5033]: I0226 16:54:00.483862 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535414-rpddx" Feb 26 16:54:01 crc kubenswrapper[5033]: I0226 16:54:01.011119 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535414-rpddx"] Feb 26 16:54:01 crc kubenswrapper[5033]: I0226 16:54:01.012548 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:54:01 crc kubenswrapper[5033]: I0226 16:54:01.494852 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535414-rpddx" event={"ID":"4755fe0e-1649-43dd-94b1-770898fdfdf4","Type":"ContainerStarted","Data":"5f32f06a5dd354bf1762ac0584e9aba3e2dadfa75a94e3ace732339f4c86106d"} Feb 26 16:54:01 crc kubenswrapper[5033]: I0226 16:54:01.628684 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" Feb 26 16:54:01 crc kubenswrapper[5033]: E0226 16:54:01.628932 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:54:02 crc kubenswrapper[5033]: I0226 16:54:02.506321 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535414-rpddx" event={"ID":"4755fe0e-1649-43dd-94b1-770898fdfdf4","Type":"ContainerStarted","Data":"66a155d78f6eb40ef07e6bc35b266ef32b64c1890b2753d0508c9873e8732b9f"} Feb 26 16:54:02 crc kubenswrapper[5033]: I0226 16:54:02.531990 5033 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535414-rpddx" podStartSLOduration=1.635623222 podStartE2EDuration="2.531966756s" podCreationTimestamp="2026-02-26 16:54:00 +0000 UTC" firstStartedPulling="2026-02-26 16:54:01.011881947 +0000 UTC m=+4368.224480627" lastFinishedPulling="2026-02-26 16:54:01.908225461 +0000 UTC m=+4369.120824161" observedRunningTime="2026-02-26 16:54:02.52385917 +0000 UTC m=+4369.736457890" watchObservedRunningTime="2026-02-26 16:54:02.531966756 +0000 UTC m=+4369.744565416" Feb 26 16:54:03 crc kubenswrapper[5033]: I0226 16:54:03.540568 5033 generic.go:334] "Generic (PLEG): container finished" podID="4755fe0e-1649-43dd-94b1-770898fdfdf4" containerID="66a155d78f6eb40ef07e6bc35b266ef32b64c1890b2753d0508c9873e8732b9f" exitCode=0 Feb 26 16:54:03 crc kubenswrapper[5033]: I0226 16:54:03.540648 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535414-rpddx" event={"ID":"4755fe0e-1649-43dd-94b1-770898fdfdf4","Type":"ContainerDied","Data":"66a155d78f6eb40ef07e6bc35b266ef32b64c1890b2753d0508c9873e8732b9f"} Feb 26 16:54:05 crc kubenswrapper[5033]: I0226 16:54:05.062430 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535414-rpddx" Feb 26 16:54:05 crc kubenswrapper[5033]: I0226 16:54:05.193909 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krtsk\" (UniqueName: \"kubernetes.io/projected/4755fe0e-1649-43dd-94b1-770898fdfdf4-kube-api-access-krtsk\") pod \"4755fe0e-1649-43dd-94b1-770898fdfdf4\" (UID: \"4755fe0e-1649-43dd-94b1-770898fdfdf4\") " Feb 26 16:54:05 crc kubenswrapper[5033]: I0226 16:54:05.203113 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4755fe0e-1649-43dd-94b1-770898fdfdf4-kube-api-access-krtsk" (OuterVolumeSpecName: "kube-api-access-krtsk") pod "4755fe0e-1649-43dd-94b1-770898fdfdf4" (UID: "4755fe0e-1649-43dd-94b1-770898fdfdf4"). InnerVolumeSpecName "kube-api-access-krtsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:54:05 crc kubenswrapper[5033]: I0226 16:54:05.295744 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krtsk\" (UniqueName: \"kubernetes.io/projected/4755fe0e-1649-43dd-94b1-770898fdfdf4-kube-api-access-krtsk\") on node \"crc\" DevicePath \"\"" Feb 26 16:54:05 crc kubenswrapper[5033]: I0226 16:54:05.561391 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535414-rpddx" event={"ID":"4755fe0e-1649-43dd-94b1-770898fdfdf4","Type":"ContainerDied","Data":"5f32f06a5dd354bf1762ac0584e9aba3e2dadfa75a94e3ace732339f4c86106d"} Feb 26 16:54:05 crc kubenswrapper[5033]: I0226 16:54:05.561444 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f32f06a5dd354bf1762ac0584e9aba3e2dadfa75a94e3ace732339f4c86106d" Feb 26 16:54:05 crc kubenswrapper[5033]: I0226 16:54:05.561516 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535414-rpddx" Feb 26 16:54:05 crc kubenswrapper[5033]: I0226 16:54:05.637099 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535408-gf6zl"] Feb 26 16:54:05 crc kubenswrapper[5033]: I0226 16:54:05.643682 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535408-gf6zl"] Feb 26 16:54:07 crc kubenswrapper[5033]: I0226 16:54:07.644483 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4702edf6-d380-47e9-943a-b58d44359aa6" path="/var/lib/kubelet/pods/4702edf6-d380-47e9-943a-b58d44359aa6/volumes" Feb 26 16:54:16 crc kubenswrapper[5033]: I0226 16:54:16.628681 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" Feb 26 16:54:16 crc kubenswrapper[5033]: E0226 16:54:16.629843 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:54:29 crc kubenswrapper[5033]: I0226 16:54:29.628711 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" Feb 26 16:54:29 crc kubenswrapper[5033]: E0226 16:54:29.629598 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:54:32 crc kubenswrapper[5033]: I0226 16:54:32.020485 5033 scope.go:117] "RemoveContainer" containerID="fb594d0a1c280c8cee73c2db7ed87922d38c1eb67709021f8f731a4d9b19f7ef" Feb 26 16:54:41 crc kubenswrapper[5033]: I0226 16:54:41.628725 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" Feb 26 16:54:41 crc kubenswrapper[5033]: E0226 16:54:41.629887 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:54:55 crc kubenswrapper[5033]: I0226 16:54:55.628502 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" Feb 26 16:54:55 crc kubenswrapper[5033]: E0226 16:54:55.629470 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:55:09 crc kubenswrapper[5033]: I0226 16:55:09.628246 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" Feb 26 16:55:09 crc kubenswrapper[5033]: E0226 16:55:09.629503 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.326740 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jcz2c"] Feb 26 16:55:12 crc kubenswrapper[5033]: E0226 16:55:12.327629 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4755fe0e-1649-43dd-94b1-770898fdfdf4" containerName="oc" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.327652 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4755fe0e-1649-43dd-94b1-770898fdfdf4" containerName="oc" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.327912 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4755fe0e-1649-43dd-94b1-770898fdfdf4" containerName="oc" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.329731 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.368065 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcz2c"] Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.440129 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbntx\" (UniqueName: \"kubernetes.io/projected/96d11364-1ac3-4431-a4b9-920650c64c1e-kube-api-access-kbntx\") pod \"certified-operators-jcz2c\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.440533 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-catalog-content\") pod \"certified-operators-jcz2c\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.440675 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-utilities\") pod \"certified-operators-jcz2c\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.541525 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbntx\" (UniqueName: \"kubernetes.io/projected/96d11364-1ac3-4431-a4b9-920650c64c1e-kube-api-access-kbntx\") pod \"certified-operators-jcz2c\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.541847 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-catalog-content\") pod \"certified-operators-jcz2c\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.541966 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-utilities\") pod \"certified-operators-jcz2c\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.542354 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-catalog-content\") pod \"certified-operators-jcz2c\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.542505 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-utilities\") pod \"certified-operators-jcz2c\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.565142 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbntx\" (UniqueName: \"kubernetes.io/projected/96d11364-1ac3-4431-a4b9-920650c64c1e-kube-api-access-kbntx\") pod \"certified-operators-jcz2c\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.719400 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:12 crc kubenswrapper[5033]: I0226 16:55:12.977051 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcz2c"] Feb 26 16:55:13 crc kubenswrapper[5033]: I0226 16:55:13.145075 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcz2c" event={"ID":"96d11364-1ac3-4431-a4b9-920650c64c1e","Type":"ContainerStarted","Data":"f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c"} Feb 26 16:55:13 crc kubenswrapper[5033]: I0226 16:55:13.145351 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcz2c" event={"ID":"96d11364-1ac3-4431-a4b9-920650c64c1e","Type":"ContainerStarted","Data":"be672059cbb9861df04d6818e13ca07bfe5ffc0be6b237d272eac0b78652da82"} Feb 26 16:55:14 crc kubenswrapper[5033]: I0226 16:55:14.162376 5033 generic.go:334] "Generic (PLEG): container finished" podID="96d11364-1ac3-4431-a4b9-920650c64c1e" containerID="f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c" exitCode=0 Feb 26 16:55:14 crc kubenswrapper[5033]: I0226 16:55:14.162433 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcz2c" event={"ID":"96d11364-1ac3-4431-a4b9-920650c64c1e","Type":"ContainerDied","Data":"f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c"} Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.181033 5033 generic.go:334] "Generic (PLEG): container finished" podID="96d11364-1ac3-4431-a4b9-920650c64c1e" containerID="4210be22e0c5432a42117202b138d05cad3908aa238025b503c39cc8652ab5f2" exitCode=0 Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.181083 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcz2c" 
event={"ID":"96d11364-1ac3-4431-a4b9-920650c64c1e","Type":"ContainerDied","Data":"4210be22e0c5432a42117202b138d05cad3908aa238025b503c39cc8652ab5f2"} Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.727336 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kxfqc"] Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.730693 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.747627 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kxfqc"] Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.837232 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-catalog-content\") pod \"redhat-marketplace-kxfqc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.837325 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l27px\" (UniqueName: \"kubernetes.io/projected/4abc68cf-c09c-42d5-b418-d00907e46fdc-kube-api-access-l27px\") pod \"redhat-marketplace-kxfqc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.837370 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-utilities\") pod \"redhat-marketplace-kxfqc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.938526 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l27px\" (UniqueName: \"kubernetes.io/projected/4abc68cf-c09c-42d5-b418-d00907e46fdc-kube-api-access-l27px\") pod \"redhat-marketplace-kxfqc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.938580 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-utilities\") pod \"redhat-marketplace-kxfqc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.938651 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-catalog-content\") pod \"redhat-marketplace-kxfqc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.939136 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-catalog-content\") pod \"redhat-marketplace-kxfqc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.939219 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-utilities\") pod \"redhat-marketplace-kxfqc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:16 crc kubenswrapper[5033]: I0226 16:55:16.969899 5033 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-l27px\" (UniqueName: \"kubernetes.io/projected/4abc68cf-c09c-42d5-b418-d00907e46fdc-kube-api-access-l27px\") pod \"redhat-marketplace-kxfqc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:17 crc kubenswrapper[5033]: I0226 16:55:17.067686 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:17 crc kubenswrapper[5033]: I0226 16:55:17.192423 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcz2c" event={"ID":"96d11364-1ac3-4431-a4b9-920650c64c1e","Type":"ContainerStarted","Data":"f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7"} Feb 26 16:55:17 crc kubenswrapper[5033]: I0226 16:55:17.222138 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jcz2c" podStartSLOduration=2.51398804 podStartE2EDuration="5.222118845s" podCreationTimestamp="2026-02-26 16:55:12 +0000 UTC" firstStartedPulling="2026-02-26 16:55:14.165373635 +0000 UTC m=+4441.377972325" lastFinishedPulling="2026-02-26 16:55:16.87350447 +0000 UTC m=+4444.086103130" observedRunningTime="2026-02-26 16:55:17.216847397 +0000 UTC m=+4444.429446057" watchObservedRunningTime="2026-02-26 16:55:17.222118845 +0000 UTC m=+4444.434717495" Feb 26 16:55:17 crc kubenswrapper[5033]: I0226 16:55:17.500930 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kxfqc"] Feb 26 16:55:17 crc kubenswrapper[5033]: W0226 16:55:17.512181 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4abc68cf_c09c_42d5_b418_d00907e46fdc.slice/crio-a7b25b9e7ebbf8f50a369c6000bc531238eea929da7f37b81cf519302b3b7485 WatchSource:0}: Error finding container 
a7b25b9e7ebbf8f50a369c6000bc531238eea929da7f37b81cf519302b3b7485: Status 404 returned error can't find the container with id a7b25b9e7ebbf8f50a369c6000bc531238eea929da7f37b81cf519302b3b7485 Feb 26 16:55:18 crc kubenswrapper[5033]: I0226 16:55:18.202656 5033 generic.go:334] "Generic (PLEG): container finished" podID="4abc68cf-c09c-42d5-b418-d00907e46fdc" containerID="2e9c8d944032ea8003847ba081af1d09d9e5c79f642b0a86e5aba3ca54a8c60b" exitCode=0 Feb 26 16:55:18 crc kubenswrapper[5033]: I0226 16:55:18.202745 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kxfqc" event={"ID":"4abc68cf-c09c-42d5-b418-d00907e46fdc","Type":"ContainerDied","Data":"2e9c8d944032ea8003847ba081af1d09d9e5c79f642b0a86e5aba3ca54a8c60b"} Feb 26 16:55:18 crc kubenswrapper[5033]: I0226 16:55:18.203070 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kxfqc" event={"ID":"4abc68cf-c09c-42d5-b418-d00907e46fdc","Type":"ContainerStarted","Data":"a7b25b9e7ebbf8f50a369c6000bc531238eea929da7f37b81cf519302b3b7485"} Feb 26 16:55:20 crc kubenswrapper[5033]: I0226 16:55:20.224646 5033 generic.go:334] "Generic (PLEG): container finished" podID="4abc68cf-c09c-42d5-b418-d00907e46fdc" containerID="6d03c9bbb794bfab9645923595b7aefa5d75935ee283fa19ccafb605a03bf73c" exitCode=0 Feb 26 16:55:20 crc kubenswrapper[5033]: I0226 16:55:20.224761 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kxfqc" event={"ID":"4abc68cf-c09c-42d5-b418-d00907e46fdc","Type":"ContainerDied","Data":"6d03c9bbb794bfab9645923595b7aefa5d75935ee283fa19ccafb605a03bf73c"} Feb 26 16:55:21 crc kubenswrapper[5033]: I0226 16:55:21.238571 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kxfqc" event={"ID":"4abc68cf-c09c-42d5-b418-d00907e46fdc","Type":"ContainerStarted","Data":"1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb"} Feb 26 
16:55:22 crc kubenswrapper[5033]: I0226 16:55:22.720292 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:22 crc kubenswrapper[5033]: I0226 16:55:22.720360 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:22 crc kubenswrapper[5033]: I0226 16:55:22.786692 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:22 crc kubenswrapper[5033]: I0226 16:55:22.816845 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kxfqc" podStartSLOduration=4.416916018 podStartE2EDuration="6.816814019s" podCreationTimestamp="2026-02-26 16:55:16 +0000 UTC" firstStartedPulling="2026-02-26 16:55:18.205233466 +0000 UTC m=+4445.417832116" lastFinishedPulling="2026-02-26 16:55:20.605131467 +0000 UTC m=+4447.817730117" observedRunningTime="2026-02-26 16:55:21.266331782 +0000 UTC m=+4448.478930432" watchObservedRunningTime="2026-02-26 16:55:22.816814019 +0000 UTC m=+4450.029412719" Feb 26 16:55:23 crc kubenswrapper[5033]: I0226 16:55:23.315719 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:23 crc kubenswrapper[5033]: I0226 16:55:23.635499 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" Feb 26 16:55:23 crc kubenswrapper[5033]: E0226 16:55:23.635909 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:55:24 crc kubenswrapper[5033]: I0226 16:55:24.310575 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jcz2c"] Feb 26 16:55:25 crc kubenswrapper[5033]: I0226 16:55:25.271521 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jcz2c" podUID="96d11364-1ac3-4431-a4b9-920650c64c1e" containerName="registry-server" containerID="cri-o://f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7" gracePeriod=2 Feb 26 16:55:25 crc kubenswrapper[5033]: I0226 16:55:25.742017 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:25 crc kubenswrapper[5033]: I0226 16:55:25.896866 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-catalog-content\") pod \"96d11364-1ac3-4431-a4b9-920650c64c1e\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " Feb 26 16:55:25 crc kubenswrapper[5033]: I0226 16:55:25.896935 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-utilities\") pod \"96d11364-1ac3-4431-a4b9-920650c64c1e\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " Feb 26 16:55:25 crc kubenswrapper[5033]: I0226 16:55:25.896996 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbntx\" (UniqueName: \"kubernetes.io/projected/96d11364-1ac3-4431-a4b9-920650c64c1e-kube-api-access-kbntx\") pod \"96d11364-1ac3-4431-a4b9-920650c64c1e\" (UID: \"96d11364-1ac3-4431-a4b9-920650c64c1e\") " Feb 26 16:55:25 crc kubenswrapper[5033]: I0226 16:55:25.897817 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-utilities" (OuterVolumeSpecName: "utilities") pod "96d11364-1ac3-4431-a4b9-920650c64c1e" (UID: "96d11364-1ac3-4431-a4b9-920650c64c1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:55:25 crc kubenswrapper[5033]: I0226 16:55:25.903547 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96d11364-1ac3-4431-a4b9-920650c64c1e-kube-api-access-kbntx" (OuterVolumeSpecName: "kube-api-access-kbntx") pod "96d11364-1ac3-4431-a4b9-920650c64c1e" (UID: "96d11364-1ac3-4431-a4b9-920650c64c1e"). InnerVolumeSpecName "kube-api-access-kbntx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:55:25 crc kubenswrapper[5033]: I0226 16:55:25.999190 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:55:25 crc kubenswrapper[5033]: I0226 16:55:25.999244 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbntx\" (UniqueName: \"kubernetes.io/projected/96d11364-1ac3-4431-a4b9-920650c64c1e-kube-api-access-kbntx\") on node \"crc\" DevicePath \"\"" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.280072 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jcz2c" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.280347 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcz2c" event={"ID":"96d11364-1ac3-4431-a4b9-920650c64c1e","Type":"ContainerDied","Data":"f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7"} Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.280414 5033 scope.go:117] "RemoveContainer" containerID="f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.280978 5033 generic.go:334] "Generic (PLEG): container finished" podID="96d11364-1ac3-4431-a4b9-920650c64c1e" containerID="f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7" exitCode=0 Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.281067 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcz2c" event={"ID":"96d11364-1ac3-4431-a4b9-920650c64c1e","Type":"ContainerDied","Data":"be672059cbb9861df04d6818e13ca07bfe5ffc0be6b237d272eac0b78652da82"} Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.300817 5033 scope.go:117] "RemoveContainer" containerID="4210be22e0c5432a42117202b138d05cad3908aa238025b503c39cc8652ab5f2" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.316633 5033 scope.go:117] "RemoveContainer" containerID="f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.338572 5033 scope.go:117] "RemoveContainer" containerID="f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7" Feb 26 16:55:26 crc kubenswrapper[5033]: E0226 16:55:26.339174 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7\": container with ID starting with 
f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7 not found: ID does not exist" containerID="f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.339221 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7"} err="failed to get container status \"f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7\": rpc error: code = NotFound desc = could not find container \"f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7\": container with ID starting with f4ff83368019d4c01d0176d8e2167a8c7695b432a1d187b305a2c600248848f7 not found: ID does not exist" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.339269 5033 scope.go:117] "RemoveContainer" containerID="4210be22e0c5432a42117202b138d05cad3908aa238025b503c39cc8652ab5f2" Feb 26 16:55:26 crc kubenswrapper[5033]: E0226 16:55:26.339629 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4210be22e0c5432a42117202b138d05cad3908aa238025b503c39cc8652ab5f2\": container with ID starting with 4210be22e0c5432a42117202b138d05cad3908aa238025b503c39cc8652ab5f2 not found: ID does not exist" containerID="4210be22e0c5432a42117202b138d05cad3908aa238025b503c39cc8652ab5f2" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.339651 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4210be22e0c5432a42117202b138d05cad3908aa238025b503c39cc8652ab5f2"} err="failed to get container status \"4210be22e0c5432a42117202b138d05cad3908aa238025b503c39cc8652ab5f2\": rpc error: code = NotFound desc = could not find container \"4210be22e0c5432a42117202b138d05cad3908aa238025b503c39cc8652ab5f2\": container with ID starting with 4210be22e0c5432a42117202b138d05cad3908aa238025b503c39cc8652ab5f2 not found: ID does not 
exist" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.339662 5033 scope.go:117] "RemoveContainer" containerID="f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c" Feb 26 16:55:26 crc kubenswrapper[5033]: E0226 16:55:26.339855 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c\": container with ID starting with f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c not found: ID does not exist" containerID="f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.339872 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c"} err="failed to get container status \"f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c\": rpc error: code = NotFound desc = could not find container \"f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c\": container with ID starting with f5003992bdf04c2606b6a28b3fb93a4ba1ed6e3fa2e085a71d05cbe6b53bf28c not found: ID does not exist" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.721906 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96d11364-1ac3-4431-a4b9-920650c64c1e" (UID: "96d11364-1ac3-4431-a4b9-920650c64c1e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.809322 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d11364-1ac3-4431-a4b9-920650c64c1e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.920588 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jcz2c"] Feb 26 16:55:26 crc kubenswrapper[5033]: I0226 16:55:26.926833 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jcz2c"] Feb 26 16:55:27 crc kubenswrapper[5033]: I0226 16:55:27.068175 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:27 crc kubenswrapper[5033]: I0226 16:55:27.068232 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:27 crc kubenswrapper[5033]: I0226 16:55:27.137832 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:27 crc kubenswrapper[5033]: I0226 16:55:27.354938 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:27 crc kubenswrapper[5033]: I0226 16:55:27.642580 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96d11364-1ac3-4431-a4b9-920650c64c1e" path="/var/lib/kubelet/pods/96d11364-1ac3-4431-a4b9-920650c64c1e/volumes" Feb 26 16:55:29 crc kubenswrapper[5033]: I0226 16:55:29.514679 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kxfqc"] Feb 26 16:55:30 crc kubenswrapper[5033]: I0226 16:55:30.317393 5033 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-kxfqc" podUID="4abc68cf-c09c-42d5-b418-d00907e46fdc" containerName="registry-server" containerID="cri-o://1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb" gracePeriod=2 Feb 26 16:55:30 crc kubenswrapper[5033]: I0226 16:55:30.755033 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:30 crc kubenswrapper[5033]: I0226 16:55:30.869133 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l27px\" (UniqueName: \"kubernetes.io/projected/4abc68cf-c09c-42d5-b418-d00907e46fdc-kube-api-access-l27px\") pod \"4abc68cf-c09c-42d5-b418-d00907e46fdc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " Feb 26 16:55:30 crc kubenswrapper[5033]: I0226 16:55:30.869174 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-catalog-content\") pod \"4abc68cf-c09c-42d5-b418-d00907e46fdc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " Feb 26 16:55:30 crc kubenswrapper[5033]: I0226 16:55:30.869250 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-utilities\") pod \"4abc68cf-c09c-42d5-b418-d00907e46fdc\" (UID: \"4abc68cf-c09c-42d5-b418-d00907e46fdc\") " Feb 26 16:55:30 crc kubenswrapper[5033]: I0226 16:55:30.870466 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-utilities" (OuterVolumeSpecName: "utilities") pod "4abc68cf-c09c-42d5-b418-d00907e46fdc" (UID: "4abc68cf-c09c-42d5-b418-d00907e46fdc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:55:30 crc kubenswrapper[5033]: I0226 16:55:30.875073 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abc68cf-c09c-42d5-b418-d00907e46fdc-kube-api-access-l27px" (OuterVolumeSpecName: "kube-api-access-l27px") pod "4abc68cf-c09c-42d5-b418-d00907e46fdc" (UID: "4abc68cf-c09c-42d5-b418-d00907e46fdc"). InnerVolumeSpecName "kube-api-access-l27px". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:55:30 crc kubenswrapper[5033]: I0226 16:55:30.900295 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4abc68cf-c09c-42d5-b418-d00907e46fdc" (UID: "4abc68cf-c09c-42d5-b418-d00907e46fdc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:55:30 crc kubenswrapper[5033]: I0226 16:55:30.971014 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l27px\" (UniqueName: \"kubernetes.io/projected/4abc68cf-c09c-42d5-b418-d00907e46fdc-kube-api-access-l27px\") on node \"crc\" DevicePath \"\"" Feb 26 16:55:30 crc kubenswrapper[5033]: I0226 16:55:30.971052 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:55:30 crc kubenswrapper[5033]: I0226 16:55:30.971068 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abc68cf-c09c-42d5-b418-d00907e46fdc-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.335004 5033 generic.go:334] "Generic (PLEG): container finished" podID="4abc68cf-c09c-42d5-b418-d00907e46fdc" 
containerID="1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb" exitCode=0 Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.335083 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kxfqc" event={"ID":"4abc68cf-c09c-42d5-b418-d00907e46fdc","Type":"ContainerDied","Data":"1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb"} Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.335132 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kxfqc" event={"ID":"4abc68cf-c09c-42d5-b418-d00907e46fdc","Type":"ContainerDied","Data":"a7b25b9e7ebbf8f50a369c6000bc531238eea929da7f37b81cf519302b3b7485"} Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.335168 5033 scope.go:117] "RemoveContainer" containerID="1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb" Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.335746 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kxfqc" Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.374123 5033 scope.go:117] "RemoveContainer" containerID="6d03c9bbb794bfab9645923595b7aefa5d75935ee283fa19ccafb605a03bf73c" Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.392885 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kxfqc"] Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.402877 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kxfqc"] Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.406857 5033 scope.go:117] "RemoveContainer" containerID="2e9c8d944032ea8003847ba081af1d09d9e5c79f642b0a86e5aba3ca54a8c60b" Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.435679 5033 scope.go:117] "RemoveContainer" containerID="1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb" Feb 26 16:55:31 crc kubenswrapper[5033]: E0226 16:55:31.436673 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb\": container with ID starting with 1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb not found: ID does not exist" containerID="1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb" Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.437004 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb"} err="failed to get container status \"1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb\": rpc error: code = NotFound desc = could not find container \"1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb\": container with ID starting with 1fae4f37163a36485492be732f72a7861260e92a52ab163a000394fff259e5fb not found: 
ID does not exist" Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.437220 5033 scope.go:117] "RemoveContainer" containerID="6d03c9bbb794bfab9645923595b7aefa5d75935ee283fa19ccafb605a03bf73c" Feb 26 16:55:31 crc kubenswrapper[5033]: E0226 16:55:31.437953 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d03c9bbb794bfab9645923595b7aefa5d75935ee283fa19ccafb605a03bf73c\": container with ID starting with 6d03c9bbb794bfab9645923595b7aefa5d75935ee283fa19ccafb605a03bf73c not found: ID does not exist" containerID="6d03c9bbb794bfab9645923595b7aefa5d75935ee283fa19ccafb605a03bf73c" Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.437990 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d03c9bbb794bfab9645923595b7aefa5d75935ee283fa19ccafb605a03bf73c"} err="failed to get container status \"6d03c9bbb794bfab9645923595b7aefa5d75935ee283fa19ccafb605a03bf73c\": rpc error: code = NotFound desc = could not find container \"6d03c9bbb794bfab9645923595b7aefa5d75935ee283fa19ccafb605a03bf73c\": container with ID starting with 6d03c9bbb794bfab9645923595b7aefa5d75935ee283fa19ccafb605a03bf73c not found: ID does not exist" Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.438015 5033 scope.go:117] "RemoveContainer" containerID="2e9c8d944032ea8003847ba081af1d09d9e5c79f642b0a86e5aba3ca54a8c60b" Feb 26 16:55:31 crc kubenswrapper[5033]: E0226 16:55:31.438659 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e9c8d944032ea8003847ba081af1d09d9e5c79f642b0a86e5aba3ca54a8c60b\": container with ID starting with 2e9c8d944032ea8003847ba081af1d09d9e5c79f642b0a86e5aba3ca54a8c60b not found: ID does not exist" containerID="2e9c8d944032ea8003847ba081af1d09d9e5c79f642b0a86e5aba3ca54a8c60b" Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.438722 5033 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e9c8d944032ea8003847ba081af1d09d9e5c79f642b0a86e5aba3ca54a8c60b"} err="failed to get container status \"2e9c8d944032ea8003847ba081af1d09d9e5c79f642b0a86e5aba3ca54a8c60b\": rpc error: code = NotFound desc = could not find container \"2e9c8d944032ea8003847ba081af1d09d9e5c79f642b0a86e5aba3ca54a8c60b\": container with ID starting with 2e9c8d944032ea8003847ba081af1d09d9e5c79f642b0a86e5aba3ca54a8c60b not found: ID does not exist" Feb 26 16:55:31 crc kubenswrapper[5033]: I0226 16:55:31.644453 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4abc68cf-c09c-42d5-b418-d00907e46fdc" path="/var/lib/kubelet/pods/4abc68cf-c09c-42d5-b418-d00907e46fdc/volumes" Feb 26 16:55:38 crc kubenswrapper[5033]: I0226 16:55:38.628705 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" Feb 26 16:55:38 crc kubenswrapper[5033]: E0226 16:55:38.629685 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 16:55:51 crc kubenswrapper[5033]: I0226 16:55:51.628525 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" Feb 26 16:55:51 crc kubenswrapper[5033]: E0226 16:55:51.629743 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.154700 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535416-9fs9j"]
Feb 26 16:56:00 crc kubenswrapper[5033]: E0226 16:56:00.155461 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abc68cf-c09c-42d5-b418-d00907e46fdc" containerName="extract-content"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.155476 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abc68cf-c09c-42d5-b418-d00907e46fdc" containerName="extract-content"
Feb 26 16:56:00 crc kubenswrapper[5033]: E0226 16:56:00.155492 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d11364-1ac3-4431-a4b9-920650c64c1e" containerName="registry-server"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.155498 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d11364-1ac3-4431-a4b9-920650c64c1e" containerName="registry-server"
Feb 26 16:56:00 crc kubenswrapper[5033]: E0226 16:56:00.155512 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abc68cf-c09c-42d5-b418-d00907e46fdc" containerName="extract-utilities"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.155518 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abc68cf-c09c-42d5-b418-d00907e46fdc" containerName="extract-utilities"
Feb 26 16:56:00 crc kubenswrapper[5033]: E0226 16:56:00.155533 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d11364-1ac3-4431-a4b9-920650c64c1e" containerName="extract-utilities"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.155538 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d11364-1ac3-4431-a4b9-920650c64c1e" containerName="extract-utilities"
Feb 26 16:56:00 crc kubenswrapper[5033]: E0226 16:56:00.155547 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d11364-1ac3-4431-a4b9-920650c64c1e" containerName="extract-content"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.155553 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d11364-1ac3-4431-a4b9-920650c64c1e" containerName="extract-content"
Feb 26 16:56:00 crc kubenswrapper[5033]: E0226 16:56:00.155566 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abc68cf-c09c-42d5-b418-d00907e46fdc" containerName="registry-server"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.155572 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abc68cf-c09c-42d5-b418-d00907e46fdc" containerName="registry-server"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.155708 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="96d11364-1ac3-4431-a4b9-920650c64c1e" containerName="registry-server"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.155726 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abc68cf-c09c-42d5-b418-d00907e46fdc" containerName="registry-server"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.156129 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535416-9fs9j"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.159101 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.159639 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.160967 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.180374 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535416-9fs9j"]
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.260388 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxjsc\" (UniqueName: \"kubernetes.io/projected/87f3eb14-b53a-4ef0-87c3-644c41acd01e-kube-api-access-xxjsc\") pod \"auto-csr-approver-29535416-9fs9j\" (UID: \"87f3eb14-b53a-4ef0-87c3-644c41acd01e\") " pod="openshift-infra/auto-csr-approver-29535416-9fs9j"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.362586 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxjsc\" (UniqueName: \"kubernetes.io/projected/87f3eb14-b53a-4ef0-87c3-644c41acd01e-kube-api-access-xxjsc\") pod \"auto-csr-approver-29535416-9fs9j\" (UID: \"87f3eb14-b53a-4ef0-87c3-644c41acd01e\") " pod="openshift-infra/auto-csr-approver-29535416-9fs9j"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.395675 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxjsc\" (UniqueName: \"kubernetes.io/projected/87f3eb14-b53a-4ef0-87c3-644c41acd01e-kube-api-access-xxjsc\") pod \"auto-csr-approver-29535416-9fs9j\" (UID: \"87f3eb14-b53a-4ef0-87c3-644c41acd01e\") " pod="openshift-infra/auto-csr-approver-29535416-9fs9j"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.476051 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535416-9fs9j"
Feb 26 16:56:00 crc kubenswrapper[5033]: I0226 16:56:00.965556 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535416-9fs9j"]
Feb 26 16:56:01 crc kubenswrapper[5033]: I0226 16:56:01.601333 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535416-9fs9j" event={"ID":"87f3eb14-b53a-4ef0-87c3-644c41acd01e","Type":"ContainerStarted","Data":"114845c63104d63f26a06b95ac44a1ff1be2ab7f948704d9ded0fdc8ab36dedb"}
Feb 26 16:56:02 crc kubenswrapper[5033]: I0226 16:56:02.613066 5033 generic.go:334] "Generic (PLEG): container finished" podID="87f3eb14-b53a-4ef0-87c3-644c41acd01e" containerID="bb2d1386ae9a53e5ff134a2af183ebce1a0c6777fda4e81eb3504ed4ad19ce3e" exitCode=0
Feb 26 16:56:02 crc kubenswrapper[5033]: I0226 16:56:02.613152 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535416-9fs9j" event={"ID":"87f3eb14-b53a-4ef0-87c3-644c41acd01e","Type":"ContainerDied","Data":"bb2d1386ae9a53e5ff134a2af183ebce1a0c6777fda4e81eb3504ed4ad19ce3e"}
Feb 26 16:56:03 crc kubenswrapper[5033]: I0226 16:56:03.943131 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535416-9fs9j"
Feb 26 16:56:04 crc kubenswrapper[5033]: I0226 16:56:04.120100 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxjsc\" (UniqueName: \"kubernetes.io/projected/87f3eb14-b53a-4ef0-87c3-644c41acd01e-kube-api-access-xxjsc\") pod \"87f3eb14-b53a-4ef0-87c3-644c41acd01e\" (UID: \"87f3eb14-b53a-4ef0-87c3-644c41acd01e\") "
Feb 26 16:56:04 crc kubenswrapper[5033]: I0226 16:56:04.129374 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f3eb14-b53a-4ef0-87c3-644c41acd01e-kube-api-access-xxjsc" (OuterVolumeSpecName: "kube-api-access-xxjsc") pod "87f3eb14-b53a-4ef0-87c3-644c41acd01e" (UID: "87f3eb14-b53a-4ef0-87c3-644c41acd01e"). InnerVolumeSpecName "kube-api-access-xxjsc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:56:04 crc kubenswrapper[5033]: I0226 16:56:04.222338 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxjsc\" (UniqueName: \"kubernetes.io/projected/87f3eb14-b53a-4ef0-87c3-644c41acd01e-kube-api-access-xxjsc\") on node \"crc\" DevicePath \"\""
Feb 26 16:56:04 crc kubenswrapper[5033]: I0226 16:56:04.627940 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:56:04 crc kubenswrapper[5033]: E0226 16:56:04.628227 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:56:04 crc kubenswrapper[5033]: I0226 16:56:04.630799 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535416-9fs9j" event={"ID":"87f3eb14-b53a-4ef0-87c3-644c41acd01e","Type":"ContainerDied","Data":"114845c63104d63f26a06b95ac44a1ff1be2ab7f948704d9ded0fdc8ab36dedb"}
Feb 26 16:56:04 crc kubenswrapper[5033]: I0226 16:56:04.630838 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="114845c63104d63f26a06b95ac44a1ff1be2ab7f948704d9ded0fdc8ab36dedb"
Feb 26 16:56:04 crc kubenswrapper[5033]: I0226 16:56:04.630847 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535416-9fs9j"
Feb 26 16:56:05 crc kubenswrapper[5033]: I0226 16:56:05.029452 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535410-g7ttl"]
Feb 26 16:56:05 crc kubenswrapper[5033]: I0226 16:56:05.037662 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535410-g7ttl"]
Feb 26 16:56:05 crc kubenswrapper[5033]: I0226 16:56:05.644021 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d1ed0d8-c83f-4717-a408-6192539ff0ce" path="/var/lib/kubelet/pods/7d1ed0d8-c83f-4717-a408-6192539ff0ce/volumes"
Feb 26 16:56:15 crc kubenswrapper[5033]: I0226 16:56:15.628207 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:56:15 crc kubenswrapper[5033]: E0226 16:56:15.629131 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:56:26 crc kubenswrapper[5033]: I0226 16:56:26.628891 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:56:26 crc kubenswrapper[5033]: E0226 16:56:26.630305 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:56:32 crc kubenswrapper[5033]: I0226 16:56:32.166531 5033 scope.go:117] "RemoveContainer" containerID="dc42205538cf81754a2ab25a6e8da207ef40e7050402b726e0591b000d9de73d"
Feb 26 16:56:38 crc kubenswrapper[5033]: I0226 16:56:38.630106 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:56:38 crc kubenswrapper[5033]: E0226 16:56:38.631526 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:56:53 crc kubenswrapper[5033]: I0226 16:56:53.636579 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:56:53 crc kubenswrapper[5033]: E0226 16:56:53.637560 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:57:08 crc kubenswrapper[5033]: I0226 16:57:08.628592 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:57:08 crc kubenswrapper[5033]: E0226 16:57:08.629891 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:57:21 crc kubenswrapper[5033]: I0226 16:57:21.629160 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:57:21 crc kubenswrapper[5033]: E0226 16:57:21.630779 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:57:35 crc kubenswrapper[5033]: I0226 16:57:35.629019 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:57:35 crc kubenswrapper[5033]: E0226 16:57:35.630010 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:57:49 crc kubenswrapper[5033]: I0226 16:57:49.629517 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:57:49 crc kubenswrapper[5033]: E0226 16:57:49.630821 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.147985 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535418-jhmvc"]
Feb 26 16:58:00 crc kubenswrapper[5033]: E0226 16:58:00.149066 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f3eb14-b53a-4ef0-87c3-644c41acd01e" containerName="oc"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.149086 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f3eb14-b53a-4ef0-87c3-644c41acd01e" containerName="oc"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.149360 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f3eb14-b53a-4ef0-87c3-644c41acd01e" containerName="oc"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.150085 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535418-jhmvc"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.155606 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.155636 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.157607 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.171214 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535418-jhmvc"]
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.257090 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdwch\" (UniqueName: \"kubernetes.io/projected/26980423-2111-46cc-b0cd-b247409db6a5-kube-api-access-bdwch\") pod \"auto-csr-approver-29535418-jhmvc\" (UID: \"26980423-2111-46cc-b0cd-b247409db6a5\") " pod="openshift-infra/auto-csr-approver-29535418-jhmvc"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.359194 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdwch\" (UniqueName: \"kubernetes.io/projected/26980423-2111-46cc-b0cd-b247409db6a5-kube-api-access-bdwch\") pod \"auto-csr-approver-29535418-jhmvc\" (UID: \"26980423-2111-46cc-b0cd-b247409db6a5\") " pod="openshift-infra/auto-csr-approver-29535418-jhmvc"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.381179 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdwch\" (UniqueName: \"kubernetes.io/projected/26980423-2111-46cc-b0cd-b247409db6a5-kube-api-access-bdwch\") pod \"auto-csr-approver-29535418-jhmvc\" (UID: \"26980423-2111-46cc-b0cd-b247409db6a5\") " pod="openshift-infra/auto-csr-approver-29535418-jhmvc"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.510869 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535418-jhmvc"
Feb 26 16:58:00 crc kubenswrapper[5033]: I0226 16:58:00.966119 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535418-jhmvc"]
Feb 26 16:58:01 crc kubenswrapper[5033]: I0226 16:58:01.651627 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535418-jhmvc" event={"ID":"26980423-2111-46cc-b0cd-b247409db6a5","Type":"ContainerStarted","Data":"ec4e534ad080368833f23c46a061d5f3ffed96b3da6ce7510576732c7a237b05"}
Feb 26 16:58:02 crc kubenswrapper[5033]: I0226 16:58:02.628457 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:58:02 crc kubenswrapper[5033]: E0226 16:58:02.629065 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:58:02 crc kubenswrapper[5033]: I0226 16:58:02.659091 5033 generic.go:334] "Generic (PLEG): container finished" podID="26980423-2111-46cc-b0cd-b247409db6a5" containerID="e62a4bda19c45c778d715e558d22e47c0fde6b8bff69e01198d8f50dc7025c0d" exitCode=0
Feb 26 16:58:02 crc kubenswrapper[5033]: I0226 16:58:02.659134 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535418-jhmvc" event={"ID":"26980423-2111-46cc-b0cd-b247409db6a5","Type":"ContainerDied","Data":"e62a4bda19c45c778d715e558d22e47c0fde6b8bff69e01198d8f50dc7025c0d"}
Feb 26 16:58:03 crc kubenswrapper[5033]: I0226 16:58:03.930639 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535418-jhmvc"
Feb 26 16:58:04 crc kubenswrapper[5033]: I0226 16:58:04.122467 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdwch\" (UniqueName: \"kubernetes.io/projected/26980423-2111-46cc-b0cd-b247409db6a5-kube-api-access-bdwch\") pod \"26980423-2111-46cc-b0cd-b247409db6a5\" (UID: \"26980423-2111-46cc-b0cd-b247409db6a5\") "
Feb 26 16:58:04 crc kubenswrapper[5033]: I0226 16:58:04.131468 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26980423-2111-46cc-b0cd-b247409db6a5-kube-api-access-bdwch" (OuterVolumeSpecName: "kube-api-access-bdwch") pod "26980423-2111-46cc-b0cd-b247409db6a5" (UID: "26980423-2111-46cc-b0cd-b247409db6a5"). InnerVolumeSpecName "kube-api-access-bdwch". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:58:04 crc kubenswrapper[5033]: I0226 16:58:04.225360 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdwch\" (UniqueName: \"kubernetes.io/projected/26980423-2111-46cc-b0cd-b247409db6a5-kube-api-access-bdwch\") on node \"crc\" DevicePath \"\""
Feb 26 16:58:04 crc kubenswrapper[5033]: I0226 16:58:04.679477 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535418-jhmvc" event={"ID":"26980423-2111-46cc-b0cd-b247409db6a5","Type":"ContainerDied","Data":"ec4e534ad080368833f23c46a061d5f3ffed96b3da6ce7510576732c7a237b05"}
Feb 26 16:58:04 crc kubenswrapper[5033]: I0226 16:58:04.679846 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec4e534ad080368833f23c46a061d5f3ffed96b3da6ce7510576732c7a237b05"
Feb 26 16:58:04 crc kubenswrapper[5033]: I0226 16:58:04.679534 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535418-jhmvc"
Feb 26 16:58:05 crc kubenswrapper[5033]: I0226 16:58:05.009020 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535412-4vqtd"]
Feb 26 16:58:05 crc kubenswrapper[5033]: I0226 16:58:05.014996 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535412-4vqtd"]
Feb 26 16:58:05 crc kubenswrapper[5033]: I0226 16:58:05.636603 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be1e8e1b-0b39-4159-9fc5-6574e9030c79" path="/var/lib/kubelet/pods/be1e8e1b-0b39-4159-9fc5-6574e9030c79/volumes"
Feb 26 16:58:17 crc kubenswrapper[5033]: I0226 16:58:17.629634 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:58:17 crc kubenswrapper[5033]: E0226 16:58:17.630598 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:58:29 crc kubenswrapper[5033]: I0226 16:58:29.628516 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:58:29 crc kubenswrapper[5033]: E0226 16:58:29.629645 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:58:32 crc kubenswrapper[5033]: I0226 16:58:32.266442 5033 scope.go:117] "RemoveContainer" containerID="82652c0f3026154d45b5e1d3c157b853a7545069842c76c57649a303dfc24f56"
Feb 26 16:58:41 crc kubenswrapper[5033]: I0226 16:58:41.628666 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:58:41 crc kubenswrapper[5033]: E0226 16:58:41.629717 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 16:58:53 crc kubenswrapper[5033]: I0226 16:58:53.633923 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35"
Feb 26 16:58:54 crc kubenswrapper[5033]: I0226 16:58:54.123587 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"a4f312ff5f757f37f6f0328584c5b77de391b92d8488a888b49a674b4796ab8e"}
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.455829 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7d28v"]
Feb 26 16:59:10 crc kubenswrapper[5033]: E0226 16:59:10.456849 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26980423-2111-46cc-b0cd-b247409db6a5" containerName="oc"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.456870 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="26980423-2111-46cc-b0cd-b247409db6a5" containerName="oc"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.457108 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="26980423-2111-46cc-b0cd-b247409db6a5" containerName="oc"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.461324 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7d28v"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.509061 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7d28v"]
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.552420 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-catalog-content\") pod \"redhat-operators-7d28v\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " pod="openshift-marketplace/redhat-operators-7d28v"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.552511 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4pf5\" (UniqueName: \"kubernetes.io/projected/56f1d6eb-950f-4eae-8357-f77da894a2d8-kube-api-access-g4pf5\") pod \"redhat-operators-7d28v\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " pod="openshift-marketplace/redhat-operators-7d28v"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.552552 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-utilities\") pod \"redhat-operators-7d28v\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " pod="openshift-marketplace/redhat-operators-7d28v"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.653317 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-catalog-content\") pod \"redhat-operators-7d28v\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " pod="openshift-marketplace/redhat-operators-7d28v"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.653382 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4pf5\" (UniqueName: \"kubernetes.io/projected/56f1d6eb-950f-4eae-8357-f77da894a2d8-kube-api-access-g4pf5\") pod \"redhat-operators-7d28v\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " pod="openshift-marketplace/redhat-operators-7d28v"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.653410 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-utilities\") pod \"redhat-operators-7d28v\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " pod="openshift-marketplace/redhat-operators-7d28v"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.653910 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-utilities\") pod \"redhat-operators-7d28v\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " pod="openshift-marketplace/redhat-operators-7d28v"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.653901 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-catalog-content\") pod \"redhat-operators-7d28v\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " pod="openshift-marketplace/redhat-operators-7d28v"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.674050 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4pf5\" (UniqueName: \"kubernetes.io/projected/56f1d6eb-950f-4eae-8357-f77da894a2d8-kube-api-access-g4pf5\") pod \"redhat-operators-7d28v\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " pod="openshift-marketplace/redhat-operators-7d28v"
Feb 26 16:59:10 crc kubenswrapper[5033]: I0226 16:59:10.815918 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7d28v"
Feb 26 16:59:11 crc kubenswrapper[5033]: I0226 16:59:11.247980 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7d28v"]
Feb 26 16:59:11 crc kubenswrapper[5033]: I0226 16:59:11.295905 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d28v" event={"ID":"56f1d6eb-950f-4eae-8357-f77da894a2d8","Type":"ContainerStarted","Data":"aa0e3d3801806e7c11b0ba7294f0cda75f5d0f813433513c0ef55c0db1d11f09"}
Feb 26 16:59:12 crc kubenswrapper[5033]: I0226 16:59:12.307695 5033 generic.go:334] "Generic (PLEG): container finished" podID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerID="a50f0151f1af8556546918e3f5a7db6b0faab23e4aa3342a7f428914a2c888d3" exitCode=0
Feb 26 16:59:12 crc kubenswrapper[5033]: I0226 16:59:12.307768 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d28v" event={"ID":"56f1d6eb-950f-4eae-8357-f77da894a2d8","Type":"ContainerDied","Data":"a50f0151f1af8556546918e3f5a7db6b0faab23e4aa3342a7f428914a2c888d3"}
Feb 26 16:59:12 crc kubenswrapper[5033]: I0226 16:59:12.310506 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 16:59:14 crc kubenswrapper[5033]: I0226 16:59:14.330159 5033 generic.go:334] "Generic (PLEG): container finished" podID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerID="514d6294fe2d918f8d336a5311faf86eedaeadaf9dc4a5358a0ec9f77a5c2989" exitCode=0
Feb 26 16:59:14 crc kubenswrapper[5033]: I0226 16:59:14.330285 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d28v" event={"ID":"56f1d6eb-950f-4eae-8357-f77da894a2d8","Type":"ContainerDied","Data":"514d6294fe2d918f8d336a5311faf86eedaeadaf9dc4a5358a0ec9f77a5c2989"}
Feb 26 16:59:16 crc kubenswrapper[5033]: I0226 16:59:16.352999 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d28v" event={"ID":"56f1d6eb-950f-4eae-8357-f77da894a2d8","Type":"ContainerStarted","Data":"88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1"}
Feb 26 16:59:16 crc kubenswrapper[5033]: I0226 16:59:16.380313 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7d28v" podStartSLOduration=3.5961731 podStartE2EDuration="6.380297932s" podCreationTimestamp="2026-02-26 16:59:10 +0000 UTC" firstStartedPulling="2026-02-26 16:59:12.309998081 +0000 UTC m=+4679.522596761" lastFinishedPulling="2026-02-26 16:59:15.094122903 +0000 UTC m=+4682.306721593" observedRunningTime="2026-02-26 16:59:16.376022628 +0000 UTC m=+4683.588621278" watchObservedRunningTime="2026-02-26 16:59:16.380297932 +0000 UTC m=+4683.592896582"
Feb 26 16:59:17 crc kubenswrapper[5033]: I0226 16:59:17.876466 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-92b7r"]
Feb 26 16:59:17 crc kubenswrapper[5033]: I0226 16:59:17.887586 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-92b7r"]
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.039106 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-6pb7v"]
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.040513 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6pb7v"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.044657 5033 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xlhkr"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.045403 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.045675 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.046768 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.051970 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6pb7v"]
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.168281 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d05643ca-db5e-49a4-a1b8-f93f598539a9-crc-storage\") pod \"crc-storage-crc-6pb7v\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " pod="crc-storage/crc-storage-crc-6pb7v"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.168329 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sllg\" (UniqueName: \"kubernetes.io/projected/d05643ca-db5e-49a4-a1b8-f93f598539a9-kube-api-access-9sllg\") pod \"crc-storage-crc-6pb7v\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " pod="crc-storage/crc-storage-crc-6pb7v"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.168366 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d05643ca-db5e-49a4-a1b8-f93f598539a9-node-mnt\") pod \"crc-storage-crc-6pb7v\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " pod="crc-storage/crc-storage-crc-6pb7v"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.269776 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d05643ca-db5e-49a4-a1b8-f93f598539a9-crc-storage\") pod \"crc-storage-crc-6pb7v\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " pod="crc-storage/crc-storage-crc-6pb7v"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.269824 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sllg\" (UniqueName: \"kubernetes.io/projected/d05643ca-db5e-49a4-a1b8-f93f598539a9-kube-api-access-9sllg\") pod \"crc-storage-crc-6pb7v\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " pod="crc-storage/crc-storage-crc-6pb7v"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.269871 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d05643ca-db5e-49a4-a1b8-f93f598539a9-node-mnt\") pod \"crc-storage-crc-6pb7v\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " pod="crc-storage/crc-storage-crc-6pb7v"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.270225 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d05643ca-db5e-49a4-a1b8-f93f598539a9-node-mnt\") pod \"crc-storage-crc-6pb7v\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " pod="crc-storage/crc-storage-crc-6pb7v"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.270558 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d05643ca-db5e-49a4-a1b8-f93f598539a9-crc-storage\") pod \"crc-storage-crc-6pb7v\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " pod="crc-storage/crc-storage-crc-6pb7v"
Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.288405 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sllg\" (UniqueName: \"kubernetes.io/projected/d05643ca-db5e-49a4-a1b8-f93f598539a9-kube-api-access-9sllg\") pod \"crc-storage-crc-6pb7v\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " pod="crc-storage/crc-storage-crc-6pb7v" Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.358805 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6pb7v" Feb 26 16:59:18 crc kubenswrapper[5033]: I0226 16:59:18.868063 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6pb7v"] Feb 26 16:59:19 crc kubenswrapper[5033]: I0226 16:59:19.380142 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6pb7v" event={"ID":"d05643ca-db5e-49a4-a1b8-f93f598539a9","Type":"ContainerStarted","Data":"2fa60227a8babed5f6ec3c5beb0c24fb6514d9adab774f56ca73f88fe11e8ce2"} Feb 26 16:59:19 crc kubenswrapper[5033]: I0226 16:59:19.643929 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72c3abd5-9b43-4dc5-afbc-ba8b0433c512" path="/var/lib/kubelet/pods/72c3abd5-9b43-4dc5-afbc-ba8b0433c512/volumes" Feb 26 16:59:20 crc kubenswrapper[5033]: I0226 16:59:20.391038 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6pb7v" event={"ID":"d05643ca-db5e-49a4-a1b8-f93f598539a9","Type":"ContainerStarted","Data":"c2075e770d3ce75777505f3f2a586c10bb46c80baf0b29ea3fde8c090452d755"} Feb 26 16:59:20 crc kubenswrapper[5033]: I0226 16:59:20.418030 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-6pb7v" podStartSLOduration=1.704300749 podStartE2EDuration="2.418008919s" podCreationTimestamp="2026-02-26 16:59:18 +0000 UTC" firstStartedPulling="2026-02-26 16:59:18.872487093 +0000 UTC m=+4686.085085743" lastFinishedPulling="2026-02-26 16:59:19.586195243 +0000 UTC m=+4686.798793913" 
observedRunningTime="2026-02-26 16:59:20.408762745 +0000 UTC m=+4687.621361395" watchObservedRunningTime="2026-02-26 16:59:20.418008919 +0000 UTC m=+4687.630607579" Feb 26 16:59:20 crc kubenswrapper[5033]: I0226 16:59:20.816052 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7d28v" Feb 26 16:59:20 crc kubenswrapper[5033]: I0226 16:59:20.816520 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7d28v" Feb 26 16:59:21 crc kubenswrapper[5033]: I0226 16:59:21.404849 5033 generic.go:334] "Generic (PLEG): container finished" podID="d05643ca-db5e-49a4-a1b8-f93f598539a9" containerID="c2075e770d3ce75777505f3f2a586c10bb46c80baf0b29ea3fde8c090452d755" exitCode=0 Feb 26 16:59:21 crc kubenswrapper[5033]: I0226 16:59:21.404903 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6pb7v" event={"ID":"d05643ca-db5e-49a4-a1b8-f93f598539a9","Type":"ContainerDied","Data":"c2075e770d3ce75777505f3f2a586c10bb46c80baf0b29ea3fde8c090452d755"} Feb 26 16:59:21 crc kubenswrapper[5033]: I0226 16:59:21.885988 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7d28v" podUID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerName="registry-server" probeResult="failure" output=< Feb 26 16:59:21 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Feb 26 16:59:21 crc kubenswrapper[5033]: > Feb 26 16:59:22 crc kubenswrapper[5033]: I0226 16:59:22.707723 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6pb7v" Feb 26 16:59:22 crc kubenswrapper[5033]: I0226 16:59:22.837437 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d05643ca-db5e-49a4-a1b8-f93f598539a9-crc-storage\") pod \"d05643ca-db5e-49a4-a1b8-f93f598539a9\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " Feb 26 16:59:22 crc kubenswrapper[5033]: I0226 16:59:22.837561 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d05643ca-db5e-49a4-a1b8-f93f598539a9-node-mnt\") pod \"d05643ca-db5e-49a4-a1b8-f93f598539a9\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " Feb 26 16:59:22 crc kubenswrapper[5033]: I0226 16:59:22.837649 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sllg\" (UniqueName: \"kubernetes.io/projected/d05643ca-db5e-49a4-a1b8-f93f598539a9-kube-api-access-9sllg\") pod \"d05643ca-db5e-49a4-a1b8-f93f598539a9\" (UID: \"d05643ca-db5e-49a4-a1b8-f93f598539a9\") " Feb 26 16:59:22 crc kubenswrapper[5033]: I0226 16:59:22.837828 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d05643ca-db5e-49a4-a1b8-f93f598539a9-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d05643ca-db5e-49a4-a1b8-f93f598539a9" (UID: "d05643ca-db5e-49a4-a1b8-f93f598539a9"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:59:22 crc kubenswrapper[5033]: I0226 16:59:22.838043 5033 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d05643ca-db5e-49a4-a1b8-f93f598539a9-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:22 crc kubenswrapper[5033]: I0226 16:59:22.842748 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05643ca-db5e-49a4-a1b8-f93f598539a9-kube-api-access-9sllg" (OuterVolumeSpecName: "kube-api-access-9sllg") pod "d05643ca-db5e-49a4-a1b8-f93f598539a9" (UID: "d05643ca-db5e-49a4-a1b8-f93f598539a9"). InnerVolumeSpecName "kube-api-access-9sllg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:59:22 crc kubenswrapper[5033]: I0226 16:59:22.855929 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d05643ca-db5e-49a4-a1b8-f93f598539a9-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d05643ca-db5e-49a4-a1b8-f93f598539a9" (UID: "d05643ca-db5e-49a4-a1b8-f93f598539a9"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:59:22 crc kubenswrapper[5033]: I0226 16:59:22.939655 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sllg\" (UniqueName: \"kubernetes.io/projected/d05643ca-db5e-49a4-a1b8-f93f598539a9-kube-api-access-9sllg\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:22 crc kubenswrapper[5033]: I0226 16:59:22.939696 5033 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d05643ca-db5e-49a4-a1b8-f93f598539a9-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.420410 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6pb7v" event={"ID":"d05643ca-db5e-49a4-a1b8-f93f598539a9","Type":"ContainerDied","Data":"2fa60227a8babed5f6ec3c5beb0c24fb6514d9adab774f56ca73f88fe11e8ce2"} Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.420443 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fa60227a8babed5f6ec3c5beb0c24fb6514d9adab774f56ca73f88fe11e8ce2" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.420474 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6pb7v" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.791611 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-52msx"] Feb 26 16:59:23 crc kubenswrapper[5033]: E0226 16:59:23.792346 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05643ca-db5e-49a4-a1b8-f93f598539a9" containerName="storage" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.792363 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05643ca-db5e-49a4-a1b8-f93f598539a9" containerName="storage" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.792523 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05643ca-db5e-49a4-a1b8-f93f598539a9" containerName="storage" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.793486 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.826441 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-52msx"] Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.852911 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-utilities\") pod \"community-operators-52msx\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.852972 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp4r2\" (UniqueName: \"kubernetes.io/projected/e81a41b5-975b-4ddc-a914-c472e6c2c67b-kube-api-access-cp4r2\") pod \"community-operators-52msx\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " 
pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.853073 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-catalog-content\") pod \"community-operators-52msx\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.954457 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-utilities\") pod \"community-operators-52msx\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.954541 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp4r2\" (UniqueName: \"kubernetes.io/projected/e81a41b5-975b-4ddc-a914-c472e6c2c67b-kube-api-access-cp4r2\") pod \"community-operators-52msx\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.954594 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-catalog-content\") pod \"community-operators-52msx\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.955050 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-utilities\") pod \"community-operators-52msx\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " 
pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.955063 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-catalog-content\") pod \"community-operators-52msx\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:23 crc kubenswrapper[5033]: I0226 16:59:23.973433 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp4r2\" (UniqueName: \"kubernetes.io/projected/e81a41b5-975b-4ddc-a914-c472e6c2c67b-kube-api-access-cp4r2\") pod \"community-operators-52msx\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.121784 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.661412 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-52msx"] Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.752712 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-6pb7v"] Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.767553 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-6pb7v"] Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.882738 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-khlr6"] Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.883666 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.886598 5033 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xlhkr" Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.886947 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.887361 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.901726 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.905450 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-khlr6"] Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.977506 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnh7d\" (UniqueName: \"kubernetes.io/projected/7d76feac-e79a-4fac-8d15-f4c84da60a8e-kube-api-access-fnh7d\") pod \"crc-storage-crc-khlr6\" (UID: \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.977747 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7d76feac-e79a-4fac-8d15-f4c84da60a8e-crc-storage\") pod \"crc-storage-crc-khlr6\" (UID: \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:24 crc kubenswrapper[5033]: I0226 16:59:24.977784 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7d76feac-e79a-4fac-8d15-f4c84da60a8e-node-mnt\") pod \"crc-storage-crc-khlr6\" (UID: 
\"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:25 crc kubenswrapper[5033]: I0226 16:59:25.079013 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7d76feac-e79a-4fac-8d15-f4c84da60a8e-crc-storage\") pod \"crc-storage-crc-khlr6\" (UID: \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:25 crc kubenswrapper[5033]: I0226 16:59:25.079072 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7d76feac-e79a-4fac-8d15-f4c84da60a8e-node-mnt\") pod \"crc-storage-crc-khlr6\" (UID: \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:25 crc kubenswrapper[5033]: I0226 16:59:25.079145 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnh7d\" (UniqueName: \"kubernetes.io/projected/7d76feac-e79a-4fac-8d15-f4c84da60a8e-kube-api-access-fnh7d\") pod \"crc-storage-crc-khlr6\" (UID: \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:25 crc kubenswrapper[5033]: I0226 16:59:25.079772 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7d76feac-e79a-4fac-8d15-f4c84da60a8e-node-mnt\") pod \"crc-storage-crc-khlr6\" (UID: \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:25 crc kubenswrapper[5033]: I0226 16:59:25.080101 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7d76feac-e79a-4fac-8d15-f4c84da60a8e-crc-storage\") pod \"crc-storage-crc-khlr6\" (UID: \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:25 crc kubenswrapper[5033]: I0226 16:59:25.376814 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnh7d\" (UniqueName: \"kubernetes.io/projected/7d76feac-e79a-4fac-8d15-f4c84da60a8e-kube-api-access-fnh7d\") pod \"crc-storage-crc-khlr6\" (UID: \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:25 crc kubenswrapper[5033]: I0226 16:59:25.437877 5033 generic.go:334] "Generic (PLEG): container finished" podID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" containerID="bb24d1ccf318f7fa437cf4ef8c6710cb3e612052a2f65da9c84fd169c5cd218d" exitCode=0 Feb 26 16:59:25 crc kubenswrapper[5033]: I0226 16:59:25.437932 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52msx" event={"ID":"e81a41b5-975b-4ddc-a914-c472e6c2c67b","Type":"ContainerDied","Data":"bb24d1ccf318f7fa437cf4ef8c6710cb3e612052a2f65da9c84fd169c5cd218d"} Feb 26 16:59:25 crc kubenswrapper[5033]: I0226 16:59:25.437962 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52msx" event={"ID":"e81a41b5-975b-4ddc-a914-c472e6c2c67b","Type":"ContainerStarted","Data":"d9b2a4d1b9c6a9acf45d81811944e49cb1295a88d2eb666fc79531ba18ae37f5"} Feb 26 16:59:25 crc kubenswrapper[5033]: I0226 16:59:25.522894 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:25 crc kubenswrapper[5033]: I0226 16:59:25.641342 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d05643ca-db5e-49a4-a1b8-f93f598539a9" path="/var/lib/kubelet/pods/d05643ca-db5e-49a4-a1b8-f93f598539a9/volumes" Feb 26 16:59:26 crc kubenswrapper[5033]: I0226 16:59:26.095072 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-khlr6"] Feb 26 16:59:26 crc kubenswrapper[5033]: W0226 16:59:26.102285 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d76feac_e79a_4fac_8d15_f4c84da60a8e.slice/crio-8a5cb0281f81730b24735d6391a45a968706fe431f581c14abd4a89e08c2be84 WatchSource:0}: Error finding container 8a5cb0281f81730b24735d6391a45a968706fe431f581c14abd4a89e08c2be84: Status 404 returned error can't find the container with id 8a5cb0281f81730b24735d6391a45a968706fe431f581c14abd4a89e08c2be84 Feb 26 16:59:26 crc kubenswrapper[5033]: I0226 16:59:26.447324 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-khlr6" event={"ID":"7d76feac-e79a-4fac-8d15-f4c84da60a8e","Type":"ContainerStarted","Data":"8a5cb0281f81730b24735d6391a45a968706fe431f581c14abd4a89e08c2be84"} Feb 26 16:59:28 crc kubenswrapper[5033]: I0226 16:59:28.469012 5033 generic.go:334] "Generic (PLEG): container finished" podID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" containerID="75449044643b7f7860a244569d7571442169a1b1b69fefd1dd29a213486a80f1" exitCode=0 Feb 26 16:59:28 crc kubenswrapper[5033]: I0226 16:59:28.469273 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52msx" event={"ID":"e81a41b5-975b-4ddc-a914-c472e6c2c67b","Type":"ContainerDied","Data":"75449044643b7f7860a244569d7571442169a1b1b69fefd1dd29a213486a80f1"} Feb 26 16:59:29 crc kubenswrapper[5033]: I0226 16:59:29.481438 5033 generic.go:334] "Generic 
(PLEG): container finished" podID="7d76feac-e79a-4fac-8d15-f4c84da60a8e" containerID="442d9f643ea52fd8208aa028614ee9a9ca1fd0842f23697a125538f6f93d84f6" exitCode=0 Feb 26 16:59:29 crc kubenswrapper[5033]: I0226 16:59:29.481495 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-khlr6" event={"ID":"7d76feac-e79a-4fac-8d15-f4c84da60a8e","Type":"ContainerDied","Data":"442d9f643ea52fd8208aa028614ee9a9ca1fd0842f23697a125538f6f93d84f6"} Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.787155 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.882262 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7d28v" Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.886129 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7d76feac-e79a-4fac-8d15-f4c84da60a8e-node-mnt\") pod \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\" (UID: \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.886179 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnh7d\" (UniqueName: \"kubernetes.io/projected/7d76feac-e79a-4fac-8d15-f4c84da60a8e-kube-api-access-fnh7d\") pod \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\" (UID: \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.886204 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7d76feac-e79a-4fac-8d15-f4c84da60a8e-crc-storage\") pod \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\" (UID: \"7d76feac-e79a-4fac-8d15-f4c84da60a8e\") " Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.887448 5033 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d76feac-e79a-4fac-8d15-f4c84da60a8e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7d76feac-e79a-4fac-8d15-f4c84da60a8e" (UID: "7d76feac-e79a-4fac-8d15-f4c84da60a8e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.898017 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d76feac-e79a-4fac-8d15-f4c84da60a8e-kube-api-access-fnh7d" (OuterVolumeSpecName: "kube-api-access-fnh7d") pod "7d76feac-e79a-4fac-8d15-f4c84da60a8e" (UID: "7d76feac-e79a-4fac-8d15-f4c84da60a8e"). InnerVolumeSpecName "kube-api-access-fnh7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.919250 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d76feac-e79a-4fac-8d15-f4c84da60a8e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7d76feac-e79a-4fac-8d15-f4c84da60a8e" (UID: "7d76feac-e79a-4fac-8d15-f4c84da60a8e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.935210 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7d28v" Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.987570 5033 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7d76feac-e79a-4fac-8d15-f4c84da60a8e-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.987606 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnh7d\" (UniqueName: \"kubernetes.io/projected/7d76feac-e79a-4fac-8d15-f4c84da60a8e-kube-api-access-fnh7d\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:30 crc kubenswrapper[5033]: I0226 16:59:30.987620 5033 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7d76feac-e79a-4fac-8d15-f4c84da60a8e-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:31 crc kubenswrapper[5033]: I0226 16:59:31.500270 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-khlr6" event={"ID":"7d76feac-e79a-4fac-8d15-f4c84da60a8e","Type":"ContainerDied","Data":"8a5cb0281f81730b24735d6391a45a968706fe431f581c14abd4a89e08c2be84"} Feb 26 16:59:31 crc kubenswrapper[5033]: I0226 16:59:31.500292 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-khlr6" Feb 26 16:59:31 crc kubenswrapper[5033]: I0226 16:59:31.500340 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a5cb0281f81730b24735d6391a45a968706fe431f581c14abd4a89e08c2be84" Feb 26 16:59:31 crc kubenswrapper[5033]: I0226 16:59:31.503840 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52msx" event={"ID":"e81a41b5-975b-4ddc-a914-c472e6c2c67b","Type":"ContainerStarted","Data":"04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c"} Feb 26 16:59:31 crc kubenswrapper[5033]: I0226 16:59:31.540110 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-52msx" podStartSLOduration=3.451781955 podStartE2EDuration="8.540093004s" podCreationTimestamp="2026-02-26 16:59:23 +0000 UTC" firstStartedPulling="2026-02-26 16:59:25.44008933 +0000 UTC m=+4692.652687990" lastFinishedPulling="2026-02-26 16:59:30.528400389 +0000 UTC m=+4697.740999039" observedRunningTime="2026-02-26 16:59:31.528231115 +0000 UTC m=+4698.740829775" watchObservedRunningTime="2026-02-26 16:59:31.540093004 +0000 UTC m=+4698.752691644" Feb 26 16:59:31 crc kubenswrapper[5033]: I0226 16:59:31.580752 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7d28v"] Feb 26 16:59:32 crc kubenswrapper[5033]: I0226 16:59:32.349633 5033 scope.go:117] "RemoveContainer" containerID="a47c29de880901af71fb92193bb4ce5792e9f58369fcaa326aa5a05f90ca5b12" Feb 26 16:59:32 crc kubenswrapper[5033]: I0226 16:59:32.511701 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7d28v" podUID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerName="registry-server" containerID="cri-o://88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1" gracePeriod=2 Feb 26 16:59:32 crc kubenswrapper[5033]: 
I0226 16:59:32.985346 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7d28v" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.124636 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4pf5\" (UniqueName: \"kubernetes.io/projected/56f1d6eb-950f-4eae-8357-f77da894a2d8-kube-api-access-g4pf5\") pod \"56f1d6eb-950f-4eae-8357-f77da894a2d8\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.124732 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-utilities\") pod \"56f1d6eb-950f-4eae-8357-f77da894a2d8\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.124830 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-catalog-content\") pod \"56f1d6eb-950f-4eae-8357-f77da894a2d8\" (UID: \"56f1d6eb-950f-4eae-8357-f77da894a2d8\") " Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.128228 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-utilities" (OuterVolumeSpecName: "utilities") pod "56f1d6eb-950f-4eae-8357-f77da894a2d8" (UID: "56f1d6eb-950f-4eae-8357-f77da894a2d8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.134848 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f1d6eb-950f-4eae-8357-f77da894a2d8-kube-api-access-g4pf5" (OuterVolumeSpecName: "kube-api-access-g4pf5") pod "56f1d6eb-950f-4eae-8357-f77da894a2d8" (UID: "56f1d6eb-950f-4eae-8357-f77da894a2d8"). InnerVolumeSpecName "kube-api-access-g4pf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.226697 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4pf5\" (UniqueName: \"kubernetes.io/projected/56f1d6eb-950f-4eae-8357-f77da894a2d8-kube-api-access-g4pf5\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.226775 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.296071 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56f1d6eb-950f-4eae-8357-f77da894a2d8" (UID: "56f1d6eb-950f-4eae-8357-f77da894a2d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.328368 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f1d6eb-950f-4eae-8357-f77da894a2d8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.529502 5033 generic.go:334] "Generic (PLEG): container finished" podID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerID="88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1" exitCode=0 Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.529554 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d28v" event={"ID":"56f1d6eb-950f-4eae-8357-f77da894a2d8","Type":"ContainerDied","Data":"88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1"} Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.529602 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d28v" event={"ID":"56f1d6eb-950f-4eae-8357-f77da894a2d8","Type":"ContainerDied","Data":"aa0e3d3801806e7c11b0ba7294f0cda75f5d0f813433513c0ef55c0db1d11f09"} Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.529623 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7d28v" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.529633 5033 scope.go:117] "RemoveContainer" containerID="88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.553452 5033 scope.go:117] "RemoveContainer" containerID="514d6294fe2d918f8d336a5311faf86eedaeadaf9dc4a5358a0ec9f77a5c2989" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.582016 5033 scope.go:117] "RemoveContainer" containerID="a50f0151f1af8556546918e3f5a7db6b0faab23e4aa3342a7f428914a2c888d3" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.584780 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7d28v"] Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.596742 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7d28v"] Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.619600 5033 scope.go:117] "RemoveContainer" containerID="88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1" Feb 26 16:59:33 crc kubenswrapper[5033]: E0226 16:59:33.620098 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1\": container with ID starting with 88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1 not found: ID does not exist" containerID="88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.620141 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1"} err="failed to get container status \"88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1\": rpc error: code = NotFound desc = could not find container 
\"88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1\": container with ID starting with 88799cabed03745b44bdb272688e65534aceffaf358003ac218a34158ed499d1 not found: ID does not exist" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.620165 5033 scope.go:117] "RemoveContainer" containerID="514d6294fe2d918f8d336a5311faf86eedaeadaf9dc4a5358a0ec9f77a5c2989" Feb 26 16:59:33 crc kubenswrapper[5033]: E0226 16:59:33.621349 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514d6294fe2d918f8d336a5311faf86eedaeadaf9dc4a5358a0ec9f77a5c2989\": container with ID starting with 514d6294fe2d918f8d336a5311faf86eedaeadaf9dc4a5358a0ec9f77a5c2989 not found: ID does not exist" containerID="514d6294fe2d918f8d336a5311faf86eedaeadaf9dc4a5358a0ec9f77a5c2989" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.621405 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514d6294fe2d918f8d336a5311faf86eedaeadaf9dc4a5358a0ec9f77a5c2989"} err="failed to get container status \"514d6294fe2d918f8d336a5311faf86eedaeadaf9dc4a5358a0ec9f77a5c2989\": rpc error: code = NotFound desc = could not find container \"514d6294fe2d918f8d336a5311faf86eedaeadaf9dc4a5358a0ec9f77a5c2989\": container with ID starting with 514d6294fe2d918f8d336a5311faf86eedaeadaf9dc4a5358a0ec9f77a5c2989 not found: ID does not exist" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.621439 5033 scope.go:117] "RemoveContainer" containerID="a50f0151f1af8556546918e3f5a7db6b0faab23e4aa3342a7f428914a2c888d3" Feb 26 16:59:33 crc kubenswrapper[5033]: E0226 16:59:33.630558 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50f0151f1af8556546918e3f5a7db6b0faab23e4aa3342a7f428914a2c888d3\": container with ID starting with a50f0151f1af8556546918e3f5a7db6b0faab23e4aa3342a7f428914a2c888d3 not found: ID does not exist" 
containerID="a50f0151f1af8556546918e3f5a7db6b0faab23e4aa3342a7f428914a2c888d3" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.630611 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50f0151f1af8556546918e3f5a7db6b0faab23e4aa3342a7f428914a2c888d3"} err="failed to get container status \"a50f0151f1af8556546918e3f5a7db6b0faab23e4aa3342a7f428914a2c888d3\": rpc error: code = NotFound desc = could not find container \"a50f0151f1af8556546918e3f5a7db6b0faab23e4aa3342a7f428914a2c888d3\": container with ID starting with a50f0151f1af8556546918e3f5a7db6b0faab23e4aa3342a7f428914a2c888d3 not found: ID does not exist" Feb 26 16:59:33 crc kubenswrapper[5033]: I0226 16:59:33.658072 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f1d6eb-950f-4eae-8357-f77da894a2d8" path="/var/lib/kubelet/pods/56f1d6eb-950f-4eae-8357-f77da894a2d8/volumes" Feb 26 16:59:34 crc kubenswrapper[5033]: I0226 16:59:34.122918 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:34 crc kubenswrapper[5033]: I0226 16:59:34.122986 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:34 crc kubenswrapper[5033]: I0226 16:59:34.194393 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:44 crc kubenswrapper[5033]: I0226 16:59:44.177123 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:44 crc kubenswrapper[5033]: I0226 16:59:44.237156 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-52msx"] Feb 26 16:59:44 crc kubenswrapper[5033]: I0226 16:59:44.627778 5033 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-52msx" podUID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" containerName="registry-server" containerID="cri-o://04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c" gracePeriod=2 Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.093650 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.234741 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp4r2\" (UniqueName: \"kubernetes.io/projected/e81a41b5-975b-4ddc-a914-c472e6c2c67b-kube-api-access-cp4r2\") pod \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.234822 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-catalog-content\") pod \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.234895 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-utilities\") pod \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\" (UID: \"e81a41b5-975b-4ddc-a914-c472e6c2c67b\") " Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.235930 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-utilities" (OuterVolumeSpecName: "utilities") pod "e81a41b5-975b-4ddc-a914-c472e6c2c67b" (UID: "e81a41b5-975b-4ddc-a914-c472e6c2c67b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.243546 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81a41b5-975b-4ddc-a914-c472e6c2c67b-kube-api-access-cp4r2" (OuterVolumeSpecName: "kube-api-access-cp4r2") pod "e81a41b5-975b-4ddc-a914-c472e6c2c67b" (UID: "e81a41b5-975b-4ddc-a914-c472e6c2c67b"). InnerVolumeSpecName "kube-api-access-cp4r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.295498 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e81a41b5-975b-4ddc-a914-c472e6c2c67b" (UID: "e81a41b5-975b-4ddc-a914-c472e6c2c67b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.336471 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.336512 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp4r2\" (UniqueName: \"kubernetes.io/projected/e81a41b5-975b-4ddc-a914-c472e6c2c67b-kube-api-access-cp4r2\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.336525 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e81a41b5-975b-4ddc-a914-c472e6c2c67b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.640926 5033 generic.go:334] "Generic (PLEG): container finished" podID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" 
containerID="04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c" exitCode=0 Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.641023 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52msx" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.643000 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52msx" event={"ID":"e81a41b5-975b-4ddc-a914-c472e6c2c67b","Type":"ContainerDied","Data":"04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c"} Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.643052 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52msx" event={"ID":"e81a41b5-975b-4ddc-a914-c472e6c2c67b","Type":"ContainerDied","Data":"d9b2a4d1b9c6a9acf45d81811944e49cb1295a88d2eb666fc79531ba18ae37f5"} Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.643076 5033 scope.go:117] "RemoveContainer" containerID="04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.679706 5033 scope.go:117] "RemoveContainer" containerID="75449044643b7f7860a244569d7571442169a1b1b69fefd1dd29a213486a80f1" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.681725 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-52msx"] Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.689334 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-52msx"] Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.699642 5033 scope.go:117] "RemoveContainer" containerID="bb24d1ccf318f7fa437cf4ef8c6710cb3e612052a2f65da9c84fd169c5cd218d" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.719249 5033 scope.go:117] "RemoveContainer" containerID="04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c" Feb 26 
16:59:45 crc kubenswrapper[5033]: E0226 16:59:45.719980 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c\": container with ID starting with 04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c not found: ID does not exist" containerID="04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.720028 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c"} err="failed to get container status \"04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c\": rpc error: code = NotFound desc = could not find container \"04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c\": container with ID starting with 04b09701dc98a5358a062f7f00b571a1a5df6c90a843e9813d90f34808884d1c not found: ID does not exist" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.720055 5033 scope.go:117] "RemoveContainer" containerID="75449044643b7f7860a244569d7571442169a1b1b69fefd1dd29a213486a80f1" Feb 26 16:59:45 crc kubenswrapper[5033]: E0226 16:59:45.720582 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75449044643b7f7860a244569d7571442169a1b1b69fefd1dd29a213486a80f1\": container with ID starting with 75449044643b7f7860a244569d7571442169a1b1b69fefd1dd29a213486a80f1 not found: ID does not exist" containerID="75449044643b7f7860a244569d7571442169a1b1b69fefd1dd29a213486a80f1" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.720631 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75449044643b7f7860a244569d7571442169a1b1b69fefd1dd29a213486a80f1"} err="failed to get container status 
\"75449044643b7f7860a244569d7571442169a1b1b69fefd1dd29a213486a80f1\": rpc error: code = NotFound desc = could not find container \"75449044643b7f7860a244569d7571442169a1b1b69fefd1dd29a213486a80f1\": container with ID starting with 75449044643b7f7860a244569d7571442169a1b1b69fefd1dd29a213486a80f1 not found: ID does not exist" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.720675 5033 scope.go:117] "RemoveContainer" containerID="bb24d1ccf318f7fa437cf4ef8c6710cb3e612052a2f65da9c84fd169c5cd218d" Feb 26 16:59:45 crc kubenswrapper[5033]: E0226 16:59:45.721004 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb24d1ccf318f7fa437cf4ef8c6710cb3e612052a2f65da9c84fd169c5cd218d\": container with ID starting with bb24d1ccf318f7fa437cf4ef8c6710cb3e612052a2f65da9c84fd169c5cd218d not found: ID does not exist" containerID="bb24d1ccf318f7fa437cf4ef8c6710cb3e612052a2f65da9c84fd169c5cd218d" Feb 26 16:59:45 crc kubenswrapper[5033]: I0226 16:59:45.721031 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb24d1ccf318f7fa437cf4ef8c6710cb3e612052a2f65da9c84fd169c5cd218d"} err="failed to get container status \"bb24d1ccf318f7fa437cf4ef8c6710cb3e612052a2f65da9c84fd169c5cd218d\": rpc error: code = NotFound desc = could not find container \"bb24d1ccf318f7fa437cf4ef8c6710cb3e612052a2f65da9c84fd169c5cd218d\": container with ID starting with bb24d1ccf318f7fa437cf4ef8c6710cb3e612052a2f65da9c84fd169c5cd218d not found: ID does not exist" Feb 26 16:59:47 crc kubenswrapper[5033]: I0226 16:59:47.643681 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" path="/var/lib/kubelet/pods/e81a41b5-975b-4ddc-a914-c472e6c2c67b/volumes" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.152037 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535420-hcbwm"] Feb 26 17:00:00 
crc kubenswrapper[5033]: E0226 17:00:00.153397 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d76feac-e79a-4fac-8d15-f4c84da60a8e" containerName="storage" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.153426 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d76feac-e79a-4fac-8d15-f4c84da60a8e" containerName="storage" Feb 26 17:00:00 crc kubenswrapper[5033]: E0226 17:00:00.153460 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" containerName="extract-content" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.153474 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" containerName="extract-content" Feb 26 17:00:00 crc kubenswrapper[5033]: E0226 17:00:00.153495 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" containerName="registry-server" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.153507 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" containerName="registry-server" Feb 26 17:00:00 crc kubenswrapper[5033]: E0226 17:00:00.153527 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerName="extract-content" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.153541 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerName="extract-content" Feb 26 17:00:00 crc kubenswrapper[5033]: E0226 17:00:00.153568 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" containerName="extract-utilities" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.153582 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" containerName="extract-utilities" Feb 26 17:00:00 crc kubenswrapper[5033]: 
E0226 17:00:00.153603 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerName="registry-server" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.153616 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerName="registry-server" Feb 26 17:00:00 crc kubenswrapper[5033]: E0226 17:00:00.153638 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerName="extract-utilities" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.153652 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerName="extract-utilities" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.153898 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81a41b5-975b-4ddc-a914-c472e6c2c67b" containerName="registry-server" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.153942 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f1d6eb-950f-4eae-8357-f77da894a2d8" containerName="registry-server" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.153964 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d76feac-e79a-4fac-8d15-f4c84da60a8e" containerName="storage" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.154690 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535420-hcbwm" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.159590 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.160107 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.160549 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.163662 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535420-hcbwm"] Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.256550 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l"] Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.258548 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.261542 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.265055 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.287068 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dwf4\" (UniqueName: \"kubernetes.io/projected/8ed8b94e-b065-4fd0-add3-23acfc3e9e63-kube-api-access-7dwf4\") pod \"auto-csr-approver-29535420-hcbwm\" (UID: \"8ed8b94e-b065-4fd0-add3-23acfc3e9e63\") " pod="openshift-infra/auto-csr-approver-29535420-hcbwm" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.289435 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l"] Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.389662 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-config-volume\") pod \"collect-profiles-29535420-t8z4l\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.389739 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dwf4\" (UniqueName: \"kubernetes.io/projected/8ed8b94e-b065-4fd0-add3-23acfc3e9e63-kube-api-access-7dwf4\") pod \"auto-csr-approver-29535420-hcbwm\" (UID: \"8ed8b94e-b065-4fd0-add3-23acfc3e9e63\") " pod="openshift-infra/auto-csr-approver-29535420-hcbwm" Feb 26 
17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.389789 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9jt4\" (UniqueName: \"kubernetes.io/projected/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-kube-api-access-z9jt4\") pod \"collect-profiles-29535420-t8z4l\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.389863 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-secret-volume\") pod \"collect-profiles-29535420-t8z4l\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.414421 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dwf4\" (UniqueName: \"kubernetes.io/projected/8ed8b94e-b065-4fd0-add3-23acfc3e9e63-kube-api-access-7dwf4\") pod \"auto-csr-approver-29535420-hcbwm\" (UID: \"8ed8b94e-b065-4fd0-add3-23acfc3e9e63\") " pod="openshift-infra/auto-csr-approver-29535420-hcbwm" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.491323 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-config-volume\") pod \"collect-profiles-29535420-t8z4l\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.491378 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9jt4\" (UniqueName: \"kubernetes.io/projected/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-kube-api-access-z9jt4\") pod 
\"collect-profiles-29535420-t8z4l\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.491413 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-secret-volume\") pod \"collect-profiles-29535420-t8z4l\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.491604 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535420-hcbwm" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.493541 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-config-volume\") pod \"collect-profiles-29535420-t8z4l\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.495412 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-secret-volume\") pod \"collect-profiles-29535420-t8z4l\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.507146 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9jt4\" (UniqueName: \"kubernetes.io/projected/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-kube-api-access-z9jt4\") pod \"collect-profiles-29535420-t8z4l\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.586969 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.897471 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l"] Feb 26 17:00:00 crc kubenswrapper[5033]: I0226 17:00:00.913133 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535420-hcbwm"] Feb 26 17:00:01 crc kubenswrapper[5033]: I0226 17:00:01.777965 5033 generic.go:334] "Generic (PLEG): container finished" podID="2a5ef96b-1bb4-48cd-a0ce-5e4eac167285" containerID="cc67a8a1fb5c45dc1c3cde41214cc762eb2ea7c8e008b4ebfa09fa376fdde52f" exitCode=0 Feb 26 17:00:01 crc kubenswrapper[5033]: I0226 17:00:01.778062 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" event={"ID":"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285","Type":"ContainerDied","Data":"cc67a8a1fb5c45dc1c3cde41214cc762eb2ea7c8e008b4ebfa09fa376fdde52f"} Feb 26 17:00:01 crc kubenswrapper[5033]: I0226 17:00:01.778618 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" event={"ID":"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285","Type":"ContainerStarted","Data":"f947b20dcbc640efc9cf4f33b56e457a34f35687a99c553b5f03c05931e8a1c9"} Feb 26 17:00:01 crc kubenswrapper[5033]: I0226 17:00:01.781140 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535420-hcbwm" event={"ID":"8ed8b94e-b065-4fd0-add3-23acfc3e9e63","Type":"ContainerStarted","Data":"e54ca405be850974aa58341e68e1ff54a3e220651912a9d0804a120ad9f7f956"} Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.074960 5033 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.234080 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9jt4\" (UniqueName: \"kubernetes.io/projected/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-kube-api-access-z9jt4\") pod \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.234349 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-secret-volume\") pod \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.234412 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-config-volume\") pod \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\" (UID: \"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285\") " Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.235231 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a5ef96b-1bb4-48cd-a0ce-5e4eac167285" (UID: "2a5ef96b-1bb4-48cd-a0ce-5e4eac167285"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.239370 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-kube-api-access-z9jt4" (OuterVolumeSpecName: "kube-api-access-z9jt4") pod "2a5ef96b-1bb4-48cd-a0ce-5e4eac167285" (UID: "2a5ef96b-1bb4-48cd-a0ce-5e4eac167285"). InnerVolumeSpecName "kube-api-access-z9jt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.241357 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a5ef96b-1bb4-48cd-a0ce-5e4eac167285" (UID: "2a5ef96b-1bb4-48cd-a0ce-5e4eac167285"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.336428 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.336476 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.336490 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9jt4\" (UniqueName: \"kubernetes.io/projected/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285-kube-api-access-z9jt4\") on node \"crc\" DevicePath \"\"" Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.802694 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" 
event={"ID":"2a5ef96b-1bb4-48cd-a0ce-5e4eac167285","Type":"ContainerDied","Data":"f947b20dcbc640efc9cf4f33b56e457a34f35687a99c553b5f03c05931e8a1c9"} Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.802754 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f947b20dcbc640efc9cf4f33b56e457a34f35687a99c553b5f03c05931e8a1c9" Feb 26 17:00:03 crc kubenswrapper[5033]: I0226 17:00:03.802762 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l" Feb 26 17:00:04 crc kubenswrapper[5033]: I0226 17:00:04.149340 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z"] Feb 26 17:00:04 crc kubenswrapper[5033]: I0226 17:00:04.162133 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535375-txk7z"] Feb 26 17:00:05 crc kubenswrapper[5033]: I0226 17:00:05.651592 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="923c05a2-b537-4cd4-9cc9-1484790d83f7" path="/var/lib/kubelet/pods/923c05a2-b537-4cd4-9cc9-1484790d83f7/volumes" Feb 26 17:00:24 crc kubenswrapper[5033]: I0226 17:00:24.965112 5033 generic.go:334] "Generic (PLEG): container finished" podID="8ed8b94e-b065-4fd0-add3-23acfc3e9e63" containerID="1101471bee3125f12d57880571031d5c16f06608112226021fc8b683d14b42fd" exitCode=0 Feb 26 17:00:24 crc kubenswrapper[5033]: I0226 17:00:24.965225 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535420-hcbwm" event={"ID":"8ed8b94e-b065-4fd0-add3-23acfc3e9e63","Type":"ContainerDied","Data":"1101471bee3125f12d57880571031d5c16f06608112226021fc8b683d14b42fd"} Feb 26 17:00:26 crc kubenswrapper[5033]: I0226 17:00:26.311885 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535420-hcbwm" Feb 26 17:00:26 crc kubenswrapper[5033]: I0226 17:00:26.546412 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dwf4\" (UniqueName: \"kubernetes.io/projected/8ed8b94e-b065-4fd0-add3-23acfc3e9e63-kube-api-access-7dwf4\") pod \"8ed8b94e-b065-4fd0-add3-23acfc3e9e63\" (UID: \"8ed8b94e-b065-4fd0-add3-23acfc3e9e63\") " Feb 26 17:00:26 crc kubenswrapper[5033]: I0226 17:00:26.551863 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed8b94e-b065-4fd0-add3-23acfc3e9e63-kube-api-access-7dwf4" (OuterVolumeSpecName: "kube-api-access-7dwf4") pod "8ed8b94e-b065-4fd0-add3-23acfc3e9e63" (UID: "8ed8b94e-b065-4fd0-add3-23acfc3e9e63"). InnerVolumeSpecName "kube-api-access-7dwf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:00:26 crc kubenswrapper[5033]: I0226 17:00:26.648252 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dwf4\" (UniqueName: \"kubernetes.io/projected/8ed8b94e-b065-4fd0-add3-23acfc3e9e63-kube-api-access-7dwf4\") on node \"crc\" DevicePath \"\"" Feb 26 17:00:26 crc kubenswrapper[5033]: I0226 17:00:26.987603 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535420-hcbwm" event={"ID":"8ed8b94e-b065-4fd0-add3-23acfc3e9e63","Type":"ContainerDied","Data":"e54ca405be850974aa58341e68e1ff54a3e220651912a9d0804a120ad9f7f956"} Feb 26 17:00:26 crc kubenswrapper[5033]: I0226 17:00:26.987660 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e54ca405be850974aa58341e68e1ff54a3e220651912a9d0804a120ad9f7f956" Feb 26 17:00:26 crc kubenswrapper[5033]: I0226 17:00:26.987703 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535420-hcbwm" Feb 26 17:00:27 crc kubenswrapper[5033]: I0226 17:00:27.380712 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535414-rpddx"] Feb 26 17:00:27 crc kubenswrapper[5033]: I0226 17:00:27.387581 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535414-rpddx"] Feb 26 17:00:27 crc kubenswrapper[5033]: I0226 17:00:27.645897 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4755fe0e-1649-43dd-94b1-770898fdfdf4" path="/var/lib/kubelet/pods/4755fe0e-1649-43dd-94b1-770898fdfdf4/volumes" Feb 26 17:00:32 crc kubenswrapper[5033]: I0226 17:00:32.439505 5033 scope.go:117] "RemoveContainer" containerID="01459ab016970271b248fb04be3666dd2a642bc0a098c7a11761d0a06fc49566" Feb 26 17:00:32 crc kubenswrapper[5033]: I0226 17:00:32.504510 5033 scope.go:117] "RemoveContainer" containerID="66a155d78f6eb40ef07e6bc35b266ef32b64c1890b2753d0508c9873e8732b9f" Feb 26 17:01:20 crc kubenswrapper[5033]: I0226 17:01:20.189361 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:01:20 crc kubenswrapper[5033]: I0226 17:01:20.190046 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:01:50 crc kubenswrapper[5033]: I0226 17:01:50.189475 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:01:50 crc kubenswrapper[5033]: I0226 17:01:50.190233 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.162658 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535422-hnwlv"] Feb 26 17:02:00 crc kubenswrapper[5033]: E0226 17:02:00.163475 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed8b94e-b065-4fd0-add3-23acfc3e9e63" containerName="oc" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.163491 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed8b94e-b065-4fd0-add3-23acfc3e9e63" containerName="oc" Feb 26 17:02:00 crc kubenswrapper[5033]: E0226 17:02:00.163503 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5ef96b-1bb4-48cd-a0ce-5e4eac167285" containerName="collect-profiles" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.163512 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5ef96b-1bb4-48cd-a0ce-5e4eac167285" containerName="collect-profiles" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.163692 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a5ef96b-1bb4-48cd-a0ce-5e4eac167285" containerName="collect-profiles" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.163716 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed8b94e-b065-4fd0-add3-23acfc3e9e63" containerName="oc" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.164218 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535422-hnwlv" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.166620 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.166679 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.167183 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.187316 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535422-hnwlv"] Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.344479 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h92d6\" (UniqueName: \"kubernetes.io/projected/2efbc103-b984-4067-9864-7c45a0835260-kube-api-access-h92d6\") pod \"auto-csr-approver-29535422-hnwlv\" (UID: \"2efbc103-b984-4067-9864-7c45a0835260\") " pod="openshift-infra/auto-csr-approver-29535422-hnwlv" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.446583 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h92d6\" (UniqueName: \"kubernetes.io/projected/2efbc103-b984-4067-9864-7c45a0835260-kube-api-access-h92d6\") pod \"auto-csr-approver-29535422-hnwlv\" (UID: \"2efbc103-b984-4067-9864-7c45a0835260\") " pod="openshift-infra/auto-csr-approver-29535422-hnwlv" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.473208 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h92d6\" (UniqueName: \"kubernetes.io/projected/2efbc103-b984-4067-9864-7c45a0835260-kube-api-access-h92d6\") pod \"auto-csr-approver-29535422-hnwlv\" (UID: \"2efbc103-b984-4067-9864-7c45a0835260\") " 
pod="openshift-infra/auto-csr-approver-29535422-hnwlv" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.481870 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535422-hnwlv" Feb 26 17:02:00 crc kubenswrapper[5033]: I0226 17:02:00.961234 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535422-hnwlv"] Feb 26 17:02:00 crc kubenswrapper[5033]: W0226 17:02:00.968716 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2efbc103_b984_4067_9864_7c45a0835260.slice/crio-44cf1cd6cc01a6a4a6ac53ae7b482ab45e0a46580e97a764f617ba62530a7e37 WatchSource:0}: Error finding container 44cf1cd6cc01a6a4a6ac53ae7b482ab45e0a46580e97a764f617ba62530a7e37: Status 404 returned error can't find the container with id 44cf1cd6cc01a6a4a6ac53ae7b482ab45e0a46580e97a764f617ba62530a7e37 Feb 26 17:02:01 crc kubenswrapper[5033]: I0226 17:02:01.797948 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535422-hnwlv" event={"ID":"2efbc103-b984-4067-9864-7c45a0835260","Type":"ContainerStarted","Data":"44cf1cd6cc01a6a4a6ac53ae7b482ab45e0a46580e97a764f617ba62530a7e37"} Feb 26 17:02:03 crc kubenswrapper[5033]: I0226 17:02:03.824464 5033 generic.go:334] "Generic (PLEG): container finished" podID="2efbc103-b984-4067-9864-7c45a0835260" containerID="14629a4fb769d1ccae386c715516c214346e4a5f6506293f302d975c464f0adc" exitCode=0 Feb 26 17:02:03 crc kubenswrapper[5033]: I0226 17:02:03.824533 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535422-hnwlv" event={"ID":"2efbc103-b984-4067-9864-7c45a0835260","Type":"ContainerDied","Data":"14629a4fb769d1ccae386c715516c214346e4a5f6506293f302d975c464f0adc"} Feb 26 17:02:05 crc kubenswrapper[5033]: I0226 17:02:05.152314 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535422-hnwlv" Feb 26 17:02:05 crc kubenswrapper[5033]: I0226 17:02:05.264801 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h92d6\" (UniqueName: \"kubernetes.io/projected/2efbc103-b984-4067-9864-7c45a0835260-kube-api-access-h92d6\") pod \"2efbc103-b984-4067-9864-7c45a0835260\" (UID: \"2efbc103-b984-4067-9864-7c45a0835260\") " Feb 26 17:02:05 crc kubenswrapper[5033]: I0226 17:02:05.275606 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2efbc103-b984-4067-9864-7c45a0835260-kube-api-access-h92d6" (OuterVolumeSpecName: "kube-api-access-h92d6") pod "2efbc103-b984-4067-9864-7c45a0835260" (UID: "2efbc103-b984-4067-9864-7c45a0835260"). InnerVolumeSpecName "kube-api-access-h92d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:02:05 crc kubenswrapper[5033]: I0226 17:02:05.366519 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h92d6\" (UniqueName: \"kubernetes.io/projected/2efbc103-b984-4067-9864-7c45a0835260-kube-api-access-h92d6\") on node \"crc\" DevicePath \"\"" Feb 26 17:02:05 crc kubenswrapper[5033]: I0226 17:02:05.852837 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535422-hnwlv" event={"ID":"2efbc103-b984-4067-9864-7c45a0835260","Type":"ContainerDied","Data":"44cf1cd6cc01a6a4a6ac53ae7b482ab45e0a46580e97a764f617ba62530a7e37"} Feb 26 17:02:05 crc kubenswrapper[5033]: I0226 17:02:05.852973 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535422-hnwlv" Feb 26 17:02:05 crc kubenswrapper[5033]: I0226 17:02:05.853069 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44cf1cd6cc01a6a4a6ac53ae7b482ab45e0a46580e97a764f617ba62530a7e37" Feb 26 17:02:06 crc kubenswrapper[5033]: I0226 17:02:06.257439 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535416-9fs9j"] Feb 26 17:02:06 crc kubenswrapper[5033]: I0226 17:02:06.266758 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535416-9fs9j"] Feb 26 17:02:07 crc kubenswrapper[5033]: I0226 17:02:07.647683 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f3eb14-b53a-4ef0-87c3-644c41acd01e" path="/var/lib/kubelet/pods/87f3eb14-b53a-4ef0-87c3-644c41acd01e/volumes" Feb 26 17:02:20 crc kubenswrapper[5033]: I0226 17:02:20.188846 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:02:20 crc kubenswrapper[5033]: I0226 17:02:20.189555 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:02:20 crc kubenswrapper[5033]: I0226 17:02:20.189619 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 17:02:20 crc kubenswrapper[5033]: I0226 17:02:20.190502 5033 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4f312ff5f757f37f6f0328584c5b77de391b92d8488a888b49a674b4796ab8e"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 17:02:20 crc kubenswrapper[5033]: I0226 17:02:20.190585 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://a4f312ff5f757f37f6f0328584c5b77de391b92d8488a888b49a674b4796ab8e" gracePeriod=600 Feb 26 17:02:20 crc kubenswrapper[5033]: I0226 17:02:20.995056 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="a4f312ff5f757f37f6f0328584c5b77de391b92d8488a888b49a674b4796ab8e" exitCode=0 Feb 26 17:02:20 crc kubenswrapper[5033]: I0226 17:02:20.995130 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"a4f312ff5f757f37f6f0328584c5b77de391b92d8488a888b49a674b4796ab8e"} Feb 26 17:02:20 crc kubenswrapper[5033]: I0226 17:02:20.995660 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"} Feb 26 17:02:20 crc kubenswrapper[5033]: I0226 17:02:20.995688 5033 scope.go:117] "RemoveContainer" containerID="a16e4849f040f7f24e186613cc865cef3f4058bad6e9fbf44e80db2cca049f35" Feb 26 17:02:32 crc kubenswrapper[5033]: I0226 17:02:32.661749 5033 scope.go:117] "RemoveContainer" containerID="bb2d1386ae9a53e5ff134a2af183ebce1a0c6777fda4e81eb3504ed4ad19ce3e" Feb 26 
17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.904590 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-xn29p"] Feb 26 17:02:39 crc kubenswrapper[5033]: E0226 17:02:39.905287 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efbc103-b984-4067-9864-7c45a0835260" containerName="oc" Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.905299 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efbc103-b984-4067-9864-7c45a0835260" containerName="oc" Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.905421 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efbc103-b984-4067-9864-7c45a0835260" containerName="oc" Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.906082 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.910159 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.910206 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.910173 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.911966 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.911969 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-v8zwb" Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.913132 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-xn29p"] Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.995035 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-config\") pod \"dnsmasq-dns-5d7b5456f5-xn29p\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.995234 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2w2l\" (UniqueName: \"kubernetes.io/projected/53790ba8-ee07-4a93-9737-eccba5ec6061-kube-api-access-v2w2l\") pod \"dnsmasq-dns-5d7b5456f5-xn29p\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:39 crc kubenswrapper[5033]: I0226 17:02:39.995326 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-xn29p\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.096367 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-config\") pod \"dnsmasq-dns-5d7b5456f5-xn29p\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.096437 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2w2l\" (UniqueName: \"kubernetes.io/projected/53790ba8-ee07-4a93-9737-eccba5ec6061-kube-api-access-v2w2l\") pod \"dnsmasq-dns-5d7b5456f5-xn29p\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.096471 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-xn29p\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.097361 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-xn29p\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.097395 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-config\") pod \"dnsmasq-dns-5d7b5456f5-xn29p\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.125246 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2w2l\" (UniqueName: \"kubernetes.io/projected/53790ba8-ee07-4a93-9737-eccba5ec6061-kube-api-access-v2w2l\") pod \"dnsmasq-dns-5d7b5456f5-xn29p\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.167957 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-bg7hg"] Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.169348 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.199569 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-bg7hg"] Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.224670 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.299199 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-bg7hg\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") " pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.299311 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-config\") pod \"dnsmasq-dns-98ddfc8f-bg7hg\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") " pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.299350 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2b42\" (UniqueName: \"kubernetes.io/projected/ae6528f1-15da-49f3-a1cd-6170da742302-kube-api-access-j2b42\") pod \"dnsmasq-dns-98ddfc8f-bg7hg\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") " pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.407981 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-bg7hg\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") " pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:40 
crc kubenswrapper[5033]: I0226 17:02:40.408032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-config\") pod \"dnsmasq-dns-98ddfc8f-bg7hg\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") " pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.408057 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2b42\" (UniqueName: \"kubernetes.io/projected/ae6528f1-15da-49f3-a1cd-6170da742302-kube-api-access-j2b42\") pod \"dnsmasq-dns-98ddfc8f-bg7hg\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") " pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.409575 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-bg7hg\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") " pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.410078 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-config\") pod \"dnsmasq-dns-98ddfc8f-bg7hg\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") " pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.579863 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2b42\" (UniqueName: \"kubernetes.io/projected/ae6528f1-15da-49f3-a1cd-6170da742302-kube-api-access-j2b42\") pod \"dnsmasq-dns-98ddfc8f-bg7hg\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") " pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.736540 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-5d7b5456f5-xn29p"] Feb 26 17:02:40 crc kubenswrapper[5033]: W0226 17:02:40.753592 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53790ba8_ee07_4a93_9737_eccba5ec6061.slice/crio-2916270972e7439038fc7d30b27092d3264147e718cf6a71855379a3f546f704 WatchSource:0}: Error finding container 2916270972e7439038fc7d30b27092d3264147e718cf6a71855379a3f546f704: Status 404 returned error can't find the container with id 2916270972e7439038fc7d30b27092d3264147e718cf6a71855379a3f546f704 Feb 26 17:02:40 crc kubenswrapper[5033]: I0226 17:02:40.788905 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.026738 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.028143 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.047307 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.047698 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.047916 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.048030 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.048522 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-65xpw" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.051462 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.119093 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.119160 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbpw\" (UniqueName: \"kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-kube-api-access-snbpw\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.119186 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.119242 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.119292 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.119322 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.119348 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.119365 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.119488 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.157781 5033 generic.go:334] "Generic (PLEG): container finished" podID="53790ba8-ee07-4a93-9737-eccba5ec6061" containerID="63945c0d2a3b62f063ffaf75557b12b009b4b2f1a4db1a7c11d53d89d66be38e" exitCode=0 Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.157848 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" event={"ID":"53790ba8-ee07-4a93-9737-eccba5ec6061","Type":"ContainerDied","Data":"63945c0d2a3b62f063ffaf75557b12b009b4b2f1a4db1a7c11d53d89d66be38e"} Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.157878 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" event={"ID":"53790ba8-ee07-4a93-9737-eccba5ec6061","Type":"ContainerStarted","Data":"2916270972e7439038fc7d30b27092d3264147e718cf6a71855379a3f546f704"} Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.211160 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.212552 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.215384 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.215418 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fftvv" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.221423 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.221456 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.221498 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.221548 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.221577 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-snbpw\" (UniqueName: \"kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-kube-api-access-snbpw\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.221607 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.221635 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.221676 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.221702 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.222988 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.225665 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.225830 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.227326 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.227618 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.228849 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 
17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.230524 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.230573 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/da68696fe89735bd81c8ec884c6f1b5859609de9dd0b0282954bef495b7b005c/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.232779 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.260939 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbpw\" (UniqueName: \"kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-kube-api-access-snbpw\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.266295 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.310981 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-bg7hg"] Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.313379 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\") pod \"rabbitmq-server-0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.323396 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7bcb013b-b51b-4de2-8fca-d83453bd4feb-config-data\") pod \"memcached-0\" (UID: \"7bcb013b-b51b-4de2-8fca-d83453bd4feb\") " pod="openstack/memcached-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.323458 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7bcb013b-b51b-4de2-8fca-d83453bd4feb-kolla-config\") pod \"memcached-0\" (UID: \"7bcb013b-b51b-4de2-8fca-d83453bd4feb\") " pod="openstack/memcached-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.323535 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmnzs\" (UniqueName: \"kubernetes.io/projected/7bcb013b-b51b-4de2-8fca-d83453bd4feb-kube-api-access-qmnzs\") pod \"memcached-0\" (UID: \"7bcb013b-b51b-4de2-8fca-d83453bd4feb\") " pod="openstack/memcached-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.329178 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.333483 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.336339 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.336463 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.336510 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.336350 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.336416 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-42gh2" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.344094 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.409676 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.424454 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7bcb013b-b51b-4de2-8fca-d83453bd4feb-kolla-config\") pod \"memcached-0\" (UID: \"7bcb013b-b51b-4de2-8fca-d83453bd4feb\") " pod="openstack/memcached-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.424521 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.424552 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.424588 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-589c2\" (UniqueName: \"kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-kube-api-access-589c2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.424855 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.424973 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-26578078-8825-4f0e-9145-bb27738225c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.425026 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e635c835-6242-42f8-8334-beb5c4c60c98-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.425110 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmnzs\" (UniqueName: \"kubernetes.io/projected/7bcb013b-b51b-4de2-8fca-d83453bd4feb-kube-api-access-qmnzs\") pod \"memcached-0\" (UID: \"7bcb013b-b51b-4de2-8fca-d83453bd4feb\") " pod="openstack/memcached-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.425189 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.425208 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7bcb013b-b51b-4de2-8fca-d83453bd4feb-kolla-config\") pod \"memcached-0\" (UID: \"7bcb013b-b51b-4de2-8fca-d83453bd4feb\") " 
pod="openstack/memcached-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.425243 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7bcb013b-b51b-4de2-8fca-d83453bd4feb-config-data\") pod \"memcached-0\" (UID: \"7bcb013b-b51b-4de2-8fca-d83453bd4feb\") " pod="openstack/memcached-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.425310 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e635c835-6242-42f8-8334-beb5c4c60c98-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.425346 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.426062 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7bcb013b-b51b-4de2-8fca-d83453bd4feb-config-data\") pod \"memcached-0\" (UID: \"7bcb013b-b51b-4de2-8fca-d83453bd4feb\") " pod="openstack/memcached-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.442491 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmnzs\" (UniqueName: \"kubernetes.io/projected/7bcb013b-b51b-4de2-8fca-d83453bd4feb-kube-api-access-qmnzs\") pod \"memcached-0\" (UID: \"7bcb013b-b51b-4de2-8fca-d83453bd4feb\") " pod="openstack/memcached-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.526516 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.526557 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.526602 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-589c2\" (UniqueName: \"kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-kube-api-access-589c2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.526634 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.526658 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-26578078-8825-4f0e-9145-bb27738225c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.526675 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e635c835-6242-42f8-8334-beb5c4c60c98-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.526706 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.526729 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e635c835-6242-42f8-8334-beb5c4c60c98-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.526745 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.527507 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.527807 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.528012 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.528113 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.533059 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.533119 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.533162 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-26578078-8825-4f0e-9145-bb27738225c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d8409dfabed18f449aa186f8cff34535814ae624bb3ea725c5c133efd1f1f9ee/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.534838 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e635c835-6242-42f8-8334-beb5c4c60c98-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.535788 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e635c835-6242-42f8-8334-beb5c4c60c98-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.548582 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-589c2\" (UniqueName: \"kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-kube-api-access-589c2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.572077 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-26578078-8825-4f0e-9145-bb27738225c3\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3\") pod \"rabbitmq-cell1-server-0\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.621421 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.656428 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.765944 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.767588 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.771000 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.772024 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.778775 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.778863 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gfdmt" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.782927 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.803773 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.852218 5033 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.855170 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd8gj\" (UniqueName: \"kubernetes.io/projected/60fff19f-5516-4ccc-b6b1-a4c50d99696e-kube-api-access-dd8gj\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.855225 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/60fff19f-5516-4ccc-b6b1-a4c50d99696e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.855285 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8aea047e-7618-4d16-94cc-025772f2f7e9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aea047e-7618-4d16-94cc-025772f2f7e9\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.855339 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60fff19f-5516-4ccc-b6b1-a4c50d99696e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.855424 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fff19f-5516-4ccc-b6b1-a4c50d99696e-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.855460 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/60fff19f-5516-4ccc-b6b1-a4c50d99696e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.855507 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/60fff19f-5516-4ccc-b6b1-a4c50d99696e-config-data-default\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.855614 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60fff19f-5516-4ccc-b6b1-a4c50d99696e-kolla-config\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.958146 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60fff19f-5516-4ccc-b6b1-a4c50d99696e-kolla-config\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.958210 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd8gj\" (UniqueName: \"kubernetes.io/projected/60fff19f-5516-4ccc-b6b1-a4c50d99696e-kube-api-access-dd8gj\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " 
pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.958233 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/60fff19f-5516-4ccc-b6b1-a4c50d99696e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.958264 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8aea047e-7618-4d16-94cc-025772f2f7e9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aea047e-7618-4d16-94cc-025772f2f7e9\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.958290 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60fff19f-5516-4ccc-b6b1-a4c50d99696e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.958336 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fff19f-5516-4ccc-b6b1-a4c50d99696e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.958358 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/60fff19f-5516-4ccc-b6b1-a4c50d99696e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 
17:02:41.958377 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/60fff19f-5516-4ccc-b6b1-a4c50d99696e-config-data-default\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.959175 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/60fff19f-5516-4ccc-b6b1-a4c50d99696e-config-data-default\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.959631 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60fff19f-5516-4ccc-b6b1-a4c50d99696e-kolla-config\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.960738 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/60fff19f-5516-4ccc-b6b1-a4c50d99696e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.961473 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fff19f-5516-4ccc-b6b1-a4c50d99696e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.961949 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.961985 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8aea047e-7618-4d16-94cc-025772f2f7e9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aea047e-7618-4d16-94cc-025772f2f7e9\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c8ed3ebef35cb66a875773e98e456ba0c5c2d659ba1b8f67aa7f78e15de1a8a2/globalmount\"" pod="openstack/openstack-galera-0" Feb 26 17:02:41 crc kubenswrapper[5033]: I0226 17:02:41.963033 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.072063 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/60fff19f-5516-4ccc-b6b1-a4c50d99696e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.072187 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60fff19f-5516-4ccc-b6b1-a4c50d99696e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.072433 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd8gj\" (UniqueName: \"kubernetes.io/projected/60fff19f-5516-4ccc-b6b1-a4c50d99696e-kube-api-access-dd8gj\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: W0226 17:02:42.077829 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode635c835_6242_42f8_8334_beb5c4c60c98.slice/crio-08a4df109b7dfa086c010faffe7904f66a45284a92f1938a14a449b512152593 WatchSource:0}: Error finding container 08a4df109b7dfa086c010faffe7904f66a45284a92f1938a14a449b512152593: Status 404 returned error can't find the container with id 08a4df109b7dfa086c010faffe7904f66a45284a92f1938a14a449b512152593 Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.111826 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 26 17:02:42 crc kubenswrapper[5033]: W0226 17:02:42.119430 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bcb013b_b51b_4de2_8fca_d83453bd4feb.slice/crio-8e02c0a40dcf353432c8b1f5bf5d0c736a04001817624150b5a40fa6a78d0991 WatchSource:0}: Error finding container 8e02c0a40dcf353432c8b1f5bf5d0c736a04001817624150b5a40fa6a78d0991: Status 404 returned error can't find the container with id 8e02c0a40dcf353432c8b1f5bf5d0c736a04001817624150b5a40fa6a78d0991 Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.168767 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7bcb013b-b51b-4de2-8fca-d83453bd4feb","Type":"ContainerStarted","Data":"8e02c0a40dcf353432c8b1f5bf5d0c736a04001817624150b5a40fa6a78d0991"} Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.170065 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e635c835-6242-42f8-8334-beb5c4c60c98","Type":"ContainerStarted","Data":"08a4df109b7dfa086c010faffe7904f66a45284a92f1938a14a449b512152593"} Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.174490 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0","Type":"ContainerStarted","Data":"42f590217595287a5fc8776cc10cebcdc1a60182709397c5a1da250b19536576"} Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.178337 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" event={"ID":"53790ba8-ee07-4a93-9737-eccba5ec6061","Type":"ContainerStarted","Data":"8a5156a626548ecedcc6c9764d830d7e71f8020e88232a2cc4fd9ad4096e8e7b"} Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.178416 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.181090 5033 generic.go:334] "Generic (PLEG): container finished" podID="ae6528f1-15da-49f3-a1cd-6170da742302" containerID="2093df78c798c86a3b6444e237a5b93073620b043c747856b60c0960a63725e9" exitCode=0 Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.181130 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" event={"ID":"ae6528f1-15da-49f3-a1cd-6170da742302","Type":"ContainerDied","Data":"2093df78c798c86a3b6444e237a5b93073620b043c747856b60c0960a63725e9"} Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.181149 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" event={"ID":"ae6528f1-15da-49f3-a1cd-6170da742302","Type":"ContainerStarted","Data":"281f3e1c3977b8d6cd49f7375ddbb7bec06c9c0d52d78a1f419e55ffb2019f67"} Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.206820 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" podStartSLOduration=3.20679726 podStartE2EDuration="3.20679726s" podCreationTimestamp="2026-02-26 17:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:02:42.203690795 +0000 UTC 
m=+4889.416289455" watchObservedRunningTime="2026-02-26 17:02:42.20679726 +0000 UTC m=+4889.419395920" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.223080 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8aea047e-7618-4d16-94cc-025772f2f7e9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8aea047e-7618-4d16-94cc-025772f2f7e9\") pod \"openstack-galera-0\" (UID: \"60fff19f-5516-4ccc-b6b1-a4c50d99696e\") " pod="openstack/openstack-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.273000 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.274197 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.277244 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.277785 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-z4cbx" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.278017 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.287351 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.287879 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.364029 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a20a94a0-9589-4c0e-9305-72e94939e526-config-data-generated\") 
pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.364107 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20a94a0-9589-4c0e-9305-72e94939e526-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.364139 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-68fbc0be-f96b-4655-818b-92261b43dc8a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68fbc0be-f96b-4655-818b-92261b43dc8a\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.364157 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crpgn\" (UniqueName: \"kubernetes.io/projected/a20a94a0-9589-4c0e-9305-72e94939e526-kube-api-access-crpgn\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.364191 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a20a94a0-9589-4c0e-9305-72e94939e526-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.364372 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/a20a94a0-9589-4c0e-9305-72e94939e526-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.364403 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a20a94a0-9589-4c0e-9305-72e94939e526-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.364421 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20a94a0-9589-4c0e-9305-72e94939e526-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.398040 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.465615 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a20a94a0-9589-4c0e-9305-72e94939e526-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.465701 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20a94a0-9589-4c0e-9305-72e94939e526-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.465728 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-68fbc0be-f96b-4655-818b-92261b43dc8a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68fbc0be-f96b-4655-818b-92261b43dc8a\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.465745 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crpgn\" (UniqueName: \"kubernetes.io/projected/a20a94a0-9589-4c0e-9305-72e94939e526-kube-api-access-crpgn\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.465780 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a20a94a0-9589-4c0e-9305-72e94939e526-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " 
pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.465806 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a20a94a0-9589-4c0e-9305-72e94939e526-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.465828 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a20a94a0-9589-4c0e-9305-72e94939e526-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.465847 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20a94a0-9589-4c0e-9305-72e94939e526-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.466793 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a20a94a0-9589-4c0e-9305-72e94939e526-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.467504 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a20a94a0-9589-4c0e-9305-72e94939e526-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: 
I0226 17:02:42.468021 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a20a94a0-9589-4c0e-9305-72e94939e526-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.468324 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a20a94a0-9589-4c0e-9305-72e94939e526-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.468733 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.468768 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-68fbc0be-f96b-4655-818b-92261b43dc8a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68fbc0be-f96b-4655-818b-92261b43dc8a\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0c41622b580e0a9f0e92e86cac900f46b77014d1a4cdbcad932027fe37486af1/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.471629 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20a94a0-9589-4c0e-9305-72e94939e526-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.474687 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20a94a0-9589-4c0e-9305-72e94939e526-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.483334 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crpgn\" (UniqueName: \"kubernetes.io/projected/a20a94a0-9589-4c0e-9305-72e94939e526-kube-api-access-crpgn\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.706301 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-68fbc0be-f96b-4655-818b-92261b43dc8a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-68fbc0be-f96b-4655-818b-92261b43dc8a\") pod \"openstack-cell1-galera-0\" (UID: \"a20a94a0-9589-4c0e-9305-72e94939e526\") " pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.922739 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 17:02:42 crc kubenswrapper[5033]: W0226 17:02:42.926381 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60fff19f_5516_4ccc_b6b1_a4c50d99696e.slice/crio-faec49c5efd2f1ca96bfa02024a7efe5643e043948f093bca906e22d6fd7b51d WatchSource:0}: Error finding container faec49c5efd2f1ca96bfa02024a7efe5643e043948f093bca906e22d6fd7b51d: Status 404 returned error can't find the container with id faec49c5efd2f1ca96bfa02024a7efe5643e043948f093bca906e22d6fd7b51d Feb 26 17:02:42 crc kubenswrapper[5033]: I0226 17:02:42.972146 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 26 17:02:43 crc kubenswrapper[5033]: I0226 17:02:43.192205 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" event={"ID":"ae6528f1-15da-49f3-a1cd-6170da742302","Type":"ContainerStarted","Data":"46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f"} Feb 26 17:02:43 crc kubenswrapper[5033]: I0226 17:02:43.192663 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:43 crc kubenswrapper[5033]: I0226 17:02:43.194116 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"60fff19f-5516-4ccc-b6b1-a4c50d99696e","Type":"ContainerStarted","Data":"e779cbd6138c6190c06b98ec61e0f6a056c6cff1d8670439cc7d81c34244e0fa"} Feb 26 17:02:43 crc kubenswrapper[5033]: I0226 17:02:43.194159 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"60fff19f-5516-4ccc-b6b1-a4c50d99696e","Type":"ContainerStarted","Data":"faec49c5efd2f1ca96bfa02024a7efe5643e043948f093bca906e22d6fd7b51d"} Feb 26 17:02:43 crc kubenswrapper[5033]: I0226 17:02:43.195860 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7bcb013b-b51b-4de2-8fca-d83453bd4feb","Type":"ContainerStarted","Data":"4cb7ba767c47ac55ddf0699a64da91034b75bd748d74b30fa9916b1fd559fcc2"} Feb 26 17:02:43 crc kubenswrapper[5033]: I0226 17:02:43.196143 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 26 17:02:43 crc kubenswrapper[5033]: I0226 17:02:43.208418 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" podStartSLOduration=3.20839512 podStartE2EDuration="3.20839512s" podCreationTimestamp="2026-02-26 17:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:02:43.207468898 +0000 UTC m=+4890.420067558" watchObservedRunningTime="2026-02-26 17:02:43.20839512 +0000 UTC m=+4890.420993770" Feb 26 17:02:43 crc kubenswrapper[5033]: I0226 17:02:43.236664 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.236641546 podStartE2EDuration="2.236641546s" podCreationTimestamp="2026-02-26 17:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:02:43.231410459 +0000 UTC m=+4890.444009109" watchObservedRunningTime="2026-02-26 17:02:43.236641546 +0000 UTC m=+4890.449240216" Feb 26 17:02:43 crc kubenswrapper[5033]: I0226 17:02:43.381882 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 17:02:44 crc kubenswrapper[5033]: I0226 17:02:44.205297 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a20a94a0-9589-4c0e-9305-72e94939e526","Type":"ContainerStarted","Data":"fe91b315322c24e5c3428494bb4a7d60ac529529e8e79f5f99abb1e2be9d09d7"} Feb 26 17:02:44 crc kubenswrapper[5033]: I0226 17:02:44.205609 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a20a94a0-9589-4c0e-9305-72e94939e526","Type":"ContainerStarted","Data":"cf444e111ffbba8029adca3cc5e9a9f1f1a8630fd249fd653fa9458c4accf528"} Feb 26 17:02:44 crc kubenswrapper[5033]: I0226 17:02:44.207217 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e635c835-6242-42f8-8334-beb5c4c60c98","Type":"ContainerStarted","Data":"ce09e21bb0a9375f3976f333cecd75ad522e0a113a021bad3e2fcce062f5502a"} Feb 26 17:02:44 crc kubenswrapper[5033]: I0226 17:02:44.208997 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0","Type":"ContainerStarted","Data":"bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa"} Feb 26 17:02:50 crc kubenswrapper[5033]: I0226 17:02:50.226505 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:50 crc kubenswrapper[5033]: I0226 17:02:50.790417 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" Feb 26 17:02:50 crc kubenswrapper[5033]: I0226 17:02:50.851031 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-xn29p"] Feb 26 17:02:50 crc kubenswrapper[5033]: I0226 17:02:50.854234 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" podUID="53790ba8-ee07-4a93-9737-eccba5ec6061" containerName="dnsmasq-dns" containerID="cri-o://8a5156a626548ecedcc6c9764d830d7e71f8020e88232a2cc4fd9ad4096e8e7b" gracePeriod=10 Feb 26 17:02:51 crc kubenswrapper[5033]: I0226 17:02:51.275044 5033 generic.go:334] "Generic (PLEG): container finished" podID="53790ba8-ee07-4a93-9737-eccba5ec6061" containerID="8a5156a626548ecedcc6c9764d830d7e71f8020e88232a2cc4fd9ad4096e8e7b" exitCode=0 Feb 26 17:02:51 crc kubenswrapper[5033]: I0226 17:02:51.275087 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" event={"ID":"53790ba8-ee07-4a93-9737-eccba5ec6061","Type":"ContainerDied","Data":"8a5156a626548ecedcc6c9764d830d7e71f8020e88232a2cc4fd9ad4096e8e7b"} Feb 26 17:02:51 crc kubenswrapper[5033]: I0226 17:02:51.622916 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.252592 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.289632 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-config\") pod \"53790ba8-ee07-4a93-9737-eccba5ec6061\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.289689 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2w2l\" (UniqueName: \"kubernetes.io/projected/53790ba8-ee07-4a93-9737-eccba5ec6061-kube-api-access-v2w2l\") pod \"53790ba8-ee07-4a93-9737-eccba5ec6061\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.289717 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-dns-svc\") pod \"53790ba8-ee07-4a93-9737-eccba5ec6061\" (UID: \"53790ba8-ee07-4a93-9737-eccba5ec6061\") " Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.289721 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" event={"ID":"53790ba8-ee07-4a93-9737-eccba5ec6061","Type":"ContainerDied","Data":"2916270972e7439038fc7d30b27092d3264147e718cf6a71855379a3f546f704"} Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.289780 5033 scope.go:117] "RemoveContainer" containerID="8a5156a626548ecedcc6c9764d830d7e71f8020e88232a2cc4fd9ad4096e8e7b" Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.289940 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-xn29p" Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.302886 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53790ba8-ee07-4a93-9737-eccba5ec6061-kube-api-access-v2w2l" (OuterVolumeSpecName: "kube-api-access-v2w2l") pod "53790ba8-ee07-4a93-9737-eccba5ec6061" (UID: "53790ba8-ee07-4a93-9737-eccba5ec6061"). InnerVolumeSpecName "kube-api-access-v2w2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.343593 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-config" (OuterVolumeSpecName: "config") pod "53790ba8-ee07-4a93-9737-eccba5ec6061" (UID: "53790ba8-ee07-4a93-9737-eccba5ec6061"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.345672 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53790ba8-ee07-4a93-9737-eccba5ec6061" (UID: "53790ba8-ee07-4a93-9737-eccba5ec6061"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.391001 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.391042 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2w2l\" (UniqueName: \"kubernetes.io/projected/53790ba8-ee07-4a93-9737-eccba5ec6061-kube-api-access-v2w2l\") on node \"crc\" DevicePath \"\"" Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.391053 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53790ba8-ee07-4a93-9737-eccba5ec6061-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.392210 5033 scope.go:117] "RemoveContainer" containerID="63945c0d2a3b62f063ffaf75557b12b009b4b2f1a4db1a7c11d53d89d66be38e" Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.627932 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-xn29p"] Feb 26 17:02:52 crc kubenswrapper[5033]: I0226 17:02:52.636285 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-xn29p"] Feb 26 17:02:53 crc kubenswrapper[5033]: I0226 17:02:53.307115 5033 generic.go:334] "Generic (PLEG): container finished" podID="a20a94a0-9589-4c0e-9305-72e94939e526" containerID="fe91b315322c24e5c3428494bb4a7d60ac529529e8e79f5f99abb1e2be9d09d7" exitCode=0 Feb 26 17:02:53 crc kubenswrapper[5033]: I0226 17:02:53.307280 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a20a94a0-9589-4c0e-9305-72e94939e526","Type":"ContainerDied","Data":"fe91b315322c24e5c3428494bb4a7d60ac529529e8e79f5f99abb1e2be9d09d7"} Feb 26 17:02:53 crc kubenswrapper[5033]: I0226 17:02:53.317966 5033 generic.go:334] 
"Generic (PLEG): container finished" podID="60fff19f-5516-4ccc-b6b1-a4c50d99696e" containerID="e779cbd6138c6190c06b98ec61e0f6a056c6cff1d8670439cc7d81c34244e0fa" exitCode=0 Feb 26 17:02:53 crc kubenswrapper[5033]: I0226 17:02:53.318014 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"60fff19f-5516-4ccc-b6b1-a4c50d99696e","Type":"ContainerDied","Data":"e779cbd6138c6190c06b98ec61e0f6a056c6cff1d8670439cc7d81c34244e0fa"} Feb 26 17:02:53 crc kubenswrapper[5033]: I0226 17:02:53.640079 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53790ba8-ee07-4a93-9737-eccba5ec6061" path="/var/lib/kubelet/pods/53790ba8-ee07-4a93-9737-eccba5ec6061/volumes" Feb 26 17:02:54 crc kubenswrapper[5033]: I0226 17:02:54.327794 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"60fff19f-5516-4ccc-b6b1-a4c50d99696e","Type":"ContainerStarted","Data":"1f6bb34deb62f5e7ed517e954190c69a7d20497e7f56b115c39160972d8c9b18"} Feb 26 17:02:54 crc kubenswrapper[5033]: I0226 17:02:54.332337 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a20a94a0-9589-4c0e-9305-72e94939e526","Type":"ContainerStarted","Data":"13426ebf9ed210747b06ac8e64d3ded7443937e0c58ebf9c3864b1f6824441d2"} Feb 26 17:02:54 crc kubenswrapper[5033]: I0226 17:02:54.363448 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=14.363323031 podStartE2EDuration="14.363323031s" podCreationTimestamp="2026-02-26 17:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:02:54.356847134 +0000 UTC m=+4901.569445794" watchObservedRunningTime="2026-02-26 17:02:54.363323031 +0000 UTC m=+4901.575921691" Feb 26 17:02:54 crc kubenswrapper[5033]: I0226 17:02:54.398925 5033 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=13.398888294 podStartE2EDuration="13.398888294s" podCreationTimestamp="2026-02-26 17:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:02:54.389772844 +0000 UTC m=+4901.602371504" watchObservedRunningTime="2026-02-26 17:02:54.398888294 +0000 UTC m=+4901.611486984" Feb 26 17:03:02 crc kubenswrapper[5033]: I0226 17:03:02.398744 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 26 17:03:02 crc kubenswrapper[5033]: I0226 17:03:02.399400 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 26 17:03:02 crc kubenswrapper[5033]: I0226 17:03:02.559143 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 26 17:03:02 crc kubenswrapper[5033]: I0226 17:03:02.972405 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 26 17:03:02 crc kubenswrapper[5033]: I0226 17:03:02.972473 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 26 17:03:03 crc kubenswrapper[5033]: I0226 17:03:03.418854 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 26 17:03:03 crc kubenswrapper[5033]: I0226 17:03:03.512719 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 26 17:03:03 crc kubenswrapper[5033]: I0226 17:03:03.525687 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.770986 5033 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/root-account-create-update-27qnb"] Feb 26 17:03:10 crc kubenswrapper[5033]: E0226 17:03:10.771911 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53790ba8-ee07-4a93-9737-eccba5ec6061" containerName="init" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.771930 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="53790ba8-ee07-4a93-9737-eccba5ec6061" containerName="init" Feb 26 17:03:10 crc kubenswrapper[5033]: E0226 17:03:10.771967 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53790ba8-ee07-4a93-9737-eccba5ec6061" containerName="dnsmasq-dns" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.771974 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="53790ba8-ee07-4a93-9737-eccba5ec6061" containerName="dnsmasq-dns" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.772148 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="53790ba8-ee07-4a93-9737-eccba5ec6061" containerName="dnsmasq-dns" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.772718 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-27qnb"] Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.772836 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-27qnb" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.796747 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.807027 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljxk7\" (UniqueName: \"kubernetes.io/projected/ce0e0d31-3565-4112-b28c-939bba4533d5-kube-api-access-ljxk7\") pod \"root-account-create-update-27qnb\" (UID: \"ce0e0d31-3565-4112-b28c-939bba4533d5\") " pod="openstack/root-account-create-update-27qnb" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.807220 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce0e0d31-3565-4112-b28c-939bba4533d5-operator-scripts\") pod \"root-account-create-update-27qnb\" (UID: \"ce0e0d31-3565-4112-b28c-939bba4533d5\") " pod="openstack/root-account-create-update-27qnb" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.908335 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce0e0d31-3565-4112-b28c-939bba4533d5-operator-scripts\") pod \"root-account-create-update-27qnb\" (UID: \"ce0e0d31-3565-4112-b28c-939bba4533d5\") " pod="openstack/root-account-create-update-27qnb" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.908468 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljxk7\" (UniqueName: \"kubernetes.io/projected/ce0e0d31-3565-4112-b28c-939bba4533d5-kube-api-access-ljxk7\") pod \"root-account-create-update-27qnb\" (UID: \"ce0e0d31-3565-4112-b28c-939bba4533d5\") " pod="openstack/root-account-create-update-27qnb" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.909503 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce0e0d31-3565-4112-b28c-939bba4533d5-operator-scripts\") pod \"root-account-create-update-27qnb\" (UID: \"ce0e0d31-3565-4112-b28c-939bba4533d5\") " pod="openstack/root-account-create-update-27qnb" Feb 26 17:03:10 crc kubenswrapper[5033]: I0226 17:03:10.932121 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljxk7\" (UniqueName: \"kubernetes.io/projected/ce0e0d31-3565-4112-b28c-939bba4533d5-kube-api-access-ljxk7\") pod \"root-account-create-update-27qnb\" (UID: \"ce0e0d31-3565-4112-b28c-939bba4533d5\") " pod="openstack/root-account-create-update-27qnb" Feb 26 17:03:11 crc kubenswrapper[5033]: I0226 17:03:11.123494 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-27qnb" Feb 26 17:03:11 crc kubenswrapper[5033]: I0226 17:03:11.617335 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-27qnb"] Feb 26 17:03:11 crc kubenswrapper[5033]: W0226 17:03:11.619508 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce0e0d31_3565_4112_b28c_939bba4533d5.slice/crio-3b543d5f6da31d6f6d1662ab43399b685b385e89ba5b5f93b89c5059ffb863a5 WatchSource:0}: Error finding container 3b543d5f6da31d6f6d1662ab43399b685b385e89ba5b5f93b89c5059ffb863a5: Status 404 returned error can't find the container with id 3b543d5f6da31d6f6d1662ab43399b685b385e89ba5b5f93b89c5059ffb863a5 Feb 26 17:03:12 crc kubenswrapper[5033]: I0226 17:03:12.513918 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-27qnb" event={"ID":"ce0e0d31-3565-4112-b28c-939bba4533d5","Type":"ContainerStarted","Data":"80c14f1f25bbe34f46b6243330b063304f0df8c4565f56cda801289e372c2018"} Feb 26 17:03:12 crc kubenswrapper[5033]: I0226 
17:03:12.514304 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-27qnb" event={"ID":"ce0e0d31-3565-4112-b28c-939bba4533d5","Type":"ContainerStarted","Data":"3b543d5f6da31d6f6d1662ab43399b685b385e89ba5b5f93b89c5059ffb863a5"} Feb 26 17:03:12 crc kubenswrapper[5033]: I0226 17:03:12.547608 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-27qnb" podStartSLOduration=2.547583646 podStartE2EDuration="2.547583646s" podCreationTimestamp="2026-02-26 17:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:03:12.536542418 +0000 UTC m=+4919.749141108" watchObservedRunningTime="2026-02-26 17:03:12.547583646 +0000 UTC m=+4919.760182326" Feb 26 17:03:13 crc kubenswrapper[5033]: I0226 17:03:13.527046 5033 generic.go:334] "Generic (PLEG): container finished" podID="ce0e0d31-3565-4112-b28c-939bba4533d5" containerID="80c14f1f25bbe34f46b6243330b063304f0df8c4565f56cda801289e372c2018" exitCode=0 Feb 26 17:03:13 crc kubenswrapper[5033]: I0226 17:03:13.527170 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-27qnb" event={"ID":"ce0e0d31-3565-4112-b28c-939bba4533d5","Type":"ContainerDied","Data":"80c14f1f25bbe34f46b6243330b063304f0df8c4565f56cda801289e372c2018"} Feb 26 17:03:14 crc kubenswrapper[5033]: I0226 17:03:14.813170 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-27qnb" Feb 26 17:03:14 crc kubenswrapper[5033]: I0226 17:03:14.970618 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce0e0d31-3565-4112-b28c-939bba4533d5-operator-scripts\") pod \"ce0e0d31-3565-4112-b28c-939bba4533d5\" (UID: \"ce0e0d31-3565-4112-b28c-939bba4533d5\") " Feb 26 17:03:14 crc kubenswrapper[5033]: I0226 17:03:14.970748 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljxk7\" (UniqueName: \"kubernetes.io/projected/ce0e0d31-3565-4112-b28c-939bba4533d5-kube-api-access-ljxk7\") pod \"ce0e0d31-3565-4112-b28c-939bba4533d5\" (UID: \"ce0e0d31-3565-4112-b28c-939bba4533d5\") " Feb 26 17:03:14 crc kubenswrapper[5033]: I0226 17:03:14.971747 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0e0d31-3565-4112-b28c-939bba4533d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce0e0d31-3565-4112-b28c-939bba4533d5" (UID: "ce0e0d31-3565-4112-b28c-939bba4533d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:03:14 crc kubenswrapper[5033]: I0226 17:03:14.976518 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0e0d31-3565-4112-b28c-939bba4533d5-kube-api-access-ljxk7" (OuterVolumeSpecName: "kube-api-access-ljxk7") pod "ce0e0d31-3565-4112-b28c-939bba4533d5" (UID: "ce0e0d31-3565-4112-b28c-939bba4533d5"). InnerVolumeSpecName "kube-api-access-ljxk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:03:15 crc kubenswrapper[5033]: I0226 17:03:15.072083 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce0e0d31-3565-4112-b28c-939bba4533d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:15 crc kubenswrapper[5033]: I0226 17:03:15.072126 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljxk7\" (UniqueName: \"kubernetes.io/projected/ce0e0d31-3565-4112-b28c-939bba4533d5-kube-api-access-ljxk7\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:15 crc kubenswrapper[5033]: I0226 17:03:15.543776 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-27qnb" event={"ID":"ce0e0d31-3565-4112-b28c-939bba4533d5","Type":"ContainerDied","Data":"3b543d5f6da31d6f6d1662ab43399b685b385e89ba5b5f93b89c5059ffb863a5"} Feb 26 17:03:15 crc kubenswrapper[5033]: I0226 17:03:15.543823 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b543d5f6da31d6f6d1662ab43399b685b385e89ba5b5f93b89c5059ffb863a5" Feb 26 17:03:15 crc kubenswrapper[5033]: I0226 17:03:15.543870 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-27qnb" Feb 26 17:03:16 crc kubenswrapper[5033]: I0226 17:03:16.287347 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-27qnb"] Feb 26 17:03:16 crc kubenswrapper[5033]: I0226 17:03:16.300801 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-27qnb"] Feb 26 17:03:16 crc kubenswrapper[5033]: I0226 17:03:16.557686 5033 generic.go:334] "Generic (PLEG): container finished" podID="4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" containerID="bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa" exitCode=0 Feb 26 17:03:16 crc kubenswrapper[5033]: I0226 17:03:16.557794 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0","Type":"ContainerDied","Data":"bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa"} Feb 26 17:03:16 crc kubenswrapper[5033]: I0226 17:03:16.560321 5033 generic.go:334] "Generic (PLEG): container finished" podID="e635c835-6242-42f8-8334-beb5c4c60c98" containerID="ce09e21bb0a9375f3976f333cecd75ad522e0a113a021bad3e2fcce062f5502a" exitCode=0 Feb 26 17:03:16 crc kubenswrapper[5033]: I0226 17:03:16.560379 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e635c835-6242-42f8-8334-beb5c4c60c98","Type":"ContainerDied","Data":"ce09e21bb0a9375f3976f333cecd75ad522e0a113a021bad3e2fcce062f5502a"} Feb 26 17:03:17 crc kubenswrapper[5033]: I0226 17:03:17.571429 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e635c835-6242-42f8-8334-beb5c4c60c98","Type":"ContainerStarted","Data":"15f47c040bf51afa81c740cfc61322d5f19b0ec1189587cc39f7f3cfb528550d"} Feb 26 17:03:17 crc kubenswrapper[5033]: I0226 17:03:17.573578 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:03:17 crc kubenswrapper[5033]: I0226 17:03:17.576141 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0","Type":"ContainerStarted","Data":"a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9"} Feb 26 17:03:17 crc kubenswrapper[5033]: I0226 17:03:17.576692 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 26 17:03:17 crc kubenswrapper[5033]: I0226 17:03:17.626795 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.626774104 podStartE2EDuration="37.626774104s" podCreationTimestamp="2026-02-26 17:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:03:17.600616019 +0000 UTC m=+4924.813214669" watchObservedRunningTime="2026-02-26 17:03:17.626774104 +0000 UTC m=+4924.839372754" Feb 26 17:03:17 crc kubenswrapper[5033]: I0226 17:03:17.636524 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0e0d31-3565-4112-b28c-939bba4533d5" path="/var/lib/kubelet/pods/ce0e0d31-3565-4112-b28c-939bba4533d5/volumes" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.273519 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.273495779 podStartE2EDuration="42.273495779s" podCreationTimestamp="2026-02-26 17:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:03:17.623604248 +0000 UTC m=+4924.836202898" watchObservedRunningTime="2026-02-26 17:03:21.273495779 +0000 UTC m=+4928.486094419" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.274394 5033 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/root-account-create-update-sskqf"] Feb 26 17:03:21 crc kubenswrapper[5033]: E0226 17:03:21.274767 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0e0d31-3565-4112-b28c-939bba4533d5" containerName="mariadb-account-create-update" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.274783 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0e0d31-3565-4112-b28c-939bba4533d5" containerName="mariadb-account-create-update" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.274954 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0e0d31-3565-4112-b28c-939bba4533d5" containerName="mariadb-account-create-update" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.275571 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sskqf" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.276979 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.280765 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sskqf"] Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.371798 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bcf0cce-fae8-4d24-b107-fff2c4787a42-operator-scripts\") pod \"root-account-create-update-sskqf\" (UID: \"1bcf0cce-fae8-4d24-b107-fff2c4787a42\") " pod="openstack/root-account-create-update-sskqf" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.371873 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bknv6\" (UniqueName: \"kubernetes.io/projected/1bcf0cce-fae8-4d24-b107-fff2c4787a42-kube-api-access-bknv6\") pod \"root-account-create-update-sskqf\" 
(UID: \"1bcf0cce-fae8-4d24-b107-fff2c4787a42\") " pod="openstack/root-account-create-update-sskqf" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.473074 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bcf0cce-fae8-4d24-b107-fff2c4787a42-operator-scripts\") pod \"root-account-create-update-sskqf\" (UID: \"1bcf0cce-fae8-4d24-b107-fff2c4787a42\") " pod="openstack/root-account-create-update-sskqf" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.473142 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bknv6\" (UniqueName: \"kubernetes.io/projected/1bcf0cce-fae8-4d24-b107-fff2c4787a42-kube-api-access-bknv6\") pod \"root-account-create-update-sskqf\" (UID: \"1bcf0cce-fae8-4d24-b107-fff2c4787a42\") " pod="openstack/root-account-create-update-sskqf" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.474333 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bcf0cce-fae8-4d24-b107-fff2c4787a42-operator-scripts\") pod \"root-account-create-update-sskqf\" (UID: \"1bcf0cce-fae8-4d24-b107-fff2c4787a42\") " pod="openstack/root-account-create-update-sskqf" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.499955 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bknv6\" (UniqueName: \"kubernetes.io/projected/1bcf0cce-fae8-4d24-b107-fff2c4787a42-kube-api-access-bknv6\") pod \"root-account-create-update-sskqf\" (UID: \"1bcf0cce-fae8-4d24-b107-fff2c4787a42\") " pod="openstack/root-account-create-update-sskqf" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.594344 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-sskqf" Feb 26 17:03:21 crc kubenswrapper[5033]: I0226 17:03:21.852323 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sskqf"] Feb 26 17:03:22 crc kubenswrapper[5033]: I0226 17:03:22.617681 5033 generic.go:334] "Generic (PLEG): container finished" podID="1bcf0cce-fae8-4d24-b107-fff2c4787a42" containerID="4c26a5e4b2f6dcd406405fd65a5893444fc444113bda9573269d48ce80b09161" exitCode=0 Feb 26 17:03:22 crc kubenswrapper[5033]: I0226 17:03:22.617736 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sskqf" event={"ID":"1bcf0cce-fae8-4d24-b107-fff2c4787a42","Type":"ContainerDied","Data":"4c26a5e4b2f6dcd406405fd65a5893444fc444113bda9573269d48ce80b09161"} Feb 26 17:03:22 crc kubenswrapper[5033]: I0226 17:03:22.617768 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sskqf" event={"ID":"1bcf0cce-fae8-4d24-b107-fff2c4787a42","Type":"ContainerStarted","Data":"903ab67e06eacbed3e3a79e8b6b7ac45bf71b08776ce30db058d7cb82bf7dc73"} Feb 26 17:03:24 crc kubenswrapper[5033]: I0226 17:03:24.013248 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-sskqf" Feb 26 17:03:24 crc kubenswrapper[5033]: I0226 17:03:24.134635 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bknv6\" (UniqueName: \"kubernetes.io/projected/1bcf0cce-fae8-4d24-b107-fff2c4787a42-kube-api-access-bknv6\") pod \"1bcf0cce-fae8-4d24-b107-fff2c4787a42\" (UID: \"1bcf0cce-fae8-4d24-b107-fff2c4787a42\") " Feb 26 17:03:24 crc kubenswrapper[5033]: I0226 17:03:24.134708 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bcf0cce-fae8-4d24-b107-fff2c4787a42-operator-scripts\") pod \"1bcf0cce-fae8-4d24-b107-fff2c4787a42\" (UID: \"1bcf0cce-fae8-4d24-b107-fff2c4787a42\") " Feb 26 17:03:24 crc kubenswrapper[5033]: I0226 17:03:24.136511 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bcf0cce-fae8-4d24-b107-fff2c4787a42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1bcf0cce-fae8-4d24-b107-fff2c4787a42" (UID: "1bcf0cce-fae8-4d24-b107-fff2c4787a42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:03:24 crc kubenswrapper[5033]: I0226 17:03:24.142649 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bcf0cce-fae8-4d24-b107-fff2c4787a42-kube-api-access-bknv6" (OuterVolumeSpecName: "kube-api-access-bknv6") pod "1bcf0cce-fae8-4d24-b107-fff2c4787a42" (UID: "1bcf0cce-fae8-4d24-b107-fff2c4787a42"). InnerVolumeSpecName "kube-api-access-bknv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:03:24 crc kubenswrapper[5033]: I0226 17:03:24.237874 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bknv6\" (UniqueName: \"kubernetes.io/projected/1bcf0cce-fae8-4d24-b107-fff2c4787a42-kube-api-access-bknv6\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:24 crc kubenswrapper[5033]: I0226 17:03:24.237944 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bcf0cce-fae8-4d24-b107-fff2c4787a42-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:24 crc kubenswrapper[5033]: I0226 17:03:24.639712 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sskqf" event={"ID":"1bcf0cce-fae8-4d24-b107-fff2c4787a42","Type":"ContainerDied","Data":"903ab67e06eacbed3e3a79e8b6b7ac45bf71b08776ce30db058d7cb82bf7dc73"} Feb 26 17:03:24 crc kubenswrapper[5033]: I0226 17:03:24.639768 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="903ab67e06eacbed3e3a79e8b6b7ac45bf71b08776ce30db058d7cb82bf7dc73" Feb 26 17:03:24 crc kubenswrapper[5033]: I0226 17:03:24.639848 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-sskqf" Feb 26 17:03:31 crc kubenswrapper[5033]: I0226 17:03:31.413319 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 26 17:03:31 crc kubenswrapper[5033]: I0226 17:03:31.659481 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:03:34 crc kubenswrapper[5033]: I0226 17:03:34.902025 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-bls56"] Feb 26 17:03:34 crc kubenswrapper[5033]: E0226 17:03:34.903084 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bcf0cce-fae8-4d24-b107-fff2c4787a42" containerName="mariadb-account-create-update" Feb 26 17:03:34 crc kubenswrapper[5033]: I0226 17:03:34.903107 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bcf0cce-fae8-4d24-b107-fff2c4787a42" containerName="mariadb-account-create-update" Feb 26 17:03:34 crc kubenswrapper[5033]: I0226 17:03:34.903403 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bcf0cce-fae8-4d24-b107-fff2c4787a42" containerName="mariadb-account-create-update" Feb 26 17:03:34 crc kubenswrapper[5033]: I0226 17:03:34.906522 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:34 crc kubenswrapper[5033]: I0226 17:03:34.921889 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-bls56"] Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.046201 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-config\") pod \"dnsmasq-dns-5b7946d7b9-bls56\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.046307 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhhw6\" (UniqueName: \"kubernetes.io/projected/77140609-456c-4eeb-84ef-59b661196b87-kube-api-access-zhhw6\") pod \"dnsmasq-dns-5b7946d7b9-bls56\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.046460 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-bls56\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.151342 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-bls56\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.151627 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-config\") pod \"dnsmasq-dns-5b7946d7b9-bls56\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.151723 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhhw6\" (UniqueName: \"kubernetes.io/projected/77140609-456c-4eeb-84ef-59b661196b87-kube-api-access-zhhw6\") pod \"dnsmasq-dns-5b7946d7b9-bls56\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.152399 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-bls56\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.152717 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-config\") pod \"dnsmasq-dns-5b7946d7b9-bls56\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.170144 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhhw6\" (UniqueName: \"kubernetes.io/projected/77140609-456c-4eeb-84ef-59b661196b87-kube-api-access-zhhw6\") pod \"dnsmasq-dns-5b7946d7b9-bls56\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.238556 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.679767 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.706313 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-bls56"] Feb 26 17:03:35 crc kubenswrapper[5033]: I0226 17:03:35.736426 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" event={"ID":"77140609-456c-4eeb-84ef-59b661196b87","Type":"ContainerStarted","Data":"7274f91ba01629937ce1f5065dd453920575c4d0b80a6c2149af62a45bba6999"} Feb 26 17:03:36 crc kubenswrapper[5033]: I0226 17:03:36.505609 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 17:03:36 crc kubenswrapper[5033]: I0226 17:03:36.743614 5033 generic.go:334] "Generic (PLEG): container finished" podID="77140609-456c-4eeb-84ef-59b661196b87" containerID="7a333d76fe60a294f332b68687bd7752972996681ee5f8842ab30d8931825488" exitCode=0 Feb 26 17:03:36 crc kubenswrapper[5033]: I0226 17:03:36.743656 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" event={"ID":"77140609-456c-4eeb-84ef-59b661196b87","Type":"ContainerDied","Data":"7a333d76fe60a294f332b68687bd7752972996681ee5f8842ab30d8931825488"} Feb 26 17:03:37 crc kubenswrapper[5033]: I0226 17:03:37.753565 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" event={"ID":"77140609-456c-4eeb-84ef-59b661196b87","Type":"ContainerStarted","Data":"2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09"} Feb 26 17:03:37 crc kubenswrapper[5033]: I0226 17:03:37.754024 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:37 crc kubenswrapper[5033]: I0226 17:03:37.774990 5033 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" podStartSLOduration=3.774969406 podStartE2EDuration="3.774969406s" podCreationTimestamp="2026-02-26 17:03:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:03:37.771357278 +0000 UTC m=+4944.983955958" watchObservedRunningTime="2026-02-26 17:03:37.774969406 +0000 UTC m=+4944.987568046" Feb 26 17:03:37 crc kubenswrapper[5033]: I0226 17:03:37.792039 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" containerName="rabbitmq" containerID="cri-o://a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9" gracePeriod=604798 Feb 26 17:03:38 crc kubenswrapper[5033]: I0226 17:03:38.425134 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e635c835-6242-42f8-8334-beb5c4c60c98" containerName="rabbitmq" containerID="cri-o://15f47c040bf51afa81c740cfc61322d5f19b0ec1189587cc39f7f3cfb528550d" gracePeriod=604799 Feb 26 17:03:41 crc kubenswrapper[5033]: I0226 17:03:41.411378 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.15:5672: connect: connection refused" Feb 26 17:03:41 crc kubenswrapper[5033]: I0226 17:03:41.657755 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e635c835-6242-42f8-8334-beb5c4c60c98" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.17:5672: connect: connection refused" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.693150 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.732094 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-pod-info\") pod \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.732191 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-plugins\") pod \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.732233 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-erlang-cookie\") pod \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.732303 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snbpw\" (UniqueName: \"kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-kube-api-access-snbpw\") pod \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.732351 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-server-conf\") pod \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.732380 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-erlang-cookie-secret\") pod \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.732413 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-plugins-conf\") pod \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.732476 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-confd\") pod \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.733793 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" (UID: "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.734040 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" (UID: "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.734219 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" (UID: "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.739105 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-pod-info" (OuterVolumeSpecName: "pod-info") pod "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" (UID: "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.739507 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" (UID: "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.750878 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-kube-api-access-snbpw" (OuterVolumeSpecName: "kube-api-access-snbpw") pod "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" (UID: "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0"). InnerVolumeSpecName "kube-api-access-snbpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.763253 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-server-conf" (OuterVolumeSpecName: "server-conf") pod "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" (UID: "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.816361 5033 generic.go:334] "Generic (PLEG): container finished" podID="e635c835-6242-42f8-8334-beb5c4c60c98" containerID="15f47c040bf51afa81c740cfc61322d5f19b0ec1189587cc39f7f3cfb528550d" exitCode=0 Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.816432 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e635c835-6242-42f8-8334-beb5c4c60c98","Type":"ContainerDied","Data":"15f47c040bf51afa81c740cfc61322d5f19b0ec1189587cc39f7f3cfb528550d"} Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.818730 5033 generic.go:334] "Generic (PLEG): container finished" podID="4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" containerID="a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9" exitCode=0 Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.818819 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.818821 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0","Type":"ContainerDied","Data":"a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9"} Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.819631 5033 scope.go:117] "RemoveContainer" containerID="a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.819429 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0","Type":"ContainerDied","Data":"42f590217595287a5fc8776cc10cebcdc1a60182709397c5a1da250b19536576"} Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.833722 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\") pod \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\" (UID: \"4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0\") " Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.834491 5033 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-pod-info\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.834511 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.834521 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-erlang-cookie\") on node 
\"crc\" DevicePath \"\"" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.834535 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snbpw\" (UniqueName: \"kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-kube-api-access-snbpw\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.834547 5033 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.834557 5033 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.834568 5033 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.839643 5033 scope.go:117] "RemoveContainer" containerID="bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.872695 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5" (OuterVolumeSpecName: "persistence") pod "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" (UID: "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0"). InnerVolumeSpecName "pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.885602 5033 scope.go:117] "RemoveContainer" containerID="a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9" Feb 26 17:03:44 crc kubenswrapper[5033]: E0226 17:03:44.886811 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9\": container with ID starting with a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9 not found: ID does not exist" containerID="a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.886837 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9"} err="failed to get container status \"a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9\": rpc error: code = NotFound desc = could not find container \"a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9\": container with ID starting with a6e557011d2e424c2690465a6b2abc5f240668b65dad4d3b6d862863d88753f9 not found: ID does not exist" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.886857 5033 scope.go:117] "RemoveContainer" containerID="bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa" Feb 26 17:03:44 crc kubenswrapper[5033]: E0226 17:03:44.887533 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa\": container with ID starting with bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa not found: ID does not exist" containerID="bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.887570 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa"} err="failed to get container status \"bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa\": rpc error: code = NotFound desc = could not find container \"bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa\": container with ID starting with bc0664d8f48daafe7bba310da622a716bd9aa79b8bdcd97594c4ce9909aa05aa not found: ID does not exist" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.917531 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" (UID: "4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.940446 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.940532 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\") on node \"crc\" " Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.964017 5033 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 26 17:03:44 crc kubenswrapper[5033]: I0226 17:03:44.964186 5033 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5") on node "crc" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.028129 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.042137 5033 reconciler_common.go:293] "Volume detached for volume \"pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.143277 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e635c835-6242-42f8-8334-beb5c4c60c98-pod-info\") pod \"e635c835-6242-42f8-8334-beb5c4c60c98\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.143354 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-plugins\") pod \"e635c835-6242-42f8-8334-beb5c4c60c98\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.143410 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-confd\") pod \"e635c835-6242-42f8-8334-beb5c4c60c98\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.143477 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e635c835-6242-42f8-8334-beb5c4c60c98-erlang-cookie-secret\") pod \"e635c835-6242-42f8-8334-beb5c4c60c98\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.143528 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-589c2\" (UniqueName: \"kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-kube-api-access-589c2\") pod \"e635c835-6242-42f8-8334-beb5c4c60c98\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.143572 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-server-conf\") pod \"e635c835-6242-42f8-8334-beb5c4c60c98\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.143601 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-erlang-cookie\") pod \"e635c835-6242-42f8-8334-beb5c4c60c98\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.143623 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-plugins-conf\") pod \"e635c835-6242-42f8-8334-beb5c4c60c98\" (UID: \"e635c835-6242-42f8-8334-beb5c4c60c98\") " Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.143765 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3\") pod \"e635c835-6242-42f8-8334-beb5c4c60c98\" (UID: 
\"e635c835-6242-42f8-8334-beb5c4c60c98\") " Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.144349 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e635c835-6242-42f8-8334-beb5c4c60c98" (UID: "e635c835-6242-42f8-8334-beb5c4c60c98"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.144599 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e635c835-6242-42f8-8334-beb5c4c60c98" (UID: "e635c835-6242-42f8-8334-beb5c4c60c98"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.145042 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e635c835-6242-42f8-8334-beb5c4c60c98" (UID: "e635c835-6242-42f8-8334-beb5c4c60c98"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.149277 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e635c835-6242-42f8-8334-beb5c4c60c98-pod-info" (OuterVolumeSpecName: "pod-info") pod "e635c835-6242-42f8-8334-beb5c4c60c98" (UID: "e635c835-6242-42f8-8334-beb5c4c60c98"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.151586 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e635c835-6242-42f8-8334-beb5c4c60c98-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e635c835-6242-42f8-8334-beb5c4c60c98" (UID: "e635c835-6242-42f8-8334-beb5c4c60c98"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.151988 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-kube-api-access-589c2" (OuterVolumeSpecName: "kube-api-access-589c2") pod "e635c835-6242-42f8-8334-beb5c4c60c98" (UID: "e635c835-6242-42f8-8334-beb5c4c60c98"). InnerVolumeSpecName "kube-api-access-589c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.158420 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.161625 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.165286 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3" (OuterVolumeSpecName: "persistence") pod "e635c835-6242-42f8-8334-beb5c4c60c98" (UID: "e635c835-6242-42f8-8334-beb5c4c60c98"). InnerVolumeSpecName "pvc-26578078-8825-4f0e-9145-bb27738225c3". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.178136 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-server-conf" (OuterVolumeSpecName: "server-conf") pod "e635c835-6242-42f8-8334-beb5c4c60c98" (UID: "e635c835-6242-42f8-8334-beb5c4c60c98"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.182427 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 17:03:45 crc kubenswrapper[5033]: E0226 17:03:45.182799 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e635c835-6242-42f8-8334-beb5c4c60c98" containerName="rabbitmq" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.182838 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e635c835-6242-42f8-8334-beb5c4c60c98" containerName="rabbitmq" Feb 26 17:03:45 crc kubenswrapper[5033]: E0226 17:03:45.182877 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e635c835-6242-42f8-8334-beb5c4c60c98" containerName="setup-container" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.182886 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e635c835-6242-42f8-8334-beb5c4c60c98" containerName="setup-container" Feb 26 17:03:45 crc kubenswrapper[5033]: E0226 17:03:45.182902 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" containerName="rabbitmq" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.182911 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" containerName="rabbitmq" Feb 26 17:03:45 crc kubenswrapper[5033]: E0226 17:03:45.182928 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" 
containerName="setup-container" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.182936 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" containerName="setup-container" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.183099 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e635c835-6242-42f8-8334-beb5c4c60c98" containerName="rabbitmq" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.183119 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" containerName="rabbitmq" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.184158 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.187321 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.187389 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.187401 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.187403 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-65xpw" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.187505 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.190661 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.232088 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e635c835-6242-42f8-8334-beb5c4c60c98" (UID: "e635c835-6242-42f8-8334-beb5c4c60c98"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.241402 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.246182 5033 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e635c835-6242-42f8-8334-beb5c4c60c98-pod-info\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.246212 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.246223 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.246234 5033 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e635c835-6242-42f8-8334-beb5c4c60c98-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.246244 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-589c2\" (UniqueName: \"kubernetes.io/projected/e635c835-6242-42f8-8334-beb5c4c60c98-kube-api-access-589c2\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.246266 5033 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.246275 5033 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e635c835-6242-42f8-8334-beb5c4c60c98-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.246284 5033 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e635c835-6242-42f8-8334-beb5c4c60c98-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.246317 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-26578078-8825-4f0e-9145-bb27738225c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3\") on node \"crc\" " Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.268169 5033 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.268361 5033 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-26578078-8825-4f0e-9145-bb27738225c3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3") on node "crc"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.299558 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-bg7hg"]
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.299829 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" podUID="ae6528f1-15da-49f3-a1cd-6170da742302" containerName="dnsmasq-dns" containerID="cri-o://46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f" gracePeriod=10
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.348957 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/69f6b213-b08e-463c-9ddd-5e8b46861c4b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.349582 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69f6b213-b08e-463c-9ddd-5e8b46861c4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.349889 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69f6b213-b08e-463c-9ddd-5e8b46861c4b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.349922 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69f6b213-b08e-463c-9ddd-5e8b46861c4b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.349961 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69f6b213-b08e-463c-9ddd-5e8b46861c4b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.350030 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69f6b213-b08e-463c-9ddd-5e8b46861c4b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.350190 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.350247 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69f6b213-b08e-463c-9ddd-5e8b46861c4b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.350326 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv4p5\" (UniqueName: \"kubernetes.io/projected/69f6b213-b08e-463c-9ddd-5e8b46861c4b-kube-api-access-cv4p5\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.350672 5033 reconciler_common.go:293] "Volume detached for volume \"pvc-26578078-8825-4f0e-9145-bb27738225c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3\") on node \"crc\" DevicePath \"\""
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.451748 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69f6b213-b08e-463c-9ddd-5e8b46861c4b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.451830 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.451901 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69f6b213-b08e-463c-9ddd-5e8b46861c4b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.451925 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv4p5\" (UniqueName: \"kubernetes.io/projected/69f6b213-b08e-463c-9ddd-5e8b46861c4b-kube-api-access-cv4p5\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.451957 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/69f6b213-b08e-463c-9ddd-5e8b46861c4b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.451992 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69f6b213-b08e-463c-9ddd-5e8b46861c4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.452009 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69f6b213-b08e-463c-9ddd-5e8b46861c4b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.452029 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69f6b213-b08e-463c-9ddd-5e8b46861c4b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.452055 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69f6b213-b08e-463c-9ddd-5e8b46861c4b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.453039 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69f6b213-b08e-463c-9ddd-5e8b46861c4b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.453091 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69f6b213-b08e-463c-9ddd-5e8b46861c4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.453440 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69f6b213-b08e-463c-9ddd-5e8b46861c4b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.453630 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/69f6b213-b08e-463c-9ddd-5e8b46861c4b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.456174 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.456213 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/da68696fe89735bd81c8ec884c6f1b5859609de9dd0b0282954bef495b7b005c/globalmount\"" pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.457129 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69f6b213-b08e-463c-9ddd-5e8b46861c4b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.457234 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69f6b213-b08e-463c-9ddd-5e8b46861c4b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.460188 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69f6b213-b08e-463c-9ddd-5e8b46861c4b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.472221 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv4p5\" (UniqueName: \"kubernetes.io/projected/69f6b213-b08e-463c-9ddd-5e8b46861c4b-kube-api-access-cv4p5\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.504791 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-daef7d7f-a4d6-47e5-bef3-3e3d316427c5\") pod \"rabbitmq-server-0\" (UID: \"69f6b213-b08e-463c-9ddd-5e8b46861c4b\") " pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.594077 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.646549 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0" path="/var/lib/kubelet/pods/4e9bf668-2fd2-432d-8ba9-a6a9d45cc0a0/volumes"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.684478 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.827321 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e635c835-6242-42f8-8334-beb5c4c60c98","Type":"ContainerDied","Data":"08a4df109b7dfa086c010faffe7904f66a45284a92f1938a14a449b512152593"}
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.827368 5033 scope.go:117] "RemoveContainer" containerID="15f47c040bf51afa81c740cfc61322d5f19b0ec1189587cc39f7f3cfb528550d"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.827375 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.844852 5033 generic.go:334] "Generic (PLEG): container finished" podID="ae6528f1-15da-49f3-a1cd-6170da742302" containerID="46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f" exitCode=0
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.845217 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" event={"ID":"ae6528f1-15da-49f3-a1cd-6170da742302","Type":"ContainerDied","Data":"46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f"}
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.845277 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg" event={"ID":"ae6528f1-15da-49f3-a1cd-6170da742302","Type":"ContainerDied","Data":"281f3e1c3977b8d6cd49f7375ddbb7bec06c9c0d52d78a1f419e55ffb2019f67"}
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.845337 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-bg7hg"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.853445 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.857683 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2b42\" (UniqueName: \"kubernetes.io/projected/ae6528f1-15da-49f3-a1cd-6170da742302-kube-api-access-j2b42\") pod \"ae6528f1-15da-49f3-a1cd-6170da742302\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") "
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.857779 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-config\") pod \"ae6528f1-15da-49f3-a1cd-6170da742302\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") "
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.857822 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-dns-svc\") pod \"ae6528f1-15da-49f3-a1cd-6170da742302\" (UID: \"ae6528f1-15da-49f3-a1cd-6170da742302\") "
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.859959 5033 scope.go:117] "RemoveContainer" containerID="ce09e21bb0a9375f3976f333cecd75ad522e0a113a021bad3e2fcce062f5502a"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.860246 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.871460 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae6528f1-15da-49f3-a1cd-6170da742302-kube-api-access-j2b42" (OuterVolumeSpecName: "kube-api-access-j2b42") pod "ae6528f1-15da-49f3-a1cd-6170da742302" (UID: "ae6528f1-15da-49f3-a1cd-6170da742302"). InnerVolumeSpecName "kube-api-access-j2b42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.873408 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 17:03:45 crc kubenswrapper[5033]: E0226 17:03:45.873680 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae6528f1-15da-49f3-a1cd-6170da742302" containerName="dnsmasq-dns"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.873697 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae6528f1-15da-49f3-a1cd-6170da742302" containerName="dnsmasq-dns"
Feb 26 17:03:45 crc kubenswrapper[5033]: E0226 17:03:45.873734 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae6528f1-15da-49f3-a1cd-6170da742302" containerName="init"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.873740 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae6528f1-15da-49f3-a1cd-6170da742302" containerName="init"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.873866 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae6528f1-15da-49f3-a1cd-6170da742302" containerName="dnsmasq-dns"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.874940 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.878976 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.879231 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.879572 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.879733 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-42gh2"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.879891 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.900523 5033 scope.go:117] "RemoveContainer" containerID="46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.916948 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae6528f1-15da-49f3-a1cd-6170da742302" (UID: "ae6528f1-15da-49f3-a1cd-6170da742302"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.921952 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.923517 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-config" (OuterVolumeSpecName: "config") pod "ae6528f1-15da-49f3-a1cd-6170da742302" (UID: "ae6528f1-15da-49f3-a1cd-6170da742302"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.923556 5033 scope.go:117] "RemoveContainer" containerID="2093df78c798c86a3b6444e237a5b93073620b043c747856b60c0960a63725e9"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.939014 5033 scope.go:117] "RemoveContainer" containerID="46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f"
Feb 26 17:03:45 crc kubenswrapper[5033]: E0226 17:03:45.939385 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f\": container with ID starting with 46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f not found: ID does not exist" containerID="46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.939433 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f"} err="failed to get container status \"46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f\": rpc error: code = NotFound desc = could not find container \"46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f\": container with ID starting with 46a9b8e6ab11b2d995b64750591bc4b5c18f68953f4a4d8e62baae0d1fdd145f not found: ID does not exist"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.939474 5033 scope.go:117] "RemoveContainer" containerID="2093df78c798c86a3b6444e237a5b93073620b043c747856b60c0960a63725e9"
Feb 26 17:03:45 crc kubenswrapper[5033]: E0226 17:03:45.939775 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2093df78c798c86a3b6444e237a5b93073620b043c747856b60c0960a63725e9\": container with ID starting with 2093df78c798c86a3b6444e237a5b93073620b043c747856b60c0960a63725e9 not found: ID does not exist" containerID="2093df78c798c86a3b6444e237a5b93073620b043c747856b60c0960a63725e9"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.939807 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2093df78c798c86a3b6444e237a5b93073620b043c747856b60c0960a63725e9"} err="failed to get container status \"2093df78c798c86a3b6444e237a5b93073620b043c747856b60c0960a63725e9\": rpc error: code = NotFound desc = could not find container \"2093df78c798c86a3b6444e237a5b93073620b043c747856b60c0960a63725e9\": container with ID starting with 2093df78c798c86a3b6444e237a5b93073620b043c747856b60c0960a63725e9 not found: ID does not exist"
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.960026 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-config\") on node \"crc\" DevicePath \"\""
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.960127 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6528f1-15da-49f3-a1cd-6170da742302-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 17:03:45 crc kubenswrapper[5033]: I0226 17:03:45.960140 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2b42\" (UniqueName: \"kubernetes.io/projected/ae6528f1-15da-49f3-a1cd-6170da742302-kube-api-access-j2b42\") on node \"crc\" DevicePath \"\""
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.059222 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.060816 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-26578078-8825-4f0e-9145-bb27738225c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.060874 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2616fe4-bf52-4cf1-a08e-86e821cf6935-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.060923 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2616fe4-bf52-4cf1-a08e-86e821cf6935-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.060956 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2616fe4-bf52-4cf1-a08e-86e821cf6935-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.060974 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e2616fe4-bf52-4cf1-a08e-86e821cf6935-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.061000 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24cs6\" (UniqueName: \"kubernetes.io/projected/e2616fe4-bf52-4cf1-a08e-86e821cf6935-kube-api-access-24cs6\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.061015 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2616fe4-bf52-4cf1-a08e-86e821cf6935-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.061040 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2616fe4-bf52-4cf1-a08e-86e821cf6935-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.061059 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2616fe4-bf52-4cf1-a08e-86e821cf6935-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: W0226 17:03:46.065081 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69f6b213_b08e_463c_9ddd_5e8b46861c4b.slice/crio-348be565173e52ba5b5c03f2b59562d56b4738ec5776a66a371a5d5f07e3cf3d WatchSource:0}: Error finding container 348be565173e52ba5b5c03f2b59562d56b4738ec5776a66a371a5d5f07e3cf3d: Status 404 returned error can't find the container with id 348be565173e52ba5b5c03f2b59562d56b4738ec5776a66a371a5d5f07e3cf3d
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.166798 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2616fe4-bf52-4cf1-a08e-86e821cf6935-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.166874 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2616fe4-bf52-4cf1-a08e-86e821cf6935-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.166903 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e2616fe4-bf52-4cf1-a08e-86e821cf6935-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.166947 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24cs6\" (UniqueName: \"kubernetes.io/projected/e2616fe4-bf52-4cf1-a08e-86e821cf6935-kube-api-access-24cs6\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.166976 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2616fe4-bf52-4cf1-a08e-86e821cf6935-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.167022 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2616fe4-bf52-4cf1-a08e-86e821cf6935-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.167049 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2616fe4-bf52-4cf1-a08e-86e821cf6935-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.167093 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-26578078-8825-4f0e-9145-bb27738225c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.167141 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2616fe4-bf52-4cf1-a08e-86e821cf6935-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.171561 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2616fe4-bf52-4cf1-a08e-86e821cf6935-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.172540 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2616fe4-bf52-4cf1-a08e-86e821cf6935-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.173399 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2616fe4-bf52-4cf1-a08e-86e821cf6935-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.173936 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2616fe4-bf52-4cf1-a08e-86e821cf6935-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.174816 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e2616fe4-bf52-4cf1-a08e-86e821cf6935-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.182522 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2616fe4-bf52-4cf1-a08e-86e821cf6935-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.182892 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2616fe4-bf52-4cf1-a08e-86e821cf6935-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.183562 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.183590 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-26578078-8825-4f0e-9145-bb27738225c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d8409dfabed18f449aa186f8cff34535814ae624bb3ea725c5c133efd1f1f9ee/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.186490 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24cs6\" (UniqueName: \"kubernetes.io/projected/e2616fe4-bf52-4cf1-a08e-86e821cf6935-kube-api-access-24cs6\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.188596 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-bg7hg"]
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.195250 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-bg7hg"]
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.207546 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-26578078-8825-4f0e-9145-bb27738225c3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26578078-8825-4f0e-9145-bb27738225c3\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2616fe4-bf52-4cf1-a08e-86e821cf6935\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.224392 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.693664 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.853640 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2616fe4-bf52-4cf1-a08e-86e821cf6935","Type":"ContainerStarted","Data":"b316220fbb573e3fdc472b59876361f7f7dd1018cbc24d63ee4722b418635875"}
Feb 26 17:03:46 crc kubenswrapper[5033]: I0226 17:03:46.855455 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"69f6b213-b08e-463c-9ddd-5e8b46861c4b","Type":"ContainerStarted","Data":"348be565173e52ba5b5c03f2b59562d56b4738ec5776a66a371a5d5f07e3cf3d"}
Feb 26 17:03:47 crc kubenswrapper[5033]: I0226 17:03:47.645850 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae6528f1-15da-49f3-a1cd-6170da742302" path="/var/lib/kubelet/pods/ae6528f1-15da-49f3-a1cd-6170da742302/volumes"
Feb 26 17:03:47 crc kubenswrapper[5033]: I0226 17:03:47.647543 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e635c835-6242-42f8-8334-beb5c4c60c98" path="/var/lib/kubelet/pods/e635c835-6242-42f8-8334-beb5c4c60c98/volumes"
Feb 26 17:03:47 crc kubenswrapper[5033]: I0226 17:03:47.869079 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"69f6b213-b08e-463c-9ddd-5e8b46861c4b","Type":"ContainerStarted","Data":"d7c5cdc0a00502098c63cdf7a2eedaa0b5b961c1ebbe3ddc48b616c2184dab27"}
Feb 26 17:03:48 crc kubenswrapper[5033]: I0226 17:03:48.882296 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2616fe4-bf52-4cf1-a08e-86e821cf6935","Type":"ContainerStarted","Data":"852473a2b98a474dbc5953645fd48a64123e8e7b4cbfd39d9aaf31e0388098ee"}
Feb 26 17:04:00 crc kubenswrapper[5033]: I0226 17:04:00.163072 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535424-7vbs9"]
Feb 26 17:04:00 crc kubenswrapper[5033]: I0226 17:04:00.164396 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535424-7vbs9"
Feb 26 17:04:00 crc kubenswrapper[5033]: I0226 17:04:00.166746 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 17:04:00 crc kubenswrapper[5033]: I0226 17:04:00.168677 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 17:04:00 crc kubenswrapper[5033]: I0226 17:04:00.169594 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 17:04:00 crc kubenswrapper[5033]: I0226 17:04:00.174954 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535424-7vbs9"]
Feb 26 17:04:00 crc kubenswrapper[5033]: I0226 17:04:00.316636 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnrpz\" (UniqueName: \"kubernetes.io/projected/be6a1707-3db0-4a1d-9388-70fd900dfd74-kube-api-access-mnrpz\") pod \"auto-csr-approver-29535424-7vbs9\" (UID: \"be6a1707-3db0-4a1d-9388-70fd900dfd74\") " pod="openshift-infra/auto-csr-approver-29535424-7vbs9"
Feb 26 17:04:00 crc kubenswrapper[5033]: I0226 17:04:00.418926 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnrpz\" (UniqueName: \"kubernetes.io/projected/be6a1707-3db0-4a1d-9388-70fd900dfd74-kube-api-access-mnrpz\") pod \"auto-csr-approver-29535424-7vbs9\" (UID: \"be6a1707-3db0-4a1d-9388-70fd900dfd74\") " pod="openshift-infra/auto-csr-approver-29535424-7vbs9"
Feb 26 17:04:00 crc kubenswrapper[5033]: I0226 17:04:00.437330 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnrpz\" (UniqueName: \"kubernetes.io/projected/be6a1707-3db0-4a1d-9388-70fd900dfd74-kube-api-access-mnrpz\") pod \"auto-csr-approver-29535424-7vbs9\" (UID: \"be6a1707-3db0-4a1d-9388-70fd900dfd74\") " pod="openshift-infra/auto-csr-approver-29535424-7vbs9"
Feb 26 17:04:00 crc kubenswrapper[5033]: I0226 17:04:00.494692 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535424-7vbs9"
Feb 26 17:04:00 crc kubenswrapper[5033]: I0226 17:04:00.993571 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535424-7vbs9"]
Feb 26 17:04:02 crc kubenswrapper[5033]: I0226 17:04:02.006816 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535424-7vbs9" event={"ID":"be6a1707-3db0-4a1d-9388-70fd900dfd74","Type":"ContainerStarted","Data":"a8560af58c1ff184624a361f63e958e2d461a23dd09cd01efbcbdedd721c904a"}
Feb 26 17:04:03 crc kubenswrapper[5033]: I0226 17:04:03.022169 5033 generic.go:334] "Generic (PLEG): container finished" podID="be6a1707-3db0-4a1d-9388-70fd900dfd74" containerID="783534fda15e65c018374eab55b4103077d32bf88711eeec080a86edb87768d0" exitCode=0
Feb 26 17:04:03 crc kubenswrapper[5033]: I0226 17:04:03.022293 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535424-7vbs9" event={"ID":"be6a1707-3db0-4a1d-9388-70fd900dfd74","Type":"ContainerDied","Data":"783534fda15e65c018374eab55b4103077d32bf88711eeec080a86edb87768d0"} Feb 26 17:04:04 crc kubenswrapper[5033]: I0226 17:04:04.371825 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535424-7vbs9" Feb 26 17:04:04 crc kubenswrapper[5033]: I0226 17:04:04.501069 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrpz\" (UniqueName: \"kubernetes.io/projected/be6a1707-3db0-4a1d-9388-70fd900dfd74-kube-api-access-mnrpz\") pod \"be6a1707-3db0-4a1d-9388-70fd900dfd74\" (UID: \"be6a1707-3db0-4a1d-9388-70fd900dfd74\") " Feb 26 17:04:04 crc kubenswrapper[5033]: I0226 17:04:04.509989 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be6a1707-3db0-4a1d-9388-70fd900dfd74-kube-api-access-mnrpz" (OuterVolumeSpecName: "kube-api-access-mnrpz") pod "be6a1707-3db0-4a1d-9388-70fd900dfd74" (UID: "be6a1707-3db0-4a1d-9388-70fd900dfd74"). InnerVolumeSpecName "kube-api-access-mnrpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:04:04 crc kubenswrapper[5033]: I0226 17:04:04.603163 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrpz\" (UniqueName: \"kubernetes.io/projected/be6a1707-3db0-4a1d-9388-70fd900dfd74-kube-api-access-mnrpz\") on node \"crc\" DevicePath \"\"" Feb 26 17:04:05 crc kubenswrapper[5033]: I0226 17:04:05.047693 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535424-7vbs9" event={"ID":"be6a1707-3db0-4a1d-9388-70fd900dfd74","Type":"ContainerDied","Data":"a8560af58c1ff184624a361f63e958e2d461a23dd09cd01efbcbdedd721c904a"} Feb 26 17:04:05 crc kubenswrapper[5033]: I0226 17:04:05.047744 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8560af58c1ff184624a361f63e958e2d461a23dd09cd01efbcbdedd721c904a" Feb 26 17:04:05 crc kubenswrapper[5033]: I0226 17:04:05.047798 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535424-7vbs9" Feb 26 17:04:05 crc kubenswrapper[5033]: I0226 17:04:05.460398 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535418-jhmvc"] Feb 26 17:04:05 crc kubenswrapper[5033]: I0226 17:04:05.474116 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535418-jhmvc"] Feb 26 17:04:05 crc kubenswrapper[5033]: I0226 17:04:05.638729 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26980423-2111-46cc-b0cd-b247409db6a5" path="/var/lib/kubelet/pods/26980423-2111-46cc-b0cd-b247409db6a5/volumes" Feb 26 17:04:20 crc kubenswrapper[5033]: I0226 17:04:20.188674 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 26 17:04:20 crc kubenswrapper[5033]: I0226 17:04:20.189191 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:04:21 crc kubenswrapper[5033]: I0226 17:04:21.198150 5033 generic.go:334] "Generic (PLEG): container finished" podID="e2616fe4-bf52-4cf1-a08e-86e821cf6935" containerID="852473a2b98a474dbc5953645fd48a64123e8e7b4cbfd39d9aaf31e0388098ee" exitCode=0 Feb 26 17:04:21 crc kubenswrapper[5033]: I0226 17:04:21.199327 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2616fe4-bf52-4cf1-a08e-86e821cf6935","Type":"ContainerDied","Data":"852473a2b98a474dbc5953645fd48a64123e8e7b4cbfd39d9aaf31e0388098ee"} Feb 26 17:04:21 crc kubenswrapper[5033]: I0226 17:04:21.200060 5033 generic.go:334] "Generic (PLEG): container finished" podID="69f6b213-b08e-463c-9ddd-5e8b46861c4b" containerID="d7c5cdc0a00502098c63cdf7a2eedaa0b5b961c1ebbe3ddc48b616c2184dab27" exitCode=0 Feb 26 17:04:21 crc kubenswrapper[5033]: I0226 17:04:21.200114 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"69f6b213-b08e-463c-9ddd-5e8b46861c4b","Type":"ContainerDied","Data":"d7c5cdc0a00502098c63cdf7a2eedaa0b5b961c1ebbe3ddc48b616c2184dab27"} Feb 26 17:04:22 crc kubenswrapper[5033]: I0226 17:04:22.209515 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"69f6b213-b08e-463c-9ddd-5e8b46861c4b","Type":"ContainerStarted","Data":"aaed2bd53b10f5db4a806ab80e7f889219ebb48dc29d2a5cad63bd4846f3c95d"} Feb 26 17:04:22 crc kubenswrapper[5033]: I0226 17:04:22.210011 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Feb 26 17:04:22 crc kubenswrapper[5033]: I0226 17:04:22.212170 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2616fe4-bf52-4cf1-a08e-86e821cf6935","Type":"ContainerStarted","Data":"2d10887316a6b152335cba971be251b369e97de34a225da5b8844dcd0776f9e9"} Feb 26 17:04:22 crc kubenswrapper[5033]: I0226 17:04:22.212429 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:04:22 crc kubenswrapper[5033]: I0226 17:04:22.235789 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.235770169 podStartE2EDuration="37.235770169s" podCreationTimestamp="2026-02-26 17:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:04:22.235177444 +0000 UTC m=+4989.447776114" watchObservedRunningTime="2026-02-26 17:04:22.235770169 +0000 UTC m=+4989.448368829" Feb 26 17:04:22 crc kubenswrapper[5033]: I0226 17:04:22.261459 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.261437812 podStartE2EDuration="37.261437812s" podCreationTimestamp="2026-02-26 17:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:04:22.258016728 +0000 UTC m=+4989.470615388" watchObservedRunningTime="2026-02-26 17:04:22.261437812 +0000 UTC m=+4989.474036462" Feb 26 17:04:32 crc kubenswrapper[5033]: I0226 17:04:32.760055 5033 scope.go:117] "RemoveContainer" containerID="e62a4bda19c45c778d715e558d22e47c0fde6b8bff69e01198d8f50dc7025c0d" Feb 26 17:04:35 crc kubenswrapper[5033]: I0226 17:04:35.596516 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 
26 17:04:36 crc kubenswrapper[5033]: I0226 17:04:36.227448 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 26 17:04:47 crc kubenswrapper[5033]: I0226 17:04:47.408699 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 26 17:04:47 crc kubenswrapper[5033]: E0226 17:04:47.409634 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be6a1707-3db0-4a1d-9388-70fd900dfd74" containerName="oc" Feb 26 17:04:47 crc kubenswrapper[5033]: I0226 17:04:47.409650 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="be6a1707-3db0-4a1d-9388-70fd900dfd74" containerName="oc" Feb 26 17:04:47 crc kubenswrapper[5033]: I0226 17:04:47.409849 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="be6a1707-3db0-4a1d-9388-70fd900dfd74" containerName="oc" Feb 26 17:04:47 crc kubenswrapper[5033]: I0226 17:04:47.410462 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 26 17:04:47 crc kubenswrapper[5033]: I0226 17:04:47.414710 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gfz8w" Feb 26 17:04:47 crc kubenswrapper[5033]: I0226 17:04:47.427016 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 26 17:04:47 crc kubenswrapper[5033]: I0226 17:04:47.600674 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkqps\" (UniqueName: \"kubernetes.io/projected/c775f783-be6f-49e2-8af4-a5b8fc7c4b7c-kube-api-access-fkqps\") pod \"mariadb-client\" (UID: \"c775f783-be6f-49e2-8af4-a5b8fc7c4b7c\") " pod="openstack/mariadb-client" Feb 26 17:04:47 crc kubenswrapper[5033]: I0226 17:04:47.703797 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkqps\" (UniqueName: 
\"kubernetes.io/projected/c775f783-be6f-49e2-8af4-a5b8fc7c4b7c-kube-api-access-fkqps\") pod \"mariadb-client\" (UID: \"c775f783-be6f-49e2-8af4-a5b8fc7c4b7c\") " pod="openstack/mariadb-client" Feb 26 17:04:47 crc kubenswrapper[5033]: I0226 17:04:47.737153 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkqps\" (UniqueName: \"kubernetes.io/projected/c775f783-be6f-49e2-8af4-a5b8fc7c4b7c-kube-api-access-fkqps\") pod \"mariadb-client\" (UID: \"c775f783-be6f-49e2-8af4-a5b8fc7c4b7c\") " pod="openstack/mariadb-client" Feb 26 17:04:48 crc kubenswrapper[5033]: I0226 17:04:48.033056 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 26 17:04:48 crc kubenswrapper[5033]: I0226 17:04:48.560545 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 26 17:04:49 crc kubenswrapper[5033]: I0226 17:04:49.434047 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"c775f783-be6f-49e2-8af4-a5b8fc7c4b7c","Type":"ContainerStarted","Data":"b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e"} Feb 26 17:04:49 crc kubenswrapper[5033]: I0226 17:04:49.434090 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"c775f783-be6f-49e2-8af4-a5b8fc7c4b7c","Type":"ContainerStarted","Data":"440114ac7248558b5fdd406dbd2335a3b025131ad8c133a4663dff40ed6f43eb"} Feb 26 17:04:49 crc kubenswrapper[5033]: I0226 17:04:49.453008 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.452978034 podStartE2EDuration="2.452978034s" podCreationTimestamp="2026-02-26 17:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:04:49.446539948 +0000 UTC m=+5016.659138628" watchObservedRunningTime="2026-02-26 
17:04:49.452978034 +0000 UTC m=+5016.665576714" Feb 26 17:04:50 crc kubenswrapper[5033]: I0226 17:04:50.189194 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:04:50 crc kubenswrapper[5033]: I0226 17:04:50.189354 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:04:53 crc kubenswrapper[5033]: E0226 17:04:53.808630 5033 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.89:44450->38.102.83.89:39479: write tcp 38.102.83.89:44450->38.102.83.89:39479: write: connection reset by peer Feb 26 17:05:02 crc kubenswrapper[5033]: I0226 17:05:02.675429 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 26 17:05:02 crc kubenswrapper[5033]: I0226 17:05:02.676961 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="c775f783-be6f-49e2-8af4-a5b8fc7c4b7c" containerName="mariadb-client" containerID="cri-o://b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e" gracePeriod=30 Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.266632 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.374443 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkqps\" (UniqueName: \"kubernetes.io/projected/c775f783-be6f-49e2-8af4-a5b8fc7c4b7c-kube-api-access-fkqps\") pod \"c775f783-be6f-49e2-8af4-a5b8fc7c4b7c\" (UID: \"c775f783-be6f-49e2-8af4-a5b8fc7c4b7c\") " Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.381233 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c775f783-be6f-49e2-8af4-a5b8fc7c4b7c-kube-api-access-fkqps" (OuterVolumeSpecName: "kube-api-access-fkqps") pod "c775f783-be6f-49e2-8af4-a5b8fc7c4b7c" (UID: "c775f783-be6f-49e2-8af4-a5b8fc7c4b7c"). InnerVolumeSpecName "kube-api-access-fkqps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.478751 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkqps\" (UniqueName: \"kubernetes.io/projected/c775f783-be6f-49e2-8af4-a5b8fc7c4b7c-kube-api-access-fkqps\") on node \"crc\" DevicePath \"\"" Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.574211 5033 generic.go:334] "Generic (PLEG): container finished" podID="c775f783-be6f-49e2-8af4-a5b8fc7c4b7c" containerID="b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e" exitCode=143 Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.574300 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"c775f783-be6f-49e2-8af4-a5b8fc7c4b7c","Type":"ContainerDied","Data":"b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e"} Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.574344 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"c775f783-be6f-49e2-8af4-a5b8fc7c4b7c","Type":"ContainerDied","Data":"440114ac7248558b5fdd406dbd2335a3b025131ad8c133a4663dff40ed6f43eb"} Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.574371 5033 scope.go:117] "RemoveContainer" containerID="b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e" Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.574513 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.609822 5033 scope.go:117] "RemoveContainer" containerID="b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e" Feb 26 17:05:03 crc kubenswrapper[5033]: E0226 17:05:03.610442 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e\": container with ID starting with b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e not found: ID does not exist" containerID="b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e" Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.610609 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e"} err="failed to get container status \"b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e\": rpc error: code = NotFound desc = could not find container \"b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e\": container with ID starting with b5180bc38a42e564bc80fc01c770eaecaef30172cd7b575590bb5c4728e6ed3e not found: ID does not exist" Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.625869 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 26 17:05:03 crc kubenswrapper[5033]: I0226 17:05:03.646079 5033 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/mariadb-client"] Feb 26 17:05:05 crc kubenswrapper[5033]: I0226 17:05:05.645592 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c775f783-be6f-49e2-8af4-a5b8fc7c4b7c" path="/var/lib/kubelet/pods/c775f783-be6f-49e2-8af4-a5b8fc7c4b7c/volumes" Feb 26 17:05:20 crc kubenswrapper[5033]: I0226 17:05:20.189733 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:05:20 crc kubenswrapper[5033]: I0226 17:05:20.192300 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:05:20 crc kubenswrapper[5033]: I0226 17:05:20.192819 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 17:05:20 crc kubenswrapper[5033]: I0226 17:05:20.193894 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 17:05:20 crc kubenswrapper[5033]: I0226 17:05:20.194154 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" 
containerID="cri-o://bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" gracePeriod=600 Feb 26 17:05:20 crc kubenswrapper[5033]: E0226 17:05:20.320560 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:05:20 crc kubenswrapper[5033]: I0226 17:05:20.735052 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" exitCode=0 Feb 26 17:05:20 crc kubenswrapper[5033]: I0226 17:05:20.735114 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"} Feb 26 17:05:20 crc kubenswrapper[5033]: I0226 17:05:20.735163 5033 scope.go:117] "RemoveContainer" containerID="a4f312ff5f757f37f6f0328584c5b77de391b92d8488a888b49a674b4796ab8e" Feb 26 17:05:20 crc kubenswrapper[5033]: I0226 17:05:20.735960 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" Feb 26 17:05:20 crc kubenswrapper[5033]: E0226 17:05:20.736389 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:05:32 crc kubenswrapper[5033]: I0226 17:05:32.886622 5033 scope.go:117] "RemoveContainer" containerID="c2075e770d3ce75777505f3f2a586c10bb46c80baf0b29ea3fde8c090452d755" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.629416 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" Feb 26 17:05:34 crc kubenswrapper[5033]: E0226 17:05:34.630285 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.683521 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5pdnn"] Feb 26 17:05:34 crc kubenswrapper[5033]: E0226 17:05:34.684123 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c775f783-be6f-49e2-8af4-a5b8fc7c4b7c" containerName="mariadb-client" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.684151 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c775f783-be6f-49e2-8af4-a5b8fc7c4b7c" containerName="mariadb-client" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.684499 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c775f783-be6f-49e2-8af4-a5b8fc7c4b7c" containerName="mariadb-client" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.686362 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.704533 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5pdnn"] Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.804406 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbzd\" (UniqueName: \"kubernetes.io/projected/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-kube-api-access-xcbzd\") pod \"redhat-marketplace-5pdnn\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.804563 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-catalog-content\") pod \"redhat-marketplace-5pdnn\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.804836 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-utilities\") pod \"redhat-marketplace-5pdnn\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.906834 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbzd\" (UniqueName: \"kubernetes.io/projected/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-kube-api-access-xcbzd\") pod \"redhat-marketplace-5pdnn\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.906960 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-catalog-content\") pod \"redhat-marketplace-5pdnn\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.907098 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-utilities\") pod \"redhat-marketplace-5pdnn\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.907763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-utilities\") pod \"redhat-marketplace-5pdnn\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.908499 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-catalog-content\") pod \"redhat-marketplace-5pdnn\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:34 crc kubenswrapper[5033]: I0226 17:05:34.978526 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcbzd\" (UniqueName: \"kubernetes.io/projected/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-kube-api-access-xcbzd\") pod \"redhat-marketplace-5pdnn\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:35 crc kubenswrapper[5033]: I0226 17:05:35.024070 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:35 crc kubenswrapper[5033]: I0226 17:05:35.493599 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5pdnn"] Feb 26 17:05:35 crc kubenswrapper[5033]: I0226 17:05:35.890387 5033 generic.go:334] "Generic (PLEG): container finished" podID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" containerID="a622dc6fea34b9b3276db2c9c1c942ca3fa2585d658eec7689ab5624f451f945" exitCode=0 Feb 26 17:05:35 crc kubenswrapper[5033]: I0226 17:05:35.890463 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5pdnn" event={"ID":"9d62a155-4e19-4e76-a2e3-22cd173cd7bb","Type":"ContainerDied","Data":"a622dc6fea34b9b3276db2c9c1c942ca3fa2585d658eec7689ab5624f451f945"} Feb 26 17:05:35 crc kubenswrapper[5033]: I0226 17:05:35.890514 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5pdnn" event={"ID":"9d62a155-4e19-4e76-a2e3-22cd173cd7bb","Type":"ContainerStarted","Data":"4e29a29a0e627d8c223a8fd0ff616a3fb32018f02c67e485eee8f775d22af8ae"} Feb 26 17:05:35 crc kubenswrapper[5033]: I0226 17:05:35.894805 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 17:05:36 crc kubenswrapper[5033]: I0226 17:05:36.901643 5033 generic.go:334] "Generic (PLEG): container finished" podID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" containerID="c8c79aa2c93a3500759f0e264bd908a0e708b69cad4cff0835225d3afca49d91" exitCode=0 Feb 26 17:05:36 crc kubenswrapper[5033]: I0226 17:05:36.901716 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5pdnn" event={"ID":"9d62a155-4e19-4e76-a2e3-22cd173cd7bb","Type":"ContainerDied","Data":"c8c79aa2c93a3500759f0e264bd908a0e708b69cad4cff0835225d3afca49d91"} Feb 26 17:05:37 crc kubenswrapper[5033]: I0226 17:05:37.913806 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-5pdnn" event={"ID":"9d62a155-4e19-4e76-a2e3-22cd173cd7bb","Type":"ContainerStarted","Data":"7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f"} Feb 26 17:05:37 crc kubenswrapper[5033]: I0226 17:05:37.937363 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5pdnn" podStartSLOduration=2.486147925 podStartE2EDuration="3.937230318s" podCreationTimestamp="2026-02-26 17:05:34 +0000 UTC" firstStartedPulling="2026-02-26 17:05:35.894605031 +0000 UTC m=+5063.107203671" lastFinishedPulling="2026-02-26 17:05:37.345687374 +0000 UTC m=+5064.558286064" observedRunningTime="2026-02-26 17:05:37.927246384 +0000 UTC m=+5065.139845054" watchObservedRunningTime="2026-02-26 17:05:37.937230318 +0000 UTC m=+5065.149828998" Feb 26 17:05:45 crc kubenswrapper[5033]: I0226 17:05:45.024960 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:45 crc kubenswrapper[5033]: I0226 17:05:45.025559 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:45 crc kubenswrapper[5033]: I0226 17:05:45.083452 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:46 crc kubenswrapper[5033]: I0226 17:05:46.136208 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:46 crc kubenswrapper[5033]: I0226 17:05:46.189690 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5pdnn"] Feb 26 17:05:46 crc kubenswrapper[5033]: I0226 17:05:46.628240 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" Feb 26 17:05:46 crc kubenswrapper[5033]: 
E0226 17:05:46.628662 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:05:47 crc kubenswrapper[5033]: I0226 17:05:47.997895 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5pdnn" podUID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" containerName="registry-server" containerID="cri-o://7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f" gracePeriod=2 Feb 26 17:05:48 crc kubenswrapper[5033]: I0226 17:05:48.449624 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:48 crc kubenswrapper[5033]: I0226 17:05:48.540315 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-catalog-content\") pod \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " Feb 26 17:05:48 crc kubenswrapper[5033]: I0226 17:05:48.540615 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcbzd\" (UniqueName: \"kubernetes.io/projected/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-kube-api-access-xcbzd\") pod \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " Feb 26 17:05:48 crc kubenswrapper[5033]: I0226 17:05:48.540671 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-utilities\") pod 
\"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\" (UID: \"9d62a155-4e19-4e76-a2e3-22cd173cd7bb\") " Feb 26 17:05:48 crc kubenswrapper[5033]: I0226 17:05:48.542708 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-utilities" (OuterVolumeSpecName: "utilities") pod "9d62a155-4e19-4e76-a2e3-22cd173cd7bb" (UID: "9d62a155-4e19-4e76-a2e3-22cd173cd7bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:05:48 crc kubenswrapper[5033]: I0226 17:05:48.546795 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-kube-api-access-xcbzd" (OuterVolumeSpecName: "kube-api-access-xcbzd") pod "9d62a155-4e19-4e76-a2e3-22cd173cd7bb" (UID: "9d62a155-4e19-4e76-a2e3-22cd173cd7bb"). InnerVolumeSpecName "kube-api-access-xcbzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:05:48 crc kubenswrapper[5033]: I0226 17:05:48.565092 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d62a155-4e19-4e76-a2e3-22cd173cd7bb" (UID: "9d62a155-4e19-4e76-a2e3-22cd173cd7bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:05:48 crc kubenswrapper[5033]: I0226 17:05:48.642365 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:05:48 crc kubenswrapper[5033]: I0226 17:05:48.642400 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcbzd\" (UniqueName: \"kubernetes.io/projected/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-kube-api-access-xcbzd\") on node \"crc\" DevicePath \"\"" Feb 26 17:05:48 crc kubenswrapper[5033]: I0226 17:05:48.642411 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d62a155-4e19-4e76-a2e3-22cd173cd7bb-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.008366 5033 generic.go:334] "Generic (PLEG): container finished" podID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" containerID="7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f" exitCode=0 Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.008435 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5pdnn" Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.008442 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5pdnn" event={"ID":"9d62a155-4e19-4e76-a2e3-22cd173cd7bb","Type":"ContainerDied","Data":"7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f"} Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.008504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5pdnn" event={"ID":"9d62a155-4e19-4e76-a2e3-22cd173cd7bb","Type":"ContainerDied","Data":"4e29a29a0e627d8c223a8fd0ff616a3fb32018f02c67e485eee8f775d22af8ae"} Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.008526 5033 scope.go:117] "RemoveContainer" containerID="7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f" Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.041740 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5pdnn"] Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.043316 5033 scope.go:117] "RemoveContainer" containerID="c8c79aa2c93a3500759f0e264bd908a0e708b69cad4cff0835225d3afca49d91" Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.048515 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5pdnn"] Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.060189 5033 scope.go:117] "RemoveContainer" containerID="a622dc6fea34b9b3276db2c9c1c942ca3fa2585d658eec7689ab5624f451f945" Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.085446 5033 scope.go:117] "RemoveContainer" containerID="7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f" Feb 26 17:05:49 crc kubenswrapper[5033]: E0226 17:05:49.086158 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f\": container with ID starting with 7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f not found: ID does not exist" containerID="7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f" Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.086200 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f"} err="failed to get container status \"7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f\": rpc error: code = NotFound desc = could not find container \"7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f\": container with ID starting with 7926388c61f0fc12de4d80ee46443ad6c6bf48f25372b1e63b48805e6b73796f not found: ID does not exist" Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.086236 5033 scope.go:117] "RemoveContainer" containerID="c8c79aa2c93a3500759f0e264bd908a0e708b69cad4cff0835225d3afca49d91" Feb 26 17:05:49 crc kubenswrapper[5033]: E0226 17:05:49.086661 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8c79aa2c93a3500759f0e264bd908a0e708b69cad4cff0835225d3afca49d91\": container with ID starting with c8c79aa2c93a3500759f0e264bd908a0e708b69cad4cff0835225d3afca49d91 not found: ID does not exist" containerID="c8c79aa2c93a3500759f0e264bd908a0e708b69cad4cff0835225d3afca49d91" Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.086712 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c79aa2c93a3500759f0e264bd908a0e708b69cad4cff0835225d3afca49d91"} err="failed to get container status \"c8c79aa2c93a3500759f0e264bd908a0e708b69cad4cff0835225d3afca49d91\": rpc error: code = NotFound desc = could not find container \"c8c79aa2c93a3500759f0e264bd908a0e708b69cad4cff0835225d3afca49d91\": container with ID 
starting with c8c79aa2c93a3500759f0e264bd908a0e708b69cad4cff0835225d3afca49d91 not found: ID does not exist" Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.086735 5033 scope.go:117] "RemoveContainer" containerID="a622dc6fea34b9b3276db2c9c1c942ca3fa2585d658eec7689ab5624f451f945" Feb 26 17:05:49 crc kubenswrapper[5033]: E0226 17:05:49.086990 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a622dc6fea34b9b3276db2c9c1c942ca3fa2585d658eec7689ab5624f451f945\": container with ID starting with a622dc6fea34b9b3276db2c9c1c942ca3fa2585d658eec7689ab5624f451f945 not found: ID does not exist" containerID="a622dc6fea34b9b3276db2c9c1c942ca3fa2585d658eec7689ab5624f451f945" Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.087016 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a622dc6fea34b9b3276db2c9c1c942ca3fa2585d658eec7689ab5624f451f945"} err="failed to get container status \"a622dc6fea34b9b3276db2c9c1c942ca3fa2585d658eec7689ab5624f451f945\": rpc error: code = NotFound desc = could not find container \"a622dc6fea34b9b3276db2c9c1c942ca3fa2585d658eec7689ab5624f451f945\": container with ID starting with a622dc6fea34b9b3276db2c9c1c942ca3fa2585d658eec7689ab5624f451f945 not found: ID does not exist" Feb 26 17:05:49 crc kubenswrapper[5033]: I0226 17:05:49.643690 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" path="/var/lib/kubelet/pods/9d62a155-4e19-4e76-a2e3-22cd173cd7bb/volumes" Feb 26 17:05:59 crc kubenswrapper[5033]: I0226 17:05:59.628363 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" Feb 26 17:05:59 crc kubenswrapper[5033]: E0226 17:05:59.641219 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.153289 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535426-j47rw"] Feb 26 17:06:00 crc kubenswrapper[5033]: E0226 17:06:00.153605 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" containerName="extract-utilities" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.153617 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" containerName="extract-utilities" Feb 26 17:06:00 crc kubenswrapper[5033]: E0226 17:06:00.153641 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" containerName="registry-server" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.153647 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" containerName="registry-server" Feb 26 17:06:00 crc kubenswrapper[5033]: E0226 17:06:00.153665 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" containerName="extract-content" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.153671 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" containerName="extract-content" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.153831 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d62a155-4e19-4e76-a2e3-22cd173cd7bb" containerName="registry-server" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.154298 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535426-j47rw" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.157416 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.157675 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.160070 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.177435 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535426-j47rw"] Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.233777 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm2kq\" (UniqueName: \"kubernetes.io/projected/abbf1d0f-7a41-4cc6-8607-16b42059a949-kube-api-access-bm2kq\") pod \"auto-csr-approver-29535426-j47rw\" (UID: \"abbf1d0f-7a41-4cc6-8607-16b42059a949\") " pod="openshift-infra/auto-csr-approver-29535426-j47rw" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.335127 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm2kq\" (UniqueName: \"kubernetes.io/projected/abbf1d0f-7a41-4cc6-8607-16b42059a949-kube-api-access-bm2kq\") pod \"auto-csr-approver-29535426-j47rw\" (UID: \"abbf1d0f-7a41-4cc6-8607-16b42059a949\") " pod="openshift-infra/auto-csr-approver-29535426-j47rw" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.359443 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm2kq\" (UniqueName: \"kubernetes.io/projected/abbf1d0f-7a41-4cc6-8607-16b42059a949-kube-api-access-bm2kq\") pod \"auto-csr-approver-29535426-j47rw\" (UID: \"abbf1d0f-7a41-4cc6-8607-16b42059a949\") " 
pod="openshift-infra/auto-csr-approver-29535426-j47rw" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.476561 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535426-j47rw" Feb 26 17:06:00 crc kubenswrapper[5033]: I0226 17:06:00.920286 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535426-j47rw"] Feb 26 17:06:01 crc kubenswrapper[5033]: I0226 17:06:01.138028 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535426-j47rw" event={"ID":"abbf1d0f-7a41-4cc6-8607-16b42059a949","Type":"ContainerStarted","Data":"a211c474b3f2e337f0ac4dd76413b23d50314d6745e2cf69226904819a7b10e0"} Feb 26 17:06:03 crc kubenswrapper[5033]: I0226 17:06:03.158944 5033 generic.go:334] "Generic (PLEG): container finished" podID="abbf1d0f-7a41-4cc6-8607-16b42059a949" containerID="8811b4cc8dc71d12c9f80c1ddae913550b1d050cf23cef00480eb8a4ac21f472" exitCode=0 Feb 26 17:06:03 crc kubenswrapper[5033]: I0226 17:06:03.159024 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535426-j47rw" event={"ID":"abbf1d0f-7a41-4cc6-8607-16b42059a949","Type":"ContainerDied","Data":"8811b4cc8dc71d12c9f80c1ddae913550b1d050cf23cef00480eb8a4ac21f472"} Feb 26 17:06:04 crc kubenswrapper[5033]: I0226 17:06:04.528913 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535426-j47rw" Feb 26 17:06:04 crc kubenswrapper[5033]: I0226 17:06:04.597990 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm2kq\" (UniqueName: \"kubernetes.io/projected/abbf1d0f-7a41-4cc6-8607-16b42059a949-kube-api-access-bm2kq\") pod \"abbf1d0f-7a41-4cc6-8607-16b42059a949\" (UID: \"abbf1d0f-7a41-4cc6-8607-16b42059a949\") " Feb 26 17:06:04 crc kubenswrapper[5033]: I0226 17:06:04.603157 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abbf1d0f-7a41-4cc6-8607-16b42059a949-kube-api-access-bm2kq" (OuterVolumeSpecName: "kube-api-access-bm2kq") pod "abbf1d0f-7a41-4cc6-8607-16b42059a949" (UID: "abbf1d0f-7a41-4cc6-8607-16b42059a949"). InnerVolumeSpecName "kube-api-access-bm2kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:06:04 crc kubenswrapper[5033]: I0226 17:06:04.700103 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm2kq\" (UniqueName: \"kubernetes.io/projected/abbf1d0f-7a41-4cc6-8607-16b42059a949-kube-api-access-bm2kq\") on node \"crc\" DevicePath \"\"" Feb 26 17:06:05 crc kubenswrapper[5033]: I0226 17:06:05.184476 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535426-j47rw" event={"ID":"abbf1d0f-7a41-4cc6-8607-16b42059a949","Type":"ContainerDied","Data":"a211c474b3f2e337f0ac4dd76413b23d50314d6745e2cf69226904819a7b10e0"} Feb 26 17:06:05 crc kubenswrapper[5033]: I0226 17:06:05.184522 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a211c474b3f2e337f0ac4dd76413b23d50314d6745e2cf69226904819a7b10e0" Feb 26 17:06:05 crc kubenswrapper[5033]: I0226 17:06:05.184523 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535426-j47rw" Feb 26 17:06:05 crc kubenswrapper[5033]: I0226 17:06:05.614765 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535420-hcbwm"] Feb 26 17:06:05 crc kubenswrapper[5033]: I0226 17:06:05.624549 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535420-hcbwm"] Feb 26 17:06:05 crc kubenswrapper[5033]: I0226 17:06:05.638140 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed8b94e-b065-4fd0-add3-23acfc3e9e63" path="/var/lib/kubelet/pods/8ed8b94e-b065-4fd0-add3-23acfc3e9e63/volumes" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.316528 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sv6dz"] Feb 26 17:06:09 crc kubenswrapper[5033]: E0226 17:06:09.317394 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abbf1d0f-7a41-4cc6-8607-16b42059a949" containerName="oc" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.317410 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="abbf1d0f-7a41-4cc6-8607-16b42059a949" containerName="oc" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.317592 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="abbf1d0f-7a41-4cc6-8607-16b42059a949" containerName="oc" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.318820 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.340213 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sv6dz"] Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.375708 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnlsh\" (UniqueName: \"kubernetes.io/projected/d4f213de-ff45-4cf7-831a-6be494b3a9e0-kube-api-access-vnlsh\") pod \"certified-operators-sv6dz\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") " pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.375812 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-catalog-content\") pod \"certified-operators-sv6dz\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") " pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.375906 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-utilities\") pod \"certified-operators-sv6dz\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") " pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.477310 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnlsh\" (UniqueName: \"kubernetes.io/projected/d4f213de-ff45-4cf7-831a-6be494b3a9e0-kube-api-access-vnlsh\") pod \"certified-operators-sv6dz\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") " pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.477412 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-catalog-content\") pod \"certified-operators-sv6dz\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") " pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.477442 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-utilities\") pod \"certified-operators-sv6dz\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") " pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.477962 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-utilities\") pod \"certified-operators-sv6dz\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") " pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.478556 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-catalog-content\") pod \"certified-operators-sv6dz\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") " pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.498902 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnlsh\" (UniqueName: \"kubernetes.io/projected/d4f213de-ff45-4cf7-831a-6be494b3a9e0-kube-api-access-vnlsh\") pod \"certified-operators-sv6dz\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") " pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:09 crc kubenswrapper[5033]: I0226 17:06:09.643692 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:10 crc kubenswrapper[5033]: I0226 17:06:10.175839 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sv6dz"] Feb 26 17:06:10 crc kubenswrapper[5033]: I0226 17:06:10.228535 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv6dz" event={"ID":"d4f213de-ff45-4cf7-831a-6be494b3a9e0","Type":"ContainerStarted","Data":"65b86ee3ae41b030f442d3c7bc8a19cc0822ce2edb313832a13bd1efab65e674"} Feb 26 17:06:11 crc kubenswrapper[5033]: I0226 17:06:11.239864 5033 generic.go:334] "Generic (PLEG): container finished" podID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" containerID="0b3b72beec9a6be645cfb13ea6a1e526d7f6b4948b5495430a19ed69bbdddb59" exitCode=0 Feb 26 17:06:11 crc kubenswrapper[5033]: I0226 17:06:11.239964 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv6dz" event={"ID":"d4f213de-ff45-4cf7-831a-6be494b3a9e0","Type":"ContainerDied","Data":"0b3b72beec9a6be645cfb13ea6a1e526d7f6b4948b5495430a19ed69bbdddb59"} Feb 26 17:06:11 crc kubenswrapper[5033]: I0226 17:06:11.628549 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" Feb 26 17:06:11 crc kubenswrapper[5033]: E0226 17:06:11.629081 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:06:12 crc kubenswrapper[5033]: I0226 17:06:12.249947 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv6dz" 
event={"ID":"d4f213de-ff45-4cf7-831a-6be494b3a9e0","Type":"ContainerStarted","Data":"42cea1212b33757b5323bc51b85b39a5d554ced221b49c0621f7b1b4662bd6ba"} Feb 26 17:06:13 crc kubenswrapper[5033]: I0226 17:06:13.259033 5033 generic.go:334] "Generic (PLEG): container finished" podID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" containerID="42cea1212b33757b5323bc51b85b39a5d554ced221b49c0621f7b1b4662bd6ba" exitCode=0 Feb 26 17:06:13 crc kubenswrapper[5033]: I0226 17:06:13.259355 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv6dz" event={"ID":"d4f213de-ff45-4cf7-831a-6be494b3a9e0","Type":"ContainerDied","Data":"42cea1212b33757b5323bc51b85b39a5d554ced221b49c0621f7b1b4662bd6ba"} Feb 26 17:06:14 crc kubenswrapper[5033]: I0226 17:06:14.268621 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv6dz" event={"ID":"d4f213de-ff45-4cf7-831a-6be494b3a9e0","Type":"ContainerStarted","Data":"5c1546715abeeb8e7013611f85d11a1c69cf8114702f6db2b4fc7ab757918d81"} Feb 26 17:06:14 crc kubenswrapper[5033]: I0226 17:06:14.292279 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sv6dz" podStartSLOduration=2.820614775 podStartE2EDuration="5.292239828s" podCreationTimestamp="2026-02-26 17:06:09 +0000 UTC" firstStartedPulling="2026-02-26 17:06:11.241608376 +0000 UTC m=+5098.454207016" lastFinishedPulling="2026-02-26 17:06:13.713233419 +0000 UTC m=+5100.925832069" observedRunningTime="2026-02-26 17:06:14.286678443 +0000 UTC m=+5101.499277113" watchObservedRunningTime="2026-02-26 17:06:14.292239828 +0000 UTC m=+5101.504838488" Feb 26 17:06:19 crc kubenswrapper[5033]: I0226 17:06:19.644834 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sv6dz" Feb 26 17:06:19 crc kubenswrapper[5033]: I0226 17:06:19.645451 5033 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-sv6dz"
Feb 26 17:06:19 crc kubenswrapper[5033]: I0226 17:06:19.696108 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sv6dz"
Feb 26 17:06:20 crc kubenswrapper[5033]: I0226 17:06:20.384190 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sv6dz"
Feb 26 17:06:20 crc kubenswrapper[5033]: I0226 17:06:20.443569 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sv6dz"]
Feb 26 17:06:22 crc kubenswrapper[5033]: I0226 17:06:22.350134 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sv6dz" podUID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" containerName="registry-server" containerID="cri-o://5c1546715abeeb8e7013611f85d11a1c69cf8114702f6db2b4fc7ab757918d81" gracePeriod=2
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.380423 5033 generic.go:334] "Generic (PLEG): container finished" podID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" containerID="5c1546715abeeb8e7013611f85d11a1c69cf8114702f6db2b4fc7ab757918d81" exitCode=0
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.381116 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv6dz" event={"ID":"d4f213de-ff45-4cf7-831a-6be494b3a9e0","Type":"ContainerDied","Data":"5c1546715abeeb8e7013611f85d11a1c69cf8114702f6db2b4fc7ab757918d81"}
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.616739 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sv6dz"
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.732316 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-utilities\") pod \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") "
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.732395 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnlsh\" (UniqueName: \"kubernetes.io/projected/d4f213de-ff45-4cf7-831a-6be494b3a9e0-kube-api-access-vnlsh\") pod \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") "
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.732560 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-catalog-content\") pod \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\" (UID: \"d4f213de-ff45-4cf7-831a-6be494b3a9e0\") "
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.734158 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-utilities" (OuterVolumeSpecName: "utilities") pod "d4f213de-ff45-4cf7-831a-6be494b3a9e0" (UID: "d4f213de-ff45-4cf7-831a-6be494b3a9e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.739659 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4f213de-ff45-4cf7-831a-6be494b3a9e0-kube-api-access-vnlsh" (OuterVolumeSpecName: "kube-api-access-vnlsh") pod "d4f213de-ff45-4cf7-831a-6be494b3a9e0" (UID: "d4f213de-ff45-4cf7-831a-6be494b3a9e0"). InnerVolumeSpecName "kube-api-access-vnlsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.795691 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4f213de-ff45-4cf7-831a-6be494b3a9e0" (UID: "d4f213de-ff45-4cf7-831a-6be494b3a9e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.833886 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.833938 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4f213de-ff45-4cf7-831a-6be494b3a9e0-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 17:06:25 crc kubenswrapper[5033]: I0226 17:06:25.833952 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnlsh\" (UniqueName: \"kubernetes.io/projected/d4f213de-ff45-4cf7-831a-6be494b3a9e0-kube-api-access-vnlsh\") on node \"crc\" DevicePath \"\""
Feb 26 17:06:26 crc kubenswrapper[5033]: I0226 17:06:26.392985 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv6dz" event={"ID":"d4f213de-ff45-4cf7-831a-6be494b3a9e0","Type":"ContainerDied","Data":"65b86ee3ae41b030f442d3c7bc8a19cc0822ce2edb313832a13bd1efab65e674"}
Feb 26 17:06:26 crc kubenswrapper[5033]: I0226 17:06:26.393089 5033 scope.go:117] "RemoveContainer" containerID="5c1546715abeeb8e7013611f85d11a1c69cf8114702f6db2b4fc7ab757918d81"
Feb 26 17:06:26 crc kubenswrapper[5033]: I0226 17:06:26.393181 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sv6dz"
Feb 26 17:06:26 crc kubenswrapper[5033]: I0226 17:06:26.434000 5033 scope.go:117] "RemoveContainer" containerID="42cea1212b33757b5323bc51b85b39a5d554ced221b49c0621f7b1b4662bd6ba"
Feb 26 17:06:26 crc kubenswrapper[5033]: I0226 17:06:26.445597 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sv6dz"]
Feb 26 17:06:26 crc kubenswrapper[5033]: I0226 17:06:26.454122 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sv6dz"]
Feb 26 17:06:26 crc kubenswrapper[5033]: I0226 17:06:26.487743 5033 scope.go:117] "RemoveContainer" containerID="0b3b72beec9a6be645cfb13ea6a1e526d7f6b4948b5495430a19ed69bbdddb59"
Feb 26 17:06:26 crc kubenswrapper[5033]: I0226 17:06:26.628666 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:06:26 crc kubenswrapper[5033]: E0226 17:06:26.628941 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:06:27 crc kubenswrapper[5033]: I0226 17:06:27.643943 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" path="/var/lib/kubelet/pods/d4f213de-ff45-4cf7-831a-6be494b3a9e0/volumes"
Feb 26 17:06:33 crc kubenswrapper[5033]: I0226 17:06:33.165080 5033 scope.go:117] "RemoveContainer" containerID="1101471bee3125f12d57880571031d5c16f06608112226021fc8b683d14b42fd"
Feb 26 17:06:37 crc kubenswrapper[5033]: I0226 17:06:37.628457 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:06:37 crc kubenswrapper[5033]: E0226 17:06:37.629330 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:06:52 crc kubenswrapper[5033]: I0226 17:06:52.627652 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:06:52 crc kubenswrapper[5033]: E0226 17:06:52.628378 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:07:04 crc kubenswrapper[5033]: I0226 17:07:04.628481 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:07:04 crc kubenswrapper[5033]: E0226 17:07:04.629410 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:07:17 crc kubenswrapper[5033]: I0226 17:07:17.627757 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:07:17 crc kubenswrapper[5033]: E0226 17:07:17.628515 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:07:29 crc kubenswrapper[5033]: I0226 17:07:29.628621 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:07:29 crc kubenswrapper[5033]: E0226 17:07:29.629412 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:07:43 crc kubenswrapper[5033]: I0226 17:07:43.633726 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:07:43 crc kubenswrapper[5033]: E0226 17:07:43.634910 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:07:54 crc kubenswrapper[5033]: I0226 17:07:54.628569 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:07:54 crc kubenswrapper[5033]: E0226 17:07:54.629530 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.151688 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535428-xt8nk"]
Feb 26 17:08:00 crc kubenswrapper[5033]: E0226 17:08:00.152912 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" containerName="extract-content"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.152934 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" containerName="extract-content"
Feb 26 17:08:00 crc kubenswrapper[5033]: E0226 17:08:00.152969 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" containerName="extract-utilities"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.152981 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" containerName="extract-utilities"
Feb 26 17:08:00 crc kubenswrapper[5033]: E0226 17:08:00.153011 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" containerName="registry-server"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.153024 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" containerName="registry-server"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.153309 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4f213de-ff45-4cf7-831a-6be494b3a9e0" containerName="registry-server"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.154100 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535428-xt8nk"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.156673 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.156846 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.158760 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.162657 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535428-xt8nk"]
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.201137 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmntp\" (UniqueName: \"kubernetes.io/projected/cf9eeef4-19bc-42d0-b77a-9ad03a4d674f-kube-api-access-xmntp\") pod \"auto-csr-approver-29535428-xt8nk\" (UID: \"cf9eeef4-19bc-42d0-b77a-9ad03a4d674f\") " pod="openshift-infra/auto-csr-approver-29535428-xt8nk"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.302863 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmntp\" (UniqueName: \"kubernetes.io/projected/cf9eeef4-19bc-42d0-b77a-9ad03a4d674f-kube-api-access-xmntp\") pod \"auto-csr-approver-29535428-xt8nk\" (UID: \"cf9eeef4-19bc-42d0-b77a-9ad03a4d674f\") " pod="openshift-infra/auto-csr-approver-29535428-xt8nk"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.328161 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmntp\" (UniqueName: \"kubernetes.io/projected/cf9eeef4-19bc-42d0-b77a-9ad03a4d674f-kube-api-access-xmntp\") pod \"auto-csr-approver-29535428-xt8nk\" (UID: \"cf9eeef4-19bc-42d0-b77a-9ad03a4d674f\") " pod="openshift-infra/auto-csr-approver-29535428-xt8nk"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.488429 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535428-xt8nk"
Feb 26 17:08:00 crc kubenswrapper[5033]: I0226 17:08:00.949642 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535428-xt8nk"]
Feb 26 17:08:00 crc kubenswrapper[5033]: W0226 17:08:00.951702 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf9eeef4_19bc_42d0_b77a_9ad03a4d674f.slice/crio-308d4409d5e2697696af6f4afce663c8ec6713c22eb6165174d607c80eb878a8 WatchSource:0}: Error finding container 308d4409d5e2697696af6f4afce663c8ec6713c22eb6165174d607c80eb878a8: Status 404 returned error can't find the container with id 308d4409d5e2697696af6f4afce663c8ec6713c22eb6165174d607c80eb878a8
Feb 26 17:08:01 crc kubenswrapper[5033]: I0226 17:08:01.220998 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535428-xt8nk" event={"ID":"cf9eeef4-19bc-42d0-b77a-9ad03a4d674f","Type":"ContainerStarted","Data":"308d4409d5e2697696af6f4afce663c8ec6713c22eb6165174d607c80eb878a8"}
Feb 26 17:08:03 crc kubenswrapper[5033]: I0226 17:08:03.238454 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535428-xt8nk" event={"ID":"cf9eeef4-19bc-42d0-b77a-9ad03a4d674f","Type":"ContainerStarted","Data":"070e573458cf9b62bdf811b913b376cdafa2be63e19933dd1a95143737d75222"}
Feb 26 17:08:03 crc kubenswrapper[5033]: I0226 17:08:03.253772 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535428-xt8nk" podStartSLOduration=1.458961035 podStartE2EDuration="3.253757491s" podCreationTimestamp="2026-02-26 17:08:00 +0000 UTC" firstStartedPulling="2026-02-26 17:08:00.955109681 +0000 UTC m=+5208.167708321" lastFinishedPulling="2026-02-26 17:08:02.749906117 +0000 UTC m=+5209.962504777" observedRunningTime="2026-02-26 17:08:03.252202034 +0000 UTC m=+5210.464800684" watchObservedRunningTime="2026-02-26 17:08:03.253757491 +0000 UTC m=+5210.466356141"
Feb 26 17:08:04 crc kubenswrapper[5033]: I0226 17:08:04.250146 5033 generic.go:334] "Generic (PLEG): container finished" podID="cf9eeef4-19bc-42d0-b77a-9ad03a4d674f" containerID="070e573458cf9b62bdf811b913b376cdafa2be63e19933dd1a95143737d75222" exitCode=0
Feb 26 17:08:04 crc kubenswrapper[5033]: I0226 17:08:04.250201 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535428-xt8nk" event={"ID":"cf9eeef4-19bc-42d0-b77a-9ad03a4d674f","Type":"ContainerDied","Data":"070e573458cf9b62bdf811b913b376cdafa2be63e19933dd1a95143737d75222"}
Feb 26 17:08:05 crc kubenswrapper[5033]: I0226 17:08:05.624390 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535428-xt8nk"
Feb 26 17:08:05 crc kubenswrapper[5033]: I0226 17:08:05.628424 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:08:05 crc kubenswrapper[5033]: E0226 17:08:05.628691 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:08:05 crc kubenswrapper[5033]: I0226 17:08:05.794292 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmntp\" (UniqueName: \"kubernetes.io/projected/cf9eeef4-19bc-42d0-b77a-9ad03a4d674f-kube-api-access-xmntp\") pod \"cf9eeef4-19bc-42d0-b77a-9ad03a4d674f\" (UID: \"cf9eeef4-19bc-42d0-b77a-9ad03a4d674f\") "
Feb 26 17:08:05 crc kubenswrapper[5033]: I0226 17:08:05.804416 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9eeef4-19bc-42d0-b77a-9ad03a4d674f-kube-api-access-xmntp" (OuterVolumeSpecName: "kube-api-access-xmntp") pod "cf9eeef4-19bc-42d0-b77a-9ad03a4d674f" (UID: "cf9eeef4-19bc-42d0-b77a-9ad03a4d674f"). InnerVolumeSpecName "kube-api-access-xmntp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:08:05 crc kubenswrapper[5033]: I0226 17:08:05.896925 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmntp\" (UniqueName: \"kubernetes.io/projected/cf9eeef4-19bc-42d0-b77a-9ad03a4d674f-kube-api-access-xmntp\") on node \"crc\" DevicePath \"\""
Feb 26 17:08:06 crc kubenswrapper[5033]: I0226 17:08:06.280184 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535428-xt8nk" event={"ID":"cf9eeef4-19bc-42d0-b77a-9ad03a4d674f","Type":"ContainerDied","Data":"308d4409d5e2697696af6f4afce663c8ec6713c22eb6165174d607c80eb878a8"}
Feb 26 17:08:06 crc kubenswrapper[5033]: I0226 17:08:06.280247 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="308d4409d5e2697696af6f4afce663c8ec6713c22eb6165174d607c80eb878a8"
Feb 26 17:08:06 crc kubenswrapper[5033]: I0226 17:08:06.280289 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535428-xt8nk"
Feb 26 17:08:06 crc kubenswrapper[5033]: I0226 17:08:06.345239 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535422-hnwlv"]
Feb 26 17:08:06 crc kubenswrapper[5033]: I0226 17:08:06.358419 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535422-hnwlv"]
Feb 26 17:08:07 crc kubenswrapper[5033]: I0226 17:08:07.642866 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2efbc103-b984-4067-9864-7c45a0835260" path="/var/lib/kubelet/pods/2efbc103-b984-4067-9864-7c45a0835260/volumes"
Feb 26 17:08:16 crc kubenswrapper[5033]: I0226 17:08:16.628608 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:08:16 crc kubenswrapper[5033]: E0226 17:08:16.629416 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:08:28 crc kubenswrapper[5033]: I0226 17:08:28.628375 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:08:28 crc kubenswrapper[5033]: E0226 17:08:28.629436 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:08:33 crc kubenswrapper[5033]: I0226 17:08:33.264245 5033 scope.go:117] "RemoveContainer" containerID="14629a4fb769d1ccae386c715516c214346e4a5f6506293f302d975c464f0adc"
Feb 26 17:08:40 crc kubenswrapper[5033]: I0226 17:08:40.628078 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:08:40 crc kubenswrapper[5033]: E0226 17:08:40.629019 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:08:53 crc kubenswrapper[5033]: I0226 17:08:53.636032 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:08:53 crc kubenswrapper[5033]: E0226 17:08:53.636978 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:09:06 crc kubenswrapper[5033]: I0226 17:09:06.628352 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:09:06 crc kubenswrapper[5033]: E0226 17:09:06.629180 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:09:08 crc kubenswrapper[5033]: I0226 17:09:08.865353 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"]
Feb 26 17:09:08 crc kubenswrapper[5033]: E0226 17:09:08.865857 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9eeef4-19bc-42d0-b77a-9ad03a4d674f" containerName="oc"
Feb 26 17:09:08 crc kubenswrapper[5033]: I0226 17:09:08.865869 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9eeef4-19bc-42d0-b77a-9ad03a4d674f" containerName="oc"
Feb 26 17:09:08 crc kubenswrapper[5033]: I0226 17:09:08.866001 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9eeef4-19bc-42d0-b77a-9ad03a4d674f" containerName="oc"
Feb 26 17:09:08 crc kubenswrapper[5033]: I0226 17:09:08.866466 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Feb 26 17:09:08 crc kubenswrapper[5033]: I0226 17:09:08.870840 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gfz8w"
Feb 26 17:09:08 crc kubenswrapper[5033]: I0226 17:09:08.876344 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Feb 26 17:09:09 crc kubenswrapper[5033]: I0226 17:09:09.029421 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\") pod \"mariadb-copy-data\" (UID: \"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b\") " pod="openstack/mariadb-copy-data"
Feb 26 17:09:09 crc kubenswrapper[5033]: I0226 17:09:09.029565 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nblcx\" (UniqueName: \"kubernetes.io/projected/f59560e4-f5b7-4062-b7ef-c8b265d2dd0b-kube-api-access-nblcx\") pod \"mariadb-copy-data\" (UID: \"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b\") " pod="openstack/mariadb-copy-data"
Feb 26 17:09:09 crc kubenswrapper[5033]: I0226 17:09:09.130730 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nblcx\" (UniqueName: \"kubernetes.io/projected/f59560e4-f5b7-4062-b7ef-c8b265d2dd0b-kube-api-access-nblcx\") pod \"mariadb-copy-data\" (UID: \"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b\") " pod="openstack/mariadb-copy-data"
Feb 26 17:09:09 crc kubenswrapper[5033]: I0226 17:09:09.130826 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\") pod \"mariadb-copy-data\" (UID: \"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b\") " pod="openstack/mariadb-copy-data"
Feb 26 17:09:09 crc kubenswrapper[5033]: I0226 17:09:09.134007 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 17:09:09 crc kubenswrapper[5033]: I0226 17:09:09.134054 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\") pod \"mariadb-copy-data\" (UID: \"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/650295d4252ce26ba9f612791d15f328ef4ac680879d5b3e2bb0e3b63c63b1fd/globalmount\"" pod="openstack/mariadb-copy-data"
Feb 26 17:09:09 crc kubenswrapper[5033]: I0226 17:09:09.157286 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\") pod \"mariadb-copy-data\" (UID: \"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b\") " pod="openstack/mariadb-copy-data"
Feb 26 17:09:09 crc kubenswrapper[5033]: I0226 17:09:09.160133 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nblcx\" (UniqueName: \"kubernetes.io/projected/f59560e4-f5b7-4062-b7ef-c8b265d2dd0b-kube-api-access-nblcx\") pod \"mariadb-copy-data\" (UID: \"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b\") " pod="openstack/mariadb-copy-data"
Feb 26 17:09:09 crc kubenswrapper[5033]: I0226 17:09:09.196012 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Feb 26 17:09:09 crc kubenswrapper[5033]: I0226 17:09:09.719708 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Feb 26 17:09:09 crc kubenswrapper[5033]: I0226 17:09:09.873822 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b","Type":"ContainerStarted","Data":"fc2f38cfc09bf62c959173f83ac2406e92c64e9f9245634e2bc00c6f9eb5b2a0"}
Feb 26 17:09:10 crc kubenswrapper[5033]: I0226 17:09:10.883578 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b","Type":"ContainerStarted","Data":"479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178"}
Feb 26 17:09:10 crc kubenswrapper[5033]: I0226 17:09:10.902733 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.902716217 podStartE2EDuration="3.902716217s" podCreationTimestamp="2026-02-26 17:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:09:10.899786976 +0000 UTC m=+5278.112385666" watchObservedRunningTime="2026-02-26 17:09:10.902716217 +0000 UTC m=+5278.115314867"
Feb 26 17:09:13 crc kubenswrapper[5033]: I0226 17:09:13.590761 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Feb 26 17:09:13 crc kubenswrapper[5033]: I0226 17:09:13.592429 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 26 17:09:13 crc kubenswrapper[5033]: I0226 17:09:13.605809 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 26 17:09:13 crc kubenswrapper[5033]: I0226 17:09:13.694702 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8msj\" (UniqueName: \"kubernetes.io/projected/8672b698-7d27-4a6f-9d33-83c604c8ac93-kube-api-access-h8msj\") pod \"mariadb-client\" (UID: \"8672b698-7d27-4a6f-9d33-83c604c8ac93\") " pod="openstack/mariadb-client"
Feb 26 17:09:13 crc kubenswrapper[5033]: I0226 17:09:13.796051 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8msj\" (UniqueName: \"kubernetes.io/projected/8672b698-7d27-4a6f-9d33-83c604c8ac93-kube-api-access-h8msj\") pod \"mariadb-client\" (UID: \"8672b698-7d27-4a6f-9d33-83c604c8ac93\") " pod="openstack/mariadb-client"
Feb 26 17:09:13 crc kubenswrapper[5033]: I0226 17:09:13.817460 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8msj\" (UniqueName: \"kubernetes.io/projected/8672b698-7d27-4a6f-9d33-83c604c8ac93-kube-api-access-h8msj\") pod \"mariadb-client\" (UID: \"8672b698-7d27-4a6f-9d33-83c604c8ac93\") " pod="openstack/mariadb-client"
Feb 26 17:09:13 crc kubenswrapper[5033]: I0226 17:09:13.912294 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 26 17:09:14 crc kubenswrapper[5033]: I0226 17:09:14.323802 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 26 17:09:14 crc kubenswrapper[5033]: W0226 17:09:14.329748 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8672b698_7d27_4a6f_9d33_83c604c8ac93.slice/crio-d5bfee21dbc2fd6fe366c6a80d845d1f0d5d46e2ef47ecbd46da6ae887163178 WatchSource:0}: Error finding container d5bfee21dbc2fd6fe366c6a80d845d1f0d5d46e2ef47ecbd46da6ae887163178: Status 404 returned error can't find the container with id d5bfee21dbc2fd6fe366c6a80d845d1f0d5d46e2ef47ecbd46da6ae887163178
Feb 26 17:09:14 crc kubenswrapper[5033]: I0226 17:09:14.919172 5033 generic.go:334] "Generic (PLEG): container finished" podID="8672b698-7d27-4a6f-9d33-83c604c8ac93" containerID="eeffbfcc3bfc323ab786e3eb99a82574482bc9f6f05b168523e4837b23bfb3a0" exitCode=0
Feb 26 17:09:14 crc kubenswrapper[5033]: I0226 17:09:14.919221 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"8672b698-7d27-4a6f-9d33-83c604c8ac93","Type":"ContainerDied","Data":"eeffbfcc3bfc323ab786e3eb99a82574482bc9f6f05b168523e4837b23bfb3a0"}
Feb 26 17:09:14 crc kubenswrapper[5033]: I0226 17:09:14.919274 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"8672b698-7d27-4a6f-9d33-83c604c8ac93","Type":"ContainerStarted","Data":"d5bfee21dbc2fd6fe366c6a80d845d1f0d5d46e2ef47ecbd46da6ae887163178"}
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.202465 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.238077 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_8672b698-7d27-4a6f-9d33-83c604c8ac93/mariadb-client/0.log"
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.265131 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.275042 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.346572 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8msj\" (UniqueName: \"kubernetes.io/projected/8672b698-7d27-4a6f-9d33-83c604c8ac93-kube-api-access-h8msj\") pod \"8672b698-7d27-4a6f-9d33-83c604c8ac93\" (UID: \"8672b698-7d27-4a6f-9d33-83c604c8ac93\") "
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.351803 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8672b698-7d27-4a6f-9d33-83c604c8ac93-kube-api-access-h8msj" (OuterVolumeSpecName: "kube-api-access-h8msj") pod "8672b698-7d27-4a6f-9d33-83c604c8ac93" (UID: "8672b698-7d27-4a6f-9d33-83c604c8ac93"). InnerVolumeSpecName "kube-api-access-h8msj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.407871 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Feb 26 17:09:16 crc kubenswrapper[5033]: E0226 17:09:16.408418 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8672b698-7d27-4a6f-9d33-83c604c8ac93" containerName="mariadb-client"
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.408501 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8672b698-7d27-4a6f-9d33-83c604c8ac93" containerName="mariadb-client"
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.408765 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8672b698-7d27-4a6f-9d33-83c604c8ac93" containerName="mariadb-client"
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.409521 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.417465 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.448873 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8msj\" (UniqueName: \"kubernetes.io/projected/8672b698-7d27-4a6f-9d33-83c604c8ac93-kube-api-access-h8msj\") on node \"crc\" DevicePath \"\""
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.550810 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrv7k\" (UniqueName: \"kubernetes.io/projected/f4cc9a14-1284-4c21-938d-ea518dabec18-kube-api-access-xrv7k\") pod \"mariadb-client\" (UID: \"f4cc9a14-1284-4c21-938d-ea518dabec18\") " pod="openstack/mariadb-client"
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.652999 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrv7k\" (UniqueName: \"kubernetes.io/projected/f4cc9a14-1284-4c21-938d-ea518dabec18-kube-api-access-xrv7k\") pod \"mariadb-client\" (UID: \"f4cc9a14-1284-4c21-938d-ea518dabec18\") " pod="openstack/mariadb-client"
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.672667 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrv7k\" (UniqueName: \"kubernetes.io/projected/f4cc9a14-1284-4c21-938d-ea518dabec18-kube-api-access-xrv7k\") pod \"mariadb-client\" (UID: \"f4cc9a14-1284-4c21-938d-ea518dabec18\") " pod="openstack/mariadb-client"
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.782107 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.935772 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5bfee21dbc2fd6fe366c6a80d845d1f0d5d46e2ef47ecbd46da6ae887163178"
Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.935841 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 26 17:09:16 crc kubenswrapper[5033]: I0226 17:09:16.951349 5033 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="8672b698-7d27-4a6f-9d33-83c604c8ac93" podUID="f4cc9a14-1284-4c21-938d-ea518dabec18" Feb 26 17:09:17 crc kubenswrapper[5033]: I0226 17:09:17.191131 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 26 17:09:17 crc kubenswrapper[5033]: W0226 17:09:17.195043 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4cc9a14_1284_4c21_938d_ea518dabec18.slice/crio-618701d7a75e4186af04578c4f9eba184c7bf88d963f91eff21d6179d295b046 WatchSource:0}: Error finding container 618701d7a75e4186af04578c4f9eba184c7bf88d963f91eff21d6179d295b046: Status 404 returned error can't find the container with id 618701d7a75e4186af04578c4f9eba184c7bf88d963f91eff21d6179d295b046 Feb 26 17:09:17 crc kubenswrapper[5033]: I0226 17:09:17.638706 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8672b698-7d27-4a6f-9d33-83c604c8ac93" path="/var/lib/kubelet/pods/8672b698-7d27-4a6f-9d33-83c604c8ac93/volumes" Feb 26 17:09:17 crc kubenswrapper[5033]: I0226 17:09:17.944628 5033 generic.go:334] "Generic (PLEG): container finished" podID="f4cc9a14-1284-4c21-938d-ea518dabec18" containerID="38d3e998ce7ce81ac8647b28e89125fd456ef51e1613438e779dacac30ad7579" exitCode=0 Feb 26 17:09:17 crc kubenswrapper[5033]: I0226 17:09:17.944678 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f4cc9a14-1284-4c21-938d-ea518dabec18","Type":"ContainerDied","Data":"38d3e998ce7ce81ac8647b28e89125fd456ef51e1613438e779dacac30ad7579"} Feb 26 17:09:17 crc kubenswrapper[5033]: I0226 17:09:17.944708 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"f4cc9a14-1284-4c21-938d-ea518dabec18","Type":"ContainerStarted","Data":"618701d7a75e4186af04578c4f9eba184c7bf88d963f91eff21d6179d295b046"} Feb 26 17:09:19 crc kubenswrapper[5033]: I0226 17:09:19.247039 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 26 17:09:19 crc kubenswrapper[5033]: I0226 17:09:19.264565 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_f4cc9a14-1284-4c21-938d-ea518dabec18/mariadb-client/0.log" Feb 26 17:09:19 crc kubenswrapper[5033]: I0226 17:09:19.291495 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 26 17:09:19 crc kubenswrapper[5033]: I0226 17:09:19.297894 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 26 17:09:19 crc kubenswrapper[5033]: I0226 17:09:19.390995 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrv7k\" (UniqueName: \"kubernetes.io/projected/f4cc9a14-1284-4c21-938d-ea518dabec18-kube-api-access-xrv7k\") pod \"f4cc9a14-1284-4c21-938d-ea518dabec18\" (UID: \"f4cc9a14-1284-4c21-938d-ea518dabec18\") " Feb 26 17:09:19 crc kubenswrapper[5033]: I0226 17:09:19.397041 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4cc9a14-1284-4c21-938d-ea518dabec18-kube-api-access-xrv7k" (OuterVolumeSpecName: "kube-api-access-xrv7k") pod "f4cc9a14-1284-4c21-938d-ea518dabec18" (UID: "f4cc9a14-1284-4c21-938d-ea518dabec18"). InnerVolumeSpecName "kube-api-access-xrv7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:09:19 crc kubenswrapper[5033]: I0226 17:09:19.492318 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrv7k\" (UniqueName: \"kubernetes.io/projected/f4cc9a14-1284-4c21-938d-ea518dabec18-kube-api-access-xrv7k\") on node \"crc\" DevicePath \"\"" Feb 26 17:09:19 crc kubenswrapper[5033]: I0226 17:09:19.630823 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" Feb 26 17:09:19 crc kubenswrapper[5033]: E0226 17:09:19.631100 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:09:19 crc kubenswrapper[5033]: I0226 17:09:19.640654 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4cc9a14-1284-4c21-938d-ea518dabec18" path="/var/lib/kubelet/pods/f4cc9a14-1284-4c21-938d-ea518dabec18/volumes" Feb 26 17:09:19 crc kubenswrapper[5033]: I0226 17:09:19.960606 5033 scope.go:117] "RemoveContainer" containerID="38d3e998ce7ce81ac8647b28e89125fd456ef51e1613438e779dacac30ad7579" Feb 26 17:09:19 crc kubenswrapper[5033]: I0226 17:09:19.960658 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 26 17:09:31 crc kubenswrapper[5033]: I0226 17:09:31.628918 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" Feb 26 17:09:31 crc kubenswrapper[5033]: E0226 17:09:31.629704 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:09:33 crc kubenswrapper[5033]: I0226 17:09:33.333490 5033 scope.go:117] "RemoveContainer" containerID="80c14f1f25bbe34f46b6243330b063304f0df8c4565f56cda801289e372c2018" Feb 26 17:09:43 crc kubenswrapper[5033]: I0226 17:09:43.632961 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" Feb 26 17:09:43 crc kubenswrapper[5033]: E0226 17:09:43.633811 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:09:54 crc kubenswrapper[5033]: I0226 17:09:54.628318 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" Feb 26 17:09:54 crc kubenswrapper[5033]: E0226 17:09:54.629621 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.146130 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535430-bpg4t"] Feb 26 17:10:00 crc kubenswrapper[5033]: E0226 17:10:00.147739 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4cc9a14-1284-4c21-938d-ea518dabec18" containerName="mariadb-client" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.147758 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4cc9a14-1284-4c21-938d-ea518dabec18" containerName="mariadb-client" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.147936 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4cc9a14-1284-4c21-938d-ea518dabec18" containerName="mariadb-client" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.148547 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535430-bpg4t" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.154670 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.155020 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.155588 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.157646 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535430-bpg4t"] Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.267097 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttmsg\" (UniqueName: \"kubernetes.io/projected/651e60e1-bc9f-4872-99e2-26dc39415a1d-kube-api-access-ttmsg\") pod \"auto-csr-approver-29535430-bpg4t\" (UID: \"651e60e1-bc9f-4872-99e2-26dc39415a1d\") " pod="openshift-infra/auto-csr-approver-29535430-bpg4t" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.368410 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttmsg\" (UniqueName: \"kubernetes.io/projected/651e60e1-bc9f-4872-99e2-26dc39415a1d-kube-api-access-ttmsg\") pod \"auto-csr-approver-29535430-bpg4t\" (UID: \"651e60e1-bc9f-4872-99e2-26dc39415a1d\") " pod="openshift-infra/auto-csr-approver-29535430-bpg4t" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.393031 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttmsg\" (UniqueName: \"kubernetes.io/projected/651e60e1-bc9f-4872-99e2-26dc39415a1d-kube-api-access-ttmsg\") pod \"auto-csr-approver-29535430-bpg4t\" (UID: \"651e60e1-bc9f-4872-99e2-26dc39415a1d\") " 
pod="openshift-infra/auto-csr-approver-29535430-bpg4t" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.469212 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535430-bpg4t" Feb 26 17:10:00 crc kubenswrapper[5033]: I0226 17:10:00.900679 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535430-bpg4t"] Feb 26 17:10:01 crc kubenswrapper[5033]: I0226 17:10:01.326287 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535430-bpg4t" event={"ID":"651e60e1-bc9f-4872-99e2-26dc39415a1d","Type":"ContainerStarted","Data":"cc81707e0ea94a6ce18c06419adc8a4960579de7116a13ac580b8b5fcbb934fc"} Feb 26 17:10:02 crc kubenswrapper[5033]: I0226 17:10:02.335334 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535430-bpg4t" event={"ID":"651e60e1-bc9f-4872-99e2-26dc39415a1d","Type":"ContainerStarted","Data":"6fa9bf29231196d557d678cec29c4291befd76f4de98236c5d6a8103238724a6"} Feb 26 17:10:02 crc kubenswrapper[5033]: I0226 17:10:02.353954 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535430-bpg4t" podStartSLOduration=1.227326221 podStartE2EDuration="2.353934712s" podCreationTimestamp="2026-02-26 17:10:00 +0000 UTC" firstStartedPulling="2026-02-26 17:10:00.907323732 +0000 UTC m=+5328.119922382" lastFinishedPulling="2026-02-26 17:10:02.033932223 +0000 UTC m=+5329.246530873" observedRunningTime="2026-02-26 17:10:02.350843526 +0000 UTC m=+5329.563442186" watchObservedRunningTime="2026-02-26 17:10:02.353934712 +0000 UTC m=+5329.566533372" Feb 26 17:10:03 crc kubenswrapper[5033]: I0226 17:10:03.344966 5033 generic.go:334] "Generic (PLEG): container finished" podID="651e60e1-bc9f-4872-99e2-26dc39415a1d" containerID="6fa9bf29231196d557d678cec29c4291befd76f4de98236c5d6a8103238724a6" exitCode=0 Feb 26 17:10:03 crc 
kubenswrapper[5033]: I0226 17:10:03.345013 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535430-bpg4t" event={"ID":"651e60e1-bc9f-4872-99e2-26dc39415a1d","Type":"ContainerDied","Data":"6fa9bf29231196d557d678cec29c4291befd76f4de98236c5d6a8103238724a6"} Feb 26 17:10:04 crc kubenswrapper[5033]: I0226 17:10:04.639146 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535430-bpg4t" Feb 26 17:10:04 crc kubenswrapper[5033]: I0226 17:10:04.836008 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttmsg\" (UniqueName: \"kubernetes.io/projected/651e60e1-bc9f-4872-99e2-26dc39415a1d-kube-api-access-ttmsg\") pod \"651e60e1-bc9f-4872-99e2-26dc39415a1d\" (UID: \"651e60e1-bc9f-4872-99e2-26dc39415a1d\") " Feb 26 17:10:04 crc kubenswrapper[5033]: I0226 17:10:04.842368 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651e60e1-bc9f-4872-99e2-26dc39415a1d-kube-api-access-ttmsg" (OuterVolumeSpecName: "kube-api-access-ttmsg") pod "651e60e1-bc9f-4872-99e2-26dc39415a1d" (UID: "651e60e1-bc9f-4872-99e2-26dc39415a1d"). InnerVolumeSpecName "kube-api-access-ttmsg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:10:04 crc kubenswrapper[5033]: I0226 17:10:04.937984 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttmsg\" (UniqueName: \"kubernetes.io/projected/651e60e1-bc9f-4872-99e2-26dc39415a1d-kube-api-access-ttmsg\") on node \"crc\" DevicePath \"\"" Feb 26 17:10:05 crc kubenswrapper[5033]: I0226 17:10:05.364243 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535430-bpg4t" event={"ID":"651e60e1-bc9f-4872-99e2-26dc39415a1d","Type":"ContainerDied","Data":"cc81707e0ea94a6ce18c06419adc8a4960579de7116a13ac580b8b5fcbb934fc"} Feb 26 17:10:05 crc kubenswrapper[5033]: I0226 17:10:05.364336 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc81707e0ea94a6ce18c06419adc8a4960579de7116a13ac580b8b5fcbb934fc" Feb 26 17:10:05 crc kubenswrapper[5033]: I0226 17:10:05.364338 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535430-bpg4t" Feb 26 17:10:05 crc kubenswrapper[5033]: I0226 17:10:05.437900 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535424-7vbs9"] Feb 26 17:10:05 crc kubenswrapper[5033]: I0226 17:10:05.446187 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535424-7vbs9"] Feb 26 17:10:05 crc kubenswrapper[5033]: I0226 17:10:05.642443 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be6a1707-3db0-4a1d-9388-70fd900dfd74" path="/var/lib/kubelet/pods/be6a1707-3db0-4a1d-9388-70fd900dfd74/volumes" Feb 26 17:10:09 crc kubenswrapper[5033]: I0226 17:10:09.629313 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" Feb 26 17:10:09 crc kubenswrapper[5033]: E0226 17:10:09.630154 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.554908 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qn9cg"] Feb 26 17:10:15 crc kubenswrapper[5033]: E0226 17:10:15.556540 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651e60e1-bc9f-4872-99e2-26dc39415a1d" containerName="oc" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.556572 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="651e60e1-bc9f-4872-99e2-26dc39415a1d" containerName="oc" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.556979 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="651e60e1-bc9f-4872-99e2-26dc39415a1d" containerName="oc" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.559694 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.572611 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qn9cg"] Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.608988 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-catalog-content\") pod \"redhat-operators-qn9cg\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.609039 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-utilities\") pod \"redhat-operators-qn9cg\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.609070 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5km89\" (UniqueName: \"kubernetes.io/projected/dbb59d3c-be69-4d14-b493-533dab8e6232-kube-api-access-5km89\") pod \"redhat-operators-qn9cg\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.709913 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-catalog-content\") pod \"redhat-operators-qn9cg\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.709970 5033 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-utilities\") pod \"redhat-operators-qn9cg\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.710009 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5km89\" (UniqueName: \"kubernetes.io/projected/dbb59d3c-be69-4d14-b493-533dab8e6232-kube-api-access-5km89\") pod \"redhat-operators-qn9cg\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.710467 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-catalog-content\") pod \"redhat-operators-qn9cg\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.710512 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-utilities\") pod \"redhat-operators-qn9cg\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.728080 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5km89\" (UniqueName: \"kubernetes.io/projected/dbb59d3c-be69-4d14-b493-533dab8e6232-kube-api-access-5km89\") pod \"redhat-operators-qn9cg\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:15 crc kubenswrapper[5033]: I0226 17:10:15.893426 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:16 crc kubenswrapper[5033]: I0226 17:10:16.356597 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qn9cg"] Feb 26 17:10:16 crc kubenswrapper[5033]: I0226 17:10:16.458023 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn9cg" event={"ID":"dbb59d3c-be69-4d14-b493-533dab8e6232","Type":"ContainerStarted","Data":"c8946e9bfe412da19fc67dd6ca4c8bec168c3d4aedcb604608b39d8d33f19fcf"} Feb 26 17:10:17 crc kubenswrapper[5033]: I0226 17:10:17.467373 5033 generic.go:334] "Generic (PLEG): container finished" podID="dbb59d3c-be69-4d14-b493-533dab8e6232" containerID="ef2308f1bbbb7b60a55051270957b5bbbf828bc6872787e5cf54847a751425bb" exitCode=0 Feb 26 17:10:17 crc kubenswrapper[5033]: I0226 17:10:17.467439 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn9cg" event={"ID":"dbb59d3c-be69-4d14-b493-533dab8e6232","Type":"ContainerDied","Data":"ef2308f1bbbb7b60a55051270957b5bbbf828bc6872787e5cf54847a751425bb"} Feb 26 17:10:18 crc kubenswrapper[5033]: I0226 17:10:18.478509 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn9cg" event={"ID":"dbb59d3c-be69-4d14-b493-533dab8e6232","Type":"ContainerStarted","Data":"35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5"} Feb 26 17:10:19 crc kubenswrapper[5033]: I0226 17:10:19.490397 5033 generic.go:334] "Generic (PLEG): container finished" podID="dbb59d3c-be69-4d14-b493-533dab8e6232" containerID="35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5" exitCode=0 Feb 26 17:10:19 crc kubenswrapper[5033]: I0226 17:10:19.490479 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn9cg" 
event={"ID":"dbb59d3c-be69-4d14-b493-533dab8e6232","Type":"ContainerDied","Data":"35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5"} Feb 26 17:10:21 crc kubenswrapper[5033]: I0226 17:10:21.505675 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn9cg" event={"ID":"dbb59d3c-be69-4d14-b493-533dab8e6232","Type":"ContainerStarted","Data":"d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b"} Feb 26 17:10:21 crc kubenswrapper[5033]: I0226 17:10:21.525676 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qn9cg" podStartSLOduration=3.084313099 podStartE2EDuration="6.525656192s" podCreationTimestamp="2026-02-26 17:10:15 +0000 UTC" firstStartedPulling="2026-02-26 17:10:17.4700138 +0000 UTC m=+5344.682612470" lastFinishedPulling="2026-02-26 17:10:20.911356913 +0000 UTC m=+5348.123955563" observedRunningTime="2026-02-26 17:10:21.520529907 +0000 UTC m=+5348.733128577" watchObservedRunningTime="2026-02-26 17:10:21.525656192 +0000 UTC m=+5348.738254842" Feb 26 17:10:22 crc kubenswrapper[5033]: I0226 17:10:22.628961 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4" Feb 26 17:10:23 crc kubenswrapper[5033]: I0226 17:10:23.522866 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"e12b311251615971bfc352b62c31819aa1843ba8a719abffbebfc499b24d9582"} Feb 26 17:10:25 crc kubenswrapper[5033]: I0226 17:10:25.894347 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:25 crc kubenswrapper[5033]: I0226 17:10:25.894940 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 
17:10:26 crc kubenswrapper[5033]: I0226 17:10:26.937155 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qn9cg" podUID="dbb59d3c-be69-4d14-b493-533dab8e6232" containerName="registry-server" probeResult="failure" output=< Feb 26 17:10:26 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Feb 26 17:10:26 crc kubenswrapper[5033]: > Feb 26 17:10:33 crc kubenswrapper[5033]: I0226 17:10:33.397552 5033 scope.go:117] "RemoveContainer" containerID="783534fda15e65c018374eab55b4103077d32bf88711eeec080a86edb87768d0" Feb 26 17:10:35 crc kubenswrapper[5033]: I0226 17:10:35.944702 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:36 crc kubenswrapper[5033]: I0226 17:10:36.021441 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:36 crc kubenswrapper[5033]: I0226 17:10:36.191508 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qn9cg"] Feb 26 17:10:37 crc kubenswrapper[5033]: I0226 17:10:37.644230 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qn9cg" podUID="dbb59d3c-be69-4d14-b493-533dab8e6232" containerName="registry-server" containerID="cri-o://d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b" gracePeriod=2 Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.068436 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.186104 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-catalog-content\") pod \"dbb59d3c-be69-4d14-b493-533dab8e6232\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.186212 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-utilities\") pod \"dbb59d3c-be69-4d14-b493-533dab8e6232\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.186414 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5km89\" (UniqueName: \"kubernetes.io/projected/dbb59d3c-be69-4d14-b493-533dab8e6232-kube-api-access-5km89\") pod \"dbb59d3c-be69-4d14-b493-533dab8e6232\" (UID: \"dbb59d3c-be69-4d14-b493-533dab8e6232\") " Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.187488 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-utilities" (OuterVolumeSpecName: "utilities") pod "dbb59d3c-be69-4d14-b493-533dab8e6232" (UID: "dbb59d3c-be69-4d14-b493-533dab8e6232"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.192282 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb59d3c-be69-4d14-b493-533dab8e6232-kube-api-access-5km89" (OuterVolumeSpecName: "kube-api-access-5km89") pod "dbb59d3c-be69-4d14-b493-533dab8e6232" (UID: "dbb59d3c-be69-4d14-b493-533dab8e6232"). InnerVolumeSpecName "kube-api-access-5km89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.291458 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5km89\" (UniqueName: \"kubernetes.io/projected/dbb59d3c-be69-4d14-b493-533dab8e6232-kube-api-access-5km89\") on node \"crc\" DevicePath \"\"" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.291491 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.332756 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbb59d3c-be69-4d14-b493-533dab8e6232" (UID: "dbb59d3c-be69-4d14-b493-533dab8e6232"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.394834 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbb59d3c-be69-4d14-b493-533dab8e6232-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.653504 5033 generic.go:334] "Generic (PLEG): container finished" podID="dbb59d3c-be69-4d14-b493-533dab8e6232" containerID="d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b" exitCode=0 Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.653565 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn9cg" event={"ID":"dbb59d3c-be69-4d14-b493-533dab8e6232","Type":"ContainerDied","Data":"d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b"} Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.653583 5033 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qn9cg" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.653600 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn9cg" event={"ID":"dbb59d3c-be69-4d14-b493-533dab8e6232","Type":"ContainerDied","Data":"c8946e9bfe412da19fc67dd6ca4c8bec168c3d4aedcb604608b39d8d33f19fcf"} Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.653625 5033 scope.go:117] "RemoveContainer" containerID="d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.737004 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qn9cg"] Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.746208 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qn9cg"] Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.812238 5033 scope.go:117] "RemoveContainer" containerID="35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.847979 5033 scope.go:117] "RemoveContainer" containerID="ef2308f1bbbb7b60a55051270957b5bbbf828bc6872787e5cf54847a751425bb" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.891427 5033 scope.go:117] "RemoveContainer" containerID="d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b" Feb 26 17:10:38 crc kubenswrapper[5033]: E0226 17:10:38.894893 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b\": container with ID starting with d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b not found: ID does not exist" containerID="d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.894944 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b"} err="failed to get container status \"d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b\": rpc error: code = NotFound desc = could not find container \"d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b\": container with ID starting with d94b47abf50293f56e4b937e92117b98d034a86978edfcd214aeae808980734b not found: ID does not exist" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.894978 5033 scope.go:117] "RemoveContainer" containerID="35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5" Feb 26 17:10:38 crc kubenswrapper[5033]: E0226 17:10:38.896057 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5\": container with ID starting with 35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5 not found: ID does not exist" containerID="35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.896078 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5"} err="failed to get container status \"35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5\": rpc error: code = NotFound desc = could not find container \"35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5\": container with ID starting with 35d3ac215e49e70842d56940a1764995a0006399cad3dbb277e65a2f3c583ce5 not found: ID does not exist" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.896095 5033 scope.go:117] "RemoveContainer" containerID="ef2308f1bbbb7b60a55051270957b5bbbf828bc6872787e5cf54847a751425bb" Feb 26 17:10:38 crc kubenswrapper[5033]: E0226 
17:10:38.899362 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2308f1bbbb7b60a55051270957b5bbbf828bc6872787e5cf54847a751425bb\": container with ID starting with ef2308f1bbbb7b60a55051270957b5bbbf828bc6872787e5cf54847a751425bb not found: ID does not exist" containerID="ef2308f1bbbb7b60a55051270957b5bbbf828bc6872787e5cf54847a751425bb" Feb 26 17:10:38 crc kubenswrapper[5033]: I0226 17:10:38.899395 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2308f1bbbb7b60a55051270957b5bbbf828bc6872787e5cf54847a751425bb"} err="failed to get container status \"ef2308f1bbbb7b60a55051270957b5bbbf828bc6872787e5cf54847a751425bb\": rpc error: code = NotFound desc = could not find container \"ef2308f1bbbb7b60a55051270957b5bbbf828bc6872787e5cf54847a751425bb\": container with ID starting with ef2308f1bbbb7b60a55051270957b5bbbf828bc6872787e5cf54847a751425bb not found: ID does not exist" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.638595 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbb59d3c-be69-4d14-b493-533dab8e6232" path="/var/lib/kubelet/pods/dbb59d3c-be69-4d14-b493-533dab8e6232/volumes" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.644642 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 17:10:39 crc kubenswrapper[5033]: E0226 17:10:39.644961 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb59d3c-be69-4d14-b493-533dab8e6232" containerName="extract-utilities" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.644982 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb59d3c-be69-4d14-b493-533dab8e6232" containerName="extract-utilities" Feb 26 17:10:39 crc kubenswrapper[5033]: E0226 17:10:39.645012 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb59d3c-be69-4d14-b493-533dab8e6232" 
containerName="registry-server" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.645021 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb59d3c-be69-4d14-b493-533dab8e6232" containerName="registry-server" Feb 26 17:10:39 crc kubenswrapper[5033]: E0226 17:10:39.645039 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb59d3c-be69-4d14-b493-533dab8e6232" containerName="extract-content" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.645047 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb59d3c-be69-4d14-b493-533dab8e6232" containerName="extract-content" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.645207 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb59d3c-be69-4d14-b493-533dab8e6232" containerName="registry-server" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.646148 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.647950 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-tqkf7" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.649085 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.649158 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.665768 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.673437 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.675611 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.685865 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.687074 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.711231 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.718359 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818339 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818434 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ec70e74e-82af-4e84-88c4-9436180cb8d2-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818468 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9dd4f437-f594-443f-a270-1e9f3f1b45c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9dd4f437-f594-443f-a270-1e9f3f1b45c0\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818496 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bd448d61-5688-446c-9c67-faefdbf2aee5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bd448d61-5688-446c-9c67-faefdbf2aee5\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818525 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-config\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818547 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/095f5cb5-16b7-44e2-924c-77c2730118f0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818567 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f5cb5-16b7-44e2-924c-77c2730118f0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818583 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec70e74e-82af-4e84-88c4-9436180cb8d2-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818636 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-e11457b2-2ff1-4c35-8b61-7eecf793aec2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e11457b2-2ff1-4c35-8b61-7eecf793aec2\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818659 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5vwj\" (UniqueName: \"kubernetes.io/projected/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-kube-api-access-c5vwj\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818675 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwxvv\" (UniqueName: \"kubernetes.io/projected/ec70e74e-82af-4e84-88c4-9436180cb8d2-kube-api-access-pwxvv\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818695 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818806 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec70e74e-82af-4e84-88c4-9436180cb8d2-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818832 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/095f5cb5-16b7-44e2-924c-77c2730118f0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818856 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095f5cb5-16b7-44e2-924c-77c2730118f0-config\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818891 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818937 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2snz9\" (UniqueName: \"kubernetes.io/projected/095f5cb5-16b7-44e2-924c-77c2730118f0-kube-api-access-2snz9\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.818975 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec70e74e-82af-4e84-88c4-9436180cb8d2-config\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.871555 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.873148 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.876308 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.876396 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-wnlsm" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.876525 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.886892 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.900633 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.902440 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.919211 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.920473 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921510 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ec70e74e-82af-4e84-88c4-9436180cb8d2-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921539 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9dd4f437-f594-443f-a270-1e9f3f1b45c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9dd4f437-f594-443f-a270-1e9f3f1b45c0\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921559 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bd448d61-5688-446c-9c67-faefdbf2aee5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bd448d61-5688-446c-9c67-faefdbf2aee5\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921578 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-config\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921593 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/095f5cb5-16b7-44e2-924c-77c2730118f0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921607 
5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f5cb5-16b7-44e2-924c-77c2730118f0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921621 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec70e74e-82af-4e84-88c4-9436180cb8d2-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921651 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e11457b2-2ff1-4c35-8b61-7eecf793aec2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e11457b2-2ff1-4c35-8b61-7eecf793aec2\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921669 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5vwj\" (UniqueName: \"kubernetes.io/projected/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-kube-api-access-c5vwj\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921687 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwxvv\" (UniqueName: \"kubernetes.io/projected/ec70e74e-82af-4e84-88c4-9436180cb8d2-kube-api-access-pwxvv\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921703 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921727 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec70e74e-82af-4e84-88c4-9436180cb8d2-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921741 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/095f5cb5-16b7-44e2-924c-77c2730118f0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921758 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095f5cb5-16b7-44e2-924c-77c2730118f0-config\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921772 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921800 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2snz9\" (UniqueName: \"kubernetes.io/projected/095f5cb5-16b7-44e2-924c-77c2730118f0-kube-api-access-2snz9\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " 
pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921865 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec70e74e-82af-4e84-88c4-9436180cb8d2-config\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.921888 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.922941 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/095f5cb5-16b7-44e2-924c-77c2730118f0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.924381 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/095f5cb5-16b7-44e2-924c-77c2730118f0-config\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.924398 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/095f5cb5-16b7-44e2-924c-77c2730118f0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.924710 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-config\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.924947 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.926355 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ec70e74e-82af-4e84-88c4-9436180cb8d2-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.926858 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.927094 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.927119 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bd448d61-5688-446c-9c67-faefdbf2aee5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bd448d61-5688-446c-9c67-faefdbf2aee5\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7d7f16659c7b65e364f058428edf8e42fe5a526781f6481a589ce5c60a59dda3/globalmount\"" pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.927166 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.927190 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9dd4f437-f594-443f-a270-1e9f3f1b45c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9dd4f437-f594-443f-a270-1e9f3f1b45c0\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5374590537d07dccfcd8a6a3babed56357cd8cebd630b80ec00bfe54afcc2d13/globalmount\"" pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.927880 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec70e74e-82af-4e84-88c4-9436180cb8d2-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.927913 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec70e74e-82af-4e84-88c4-9436180cb8d2-config\") pod \"ovsdbserver-nb-2\" (UID: 
\"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.928102 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.928128 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e11457b2-2ff1-4c35-8b61-7eecf793aec2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e11457b2-2ff1-4c35-8b61-7eecf793aec2\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1ed0d30526df56a30930079a9499512c6e047de548a0e44ee6213bed664cdcf6/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.931080 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.931155 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec70e74e-82af-4e84-88c4-9436180cb8d2-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.936961 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f5cb5-16b7-44e2-924c-77c2730118f0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.937677 
5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.946313 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.953766 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5vwj\" (UniqueName: \"kubernetes.io/projected/b99b5db1-b403-4ae0-bd38-d8ce5b410ddc-kube-api-access-c5vwj\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.954347 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwxvv\" (UniqueName: \"kubernetes.io/projected/ec70e74e-82af-4e84-88c4-9436180cb8d2-kube-api-access-pwxvv\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.956039 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2snz9\" (UniqueName: \"kubernetes.io/projected/095f5cb5-16b7-44e2-924c-77c2730118f0-kube-api-access-2snz9\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.979199 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e11457b2-2ff1-4c35-8b61-7eecf793aec2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e11457b2-2ff1-4c35-8b61-7eecf793aec2\") pod \"ovsdbserver-nb-0\" (UID: \"095f5cb5-16b7-44e2-924c-77c2730118f0\") " pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.984371 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9dd4f437-f594-443f-a270-1e9f3f1b45c0\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9dd4f437-f594-443f-a270-1e9f3f1b45c0\") pod \"ovsdbserver-nb-1\" (UID: \"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc\") " pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.985375 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bd448d61-5688-446c-9c67-faefdbf2aee5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bd448d61-5688-446c-9c67-faefdbf2aee5\") pod \"ovsdbserver-nb-2\" (UID: \"ec70e74e-82af-4e84-88c4-9436180cb8d2\") " pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:39 crc kubenswrapper[5033]: I0226 17:10:39.999702 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.016137 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.022940 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncqtm\" (UniqueName: \"kubernetes.io/projected/e42ca2a0-6c30-408e-a49a-9f52d241b74a-kube-api-access-ncqtm\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.022984 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e42ca2a0-6c30-408e-a49a-9f52d241b74a-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023016 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c76bfef9-84e4-470b-a497-9855ac4eec8b-scripts\") pod 
\"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023133 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eed185d3-8452-4f44-a7d9-ab8678e36e37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed185d3-8452-4f44-a7d9-ab8678e36e37\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023166 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42ca2a0-6c30-408e-a49a-9f52d241b74a-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023189 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-29ca8dbe-01ea-4e37-938d-78148312690b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29ca8dbe-01ea-4e37-938d-78148312690b\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023220 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e42ca2a0-6c30-408e-a49a-9f52d241b74a-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023331 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e42ca2a0-6c30-408e-a49a-9f52d241b74a-config\") pod 
\"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023353 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01ac4961-e808-495c-a2d3-7b3cde3c725b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023378 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76bfef9-84e4-470b-a497-9855ac4eec8b-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023401 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01ac4961-e808-495c-a2d3-7b3cde3c725b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023423 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ac4961-e808-495c-a2d3-7b3cde3c725b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023451 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c76bfef9-84e4-470b-a497-9855ac4eec8b-config\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" 
Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023474 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ac4961-e808-495c-a2d3-7b3cde3c725b-config\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023493 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c76bfef9-84e4-470b-a497-9855ac4eec8b-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023507 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqrr4\" (UniqueName: \"kubernetes.io/projected/01ac4961-e808-495c-a2d3-7b3cde3c725b-kube-api-access-qqrr4\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023525 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3a3bcd01-5394-4fda-bb21-c5604fc84438\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a3bcd01-5394-4fda-bb21-c5604fc84438\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.023552 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm9kh\" (UniqueName: \"kubernetes.io/projected/c76bfef9-84e4-470b-a497-9855ac4eec8b-kube-api-access-fm9kh\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc 
kubenswrapper[5033]: I0226 17:10:40.023695 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.124878 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncqtm\" (UniqueName: \"kubernetes.io/projected/e42ca2a0-6c30-408e-a49a-9f52d241b74a-kube-api-access-ncqtm\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.124930 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e42ca2a0-6c30-408e-a49a-9f52d241b74a-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.124964 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c76bfef9-84e4-470b-a497-9855ac4eec8b-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.124997 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eed185d3-8452-4f44-a7d9-ab8678e36e37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed185d3-8452-4f44-a7d9-ab8678e36e37\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42ca2a0-6c30-408e-a49a-9f52d241b74a-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " 
pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125051 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-29ca8dbe-01ea-4e37-938d-78148312690b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29ca8dbe-01ea-4e37-938d-78148312690b\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125075 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e42ca2a0-6c30-408e-a49a-9f52d241b74a-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125107 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e42ca2a0-6c30-408e-a49a-9f52d241b74a-config\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125126 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01ac4961-e808-495c-a2d3-7b3cde3c725b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125146 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76bfef9-84e4-470b-a497-9855ac4eec8b-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125160 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01ac4961-e808-495c-a2d3-7b3cde3c725b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125179 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ac4961-e808-495c-a2d3-7b3cde3c725b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125201 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c76bfef9-84e4-470b-a497-9855ac4eec8b-config\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125217 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ac4961-e808-495c-a2d3-7b3cde3c725b-config\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125234 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c76bfef9-84e4-470b-a497-9855ac4eec8b-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125250 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqrr4\" (UniqueName: \"kubernetes.io/projected/01ac4961-e808-495c-a2d3-7b3cde3c725b-kube-api-access-qqrr4\") pod \"ovsdbserver-sb-0\" (UID: 
\"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125286 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3a3bcd01-5394-4fda-bb21-c5604fc84438\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a3bcd01-5394-4fda-bb21-c5604fc84438\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.125315 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm9kh\" (UniqueName: \"kubernetes.io/projected/c76bfef9-84e4-470b-a497-9855ac4eec8b-kube-api-access-fm9kh\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.126445 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01ac4961-e808-495c-a2d3-7b3cde3c725b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.126793 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e42ca2a0-6c30-408e-a49a-9f52d241b74a-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.127018 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e42ca2a0-6c30-408e-a49a-9f52d241b74a-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.127393 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01ac4961-e808-495c-a2d3-7b3cde3c725b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.127982 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e42ca2a0-6c30-408e-a49a-9f52d241b74a-config\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.127994 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c76bfef9-84e4-470b-a497-9855ac4eec8b-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.128994 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ac4961-e808-495c-a2d3-7b3cde3c725b-config\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.129038 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c76bfef9-84e4-470b-a497-9855ac4eec8b-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.130919 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.130949 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-29ca8dbe-01ea-4e37-938d-78148312690b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29ca8dbe-01ea-4e37-938d-78148312690b\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/db5e89ea7c243c0fbf9d8b8d7cf70e2c7d4c5e2f2d8388aa2299a1f7ac7e01dc/globalmount\"" pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.131117 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.131161 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3a3bcd01-5394-4fda-bb21-c5604fc84438\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a3bcd01-5394-4fda-bb21-c5604fc84438\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0c2036a92ff4888365b27f2ea81e7352a7442f313a9f1fa5caed0f09d49ed379/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.132156 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.132181 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eed185d3-8452-4f44-a7d9-ab8678e36e37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed185d3-8452-4f44-a7d9-ab8678e36e37\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5e95a8c7ed43e935b31dd2a9423391e7aca2305e856431752a6db36aed632960/globalmount\"" pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.133981 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76bfef9-84e4-470b-a497-9855ac4eec8b-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.138078 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c76bfef9-84e4-470b-a497-9855ac4eec8b-config\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.138380 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ac4961-e808-495c-a2d3-7b3cde3c725b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.142042 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncqtm\" (UniqueName: \"kubernetes.io/projected/e42ca2a0-6c30-408e-a49a-9f52d241b74a-kube-api-access-ncqtm\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " 
pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.144411 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42ca2a0-6c30-408e-a49a-9f52d241b74a-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.146073 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm9kh\" (UniqueName: \"kubernetes.io/projected/c76bfef9-84e4-470b-a497-9855ac4eec8b-kube-api-access-fm9kh\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.154507 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqrr4\" (UniqueName: \"kubernetes.io/projected/01ac4961-e808-495c-a2d3-7b3cde3c725b-kube-api-access-qqrr4\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.176179 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-29ca8dbe-01ea-4e37-938d-78148312690b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29ca8dbe-01ea-4e37-938d-78148312690b\") pod \"ovsdbserver-sb-2\" (UID: \"e42ca2a0-6c30-408e-a49a-9f52d241b74a\") " pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.194641 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eed185d3-8452-4f44-a7d9-ab8678e36e37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eed185d3-8452-4f44-a7d9-ab8678e36e37\") pod \"ovsdbserver-sb-1\" (UID: \"c76bfef9-84e4-470b-a497-9855ac4eec8b\") " pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.197416 
5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3a3bcd01-5394-4fda-bb21-c5604fc84438\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3a3bcd01-5394-4fda-bb21-c5604fc84438\") pod \"ovsdbserver-sb-0\" (UID: \"01ac4961-e808-495c-a2d3-7b3cde3c725b\") " pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.219707 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.418148 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.495608 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.538848 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.652380 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.677964 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"095f5cb5-16b7-44e2-924c-77c2730118f0","Type":"ContainerStarted","Data":"63a760250d5dc0138c05de0ac5e3bf2e62c7a2a2bc426834aaa2e1f4f259e3e8"} Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.682010 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc","Type":"ContainerStarted","Data":"e5301d2f2c031292170d2115a06d7f71c966ba75e0db5f6abd336c00838993da"} Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.766854 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 26 17:10:40 crc kubenswrapper[5033]: W0226 17:10:40.773953 5033 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode42ca2a0_6c30_408e_a49a_9f52d241b74a.slice/crio-3cdabb48559f8a790a3f9282f124887780f02a3285da6a3eb3e7f47019bd9a4a WatchSource:0}: Error finding container 3cdabb48559f8a790a3f9282f124887780f02a3285da6a3eb3e7f47019bd9a4a: Status 404 returned error can't find the container with id 3cdabb48559f8a790a3f9282f124887780f02a3285da6a3eb3e7f47019bd9a4a Feb 26 17:10:40 crc kubenswrapper[5033]: I0226 17:10:40.969606 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 26 17:10:40 crc kubenswrapper[5033]: W0226 17:10:40.984020 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc76bfef9_84e4_470b_a497_9855ac4eec8b.slice/crio-a21308d3ac36a97f013b656d3d0df3d062674622bb8d215953d13303f2eb894b WatchSource:0}: Error finding container a21308d3ac36a97f013b656d3d0df3d062674622bb8d215953d13303f2eb894b: Status 404 returned error can't find the container with id a21308d3ac36a97f013b656d3d0df3d062674622bb8d215953d13303f2eb894b Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.094093 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 17:10:41 crc kubenswrapper[5033]: W0226 17:10:41.101008 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01ac4961_e808_495c_a2d3_7b3cde3c725b.slice/crio-dd5ce53f5504655d67c65828d63f9efbf872583f2434dfcbed8dc2c6c7ceb890 WatchSource:0}: Error finding container dd5ce53f5504655d67c65828d63f9efbf872583f2434dfcbed8dc2c6c7ceb890: Status 404 returned error can't find the container with id dd5ce53f5504655d67c65828d63f9efbf872583f2434dfcbed8dc2c6c7ceb890 Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.531831 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 
26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.691631 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"01ac4961-e808-495c-a2d3-7b3cde3c725b","Type":"ContainerStarted","Data":"93b0884e67d678206761750450e12d610d02686fc14080567cd6e37848f65bb9"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.691680 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"01ac4961-e808-495c-a2d3-7b3cde3c725b","Type":"ContainerStarted","Data":"b3a92868d6abb3ba324d56acbdb30db567fadf35510a4e1c5b1a28ed387e8475"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.691690 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"01ac4961-e808-495c-a2d3-7b3cde3c725b","Type":"ContainerStarted","Data":"dd5ce53f5504655d67c65828d63f9efbf872583f2434dfcbed8dc2c6c7ceb890"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.694188 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e42ca2a0-6c30-408e-a49a-9f52d241b74a","Type":"ContainerStarted","Data":"3f5377a9fe7be8fd0ca887b336d14283cde430875446087695f643e069c10499"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.694236 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e42ca2a0-6c30-408e-a49a-9f52d241b74a","Type":"ContainerStarted","Data":"8751f9f2f80d3c99dd2cfa7642e2eaf2200085162c04acdc19cf4eb5522c7260"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.694250 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e42ca2a0-6c30-408e-a49a-9f52d241b74a","Type":"ContainerStarted","Data":"3cdabb48559f8a790a3f9282f124887780f02a3285da6a3eb3e7f47019bd9a4a"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.697752 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc","Type":"ContainerStarted","Data":"0997b165c797c0bf2a24a2e22786fac36a37447146d26e07012fd0cc10f6e70f"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.697791 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"b99b5db1-b403-4ae0-bd38-d8ce5b410ddc","Type":"ContainerStarted","Data":"ef6b780b0d5664e5099c01cad64fcd9959c30dfe328902cf02ec6ef03c1843c2"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.702029 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"095f5cb5-16b7-44e2-924c-77c2730118f0","Type":"ContainerStarted","Data":"ab218a61620a135c407421d5e98f875ff016c5ac4c9d8198123457001d7043b2"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.702078 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"095f5cb5-16b7-44e2-924c-77c2730118f0","Type":"ContainerStarted","Data":"dd06763136e484895011d21f160b9641a0420babad24e5e229ed002eaa9e59e8"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.703476 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"ec70e74e-82af-4e84-88c4-9436180cb8d2","Type":"ContainerStarted","Data":"fc7b68c8f11c83e79214ead11e5f8ffe9fcb3da3007b2f640a1ea1437e95fe43"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.703504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"ec70e74e-82af-4e84-88c4-9436180cb8d2","Type":"ContainerStarted","Data":"f9e9e066fe856c8e9a8c9146030ed10ec9193336ddff1b49b0bea27205d2bada"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.705509 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c76bfef9-84e4-470b-a497-9855ac4eec8b","Type":"ContainerStarted","Data":"aedb1cb4099e525b503bd6f29ced56e91b1ec5ae852fd3891d2c6729a511c8b2"} Feb 26 17:10:41 crc 
kubenswrapper[5033]: I0226 17:10:41.705559 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c76bfef9-84e4-470b-a497-9855ac4eec8b","Type":"ContainerStarted","Data":"d26911a02ff843aa55a22b674fab8be21de453e5480a6ed6ffe9e1bffeaf2a4b"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.705571 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c76bfef9-84e4-470b-a497-9855ac4eec8b","Type":"ContainerStarted","Data":"a21308d3ac36a97f013b656d3d0df3d062674622bb8d215953d13303f2eb894b"} Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.719026 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.719007187 podStartE2EDuration="3.719007187s" podCreationTimestamp="2026-02-26 17:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:10:41.70806784 +0000 UTC m=+5368.920666490" watchObservedRunningTime="2026-02-26 17:10:41.719007187 +0000 UTC m=+5368.931605857" Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.750506 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.750483905 podStartE2EDuration="3.750483905s" podCreationTimestamp="2026-02-26 17:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:10:41.729347359 +0000 UTC m=+5368.941946009" watchObservedRunningTime="2026-02-26 17:10:41.750483905 +0000 UTC m=+5368.963082555" Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.750617 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.750612298 podStartE2EDuration="3.750612298s" podCreationTimestamp="2026-02-26 17:10:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:10:41.746806636 +0000 UTC m=+5368.959405286" watchObservedRunningTime="2026-02-26 17:10:41.750612298 +0000 UTC m=+5368.963210948" Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.775858 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.775838714 podStartE2EDuration="3.775838714s" podCreationTimestamp="2026-02-26 17:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:10:41.766461916 +0000 UTC m=+5368.979060566" watchObservedRunningTime="2026-02-26 17:10:41.775838714 +0000 UTC m=+5368.988437364" Feb 26 17:10:41 crc kubenswrapper[5033]: I0226 17:10:41.787389 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.787367346 podStartE2EDuration="3.787367346s" podCreationTimestamp="2026-02-26 17:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:10:41.784435183 +0000 UTC m=+5368.997033833" watchObservedRunningTime="2026-02-26 17:10:41.787367346 +0000 UTC m=+5368.999965996" Feb 26 17:10:42 crc kubenswrapper[5033]: I0226 17:10:42.716523 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"ec70e74e-82af-4e84-88c4-9436180cb8d2","Type":"ContainerStarted","Data":"2eb3a818f5e0da69973f289bbe12dc682cedc653ebfaaef2cdd9e07921b42279"} Feb 26 17:10:42 crc kubenswrapper[5033]: I0226 17:10:42.742816 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.742799879 podStartE2EDuration="4.742799879s" podCreationTimestamp="2026-02-26 17:10:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:10:42.734166688 +0000 UTC m=+5369.946765388" watchObservedRunningTime="2026-02-26 17:10:42.742799879 +0000 UTC m=+5369.955398529" Feb 26 17:10:43 crc kubenswrapper[5033]: I0226 17:10:43.001431 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:43 crc kubenswrapper[5033]: I0226 17:10:43.017505 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:43 crc kubenswrapper[5033]: I0226 17:10:43.025202 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:43 crc kubenswrapper[5033]: I0226 17:10:43.066339 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:43 crc kubenswrapper[5033]: I0226 17:10:43.100843 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:43 crc kubenswrapper[5033]: I0226 17:10:43.221485 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:43 crc kubenswrapper[5033]: I0226 17:10:43.258705 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:43 crc kubenswrapper[5033]: I0226 17:10:43.418948 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:43 crc kubenswrapper[5033]: I0226 17:10:43.496919 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:43 crc kubenswrapper[5033]: I0226 17:10:43.726710 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:43 crc 
kubenswrapper[5033]: I0226 17:10:43.726764 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:43 crc kubenswrapper[5033]: I0226 17:10:43.726785 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.016805 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.066891 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.085356 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.269387 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.331742 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64848558ff-smllm"] Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.333471 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.336653 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.341994 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-smllm"] Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.418713 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.421998 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nqj5\" (UniqueName: \"kubernetes.io/projected/7cd956ac-e004-4300-a0a9-1e97287aed32-kube-api-access-9nqj5\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.422063 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-ovsdbserver-nb\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.422214 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-dns-svc\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.422315 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-config\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.497162 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.524119 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nqj5\" (UniqueName: \"kubernetes.io/projected/7cd956ac-e004-4300-a0a9-1e97287aed32-kube-api-access-9nqj5\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.524187 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-ovsdbserver-nb\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.524236 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-dns-svc\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.524278 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-config\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.525345 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-ovsdbserver-nb\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.525417 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-config\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.525483 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-dns-svc\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.525963 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-smllm"] Feb 26 17:10:45 crc kubenswrapper[5033]: E0226 17:10:45.526473 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-9nqj5], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-64848558ff-smllm" podUID="7cd956ac-e004-4300-a0a9-1e97287aed32" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.553622 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b8b7f4989-ncbwh"] Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.555155 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.561902 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.562506 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nqj5\" (UniqueName: \"kubernetes.io/projected/7cd956ac-e004-4300-a0a9-1e97287aed32-kube-api-access-9nqj5\") pod \"dnsmasq-dns-64848558ff-smllm\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.571761 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b8b7f4989-ncbwh"] Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.626070 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.626138 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.626507 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-config\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" 
Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.626553 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-dns-svc\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.626617 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmzrq\" (UniqueName: \"kubernetes.io/projected/0f1e21d2-14f1-48f4-8214-d982806765f6-kube-api-access-bmzrq\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.728298 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmzrq\" (UniqueName: \"kubernetes.io/projected/0f1e21d2-14f1-48f4-8214-d982806765f6-kube-api-access-bmzrq\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.728378 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.728400 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" 
Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.728480 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-config\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.728512 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-dns-svc\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.729596 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.729765 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-dns-svc\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.729963 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.732875 5033 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-config\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.747822 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.751119 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmzrq\" (UniqueName: \"kubernetes.io/projected/0f1e21d2-14f1-48f4-8214-d982806765f6-kube-api-access-bmzrq\") pod \"dnsmasq-dns-5b8b7f4989-ncbwh\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") " pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.782593 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.829222 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-ovsdbserver-nb\") pod \"7cd956ac-e004-4300-a0a9-1e97287aed32\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.829402 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nqj5\" (UniqueName: \"kubernetes.io/projected/7cd956ac-e004-4300-a0a9-1e97287aed32-kube-api-access-9nqj5\") pod \"7cd956ac-e004-4300-a0a9-1e97287aed32\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.829459 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-config\") pod 
\"7cd956ac-e004-4300-a0a9-1e97287aed32\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.829488 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-dns-svc\") pod \"7cd956ac-e004-4300-a0a9-1e97287aed32\" (UID: \"7cd956ac-e004-4300-a0a9-1e97287aed32\") " Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.830305 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7cd956ac-e004-4300-a0a9-1e97287aed32" (UID: "7cd956ac-e004-4300-a0a9-1e97287aed32"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.830338 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7cd956ac-e004-4300-a0a9-1e97287aed32" (UID: "7cd956ac-e004-4300-a0a9-1e97287aed32"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.831277 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.831305 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.833464 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-config" (OuterVolumeSpecName: "config") pod "7cd956ac-e004-4300-a0a9-1e97287aed32" (UID: "7cd956ac-e004-4300-a0a9-1e97287aed32"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.835277 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cd956ac-e004-4300-a0a9-1e97287aed32-kube-api-access-9nqj5" (OuterVolumeSpecName: "kube-api-access-9nqj5") pod "7cd956ac-e004-4300-a0a9-1e97287aed32" (UID: "7cd956ac-e004-4300-a0a9-1e97287aed32"). InnerVolumeSpecName "kube-api-access-9nqj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.905360 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.932427 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nqj5\" (UniqueName: \"kubernetes.io/projected/7cd956ac-e004-4300-a0a9-1e97287aed32-kube-api-access-9nqj5\") on node \"crc\" DevicePath \"\"" Feb 26 17:10:45 crc kubenswrapper[5033]: I0226 17:10:45.932470 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd956ac-e004-4300-a0a9-1e97287aed32-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.062945 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.359962 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b8b7f4989-ncbwh"] Feb 26 17:10:46 crc kubenswrapper[5033]: W0226 17:10:46.362096 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f1e21d2_14f1_48f4_8214_d982806765f6.slice/crio-0186fde1c3caf06229b75bbc51e663403c0ecad93072463dec5c10edd6e3914d WatchSource:0}: Error finding container 0186fde1c3caf06229b75bbc51e663403c0ecad93072463dec5c10edd6e3914d: Status 404 returned error can't find the container with id 0186fde1c3caf06229b75bbc51e663403c0ecad93072463dec5c10edd6e3914d Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.465675 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.506439 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.540959 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 26 
17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.592630 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.756950 5033 generic.go:334] "Generic (PLEG): container finished" podID="0f1e21d2-14f1-48f4-8214-d982806765f6" containerID="95d38db2d4d2cd187362be2c171a06f194190007408682bd8ad61c19c75202b7" exitCode=0 Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.757084 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" event={"ID":"0f1e21d2-14f1-48f4-8214-d982806765f6","Type":"ContainerDied","Data":"95d38db2d4d2cd187362be2c171a06f194190007408682bd8ad61c19c75202b7"} Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.757111 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" event={"ID":"0f1e21d2-14f1-48f4-8214-d982806765f6","Type":"ContainerStarted","Data":"0186fde1c3caf06229b75bbc51e663403c0ecad93072463dec5c10edd6e3914d"} Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.757135 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64848558ff-smllm" Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.811034 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.958154 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-smllm"] Feb 26 17:10:46 crc kubenswrapper[5033]: I0226 17:10:46.967939 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64848558ff-smllm"] Feb 26 17:10:47 crc kubenswrapper[5033]: I0226 17:10:47.638080 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cd956ac-e004-4300-a0a9-1e97287aed32" path="/var/lib/kubelet/pods/7cd956ac-e004-4300-a0a9-1e97287aed32/volumes" Feb 26 17:10:47 crc kubenswrapper[5033]: I0226 17:10:47.767641 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" event={"ID":"0f1e21d2-14f1-48f4-8214-d982806765f6","Type":"ContainerStarted","Data":"5f97e74232cb55a2c43d9da41b47f8c95029bd2d73e2e132057ca31d32a39d79"} Feb 26 17:10:47 crc kubenswrapper[5033]: I0226 17:10:47.811672 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" podStartSLOduration=2.811648854 podStartE2EDuration="2.811648854s" podCreationTimestamp="2026-02-26 17:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:10:47.79634938 +0000 UTC m=+5375.008948100" watchObservedRunningTime="2026-02-26 17:10:47.811648854 +0000 UTC m=+5375.024247514" Feb 26 17:10:48 crc kubenswrapper[5033]: I0226 17:10:48.774118 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.504119 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-copy-data"] Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.506734 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.522149 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.533488 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.621705 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/682c1a69-619b-4f26-bfdf-ad71f13cecfc-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.622452 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6781f54e-f208-4b55-828c-2ea6c2101239\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6781f54e-f208-4b55-828c-2ea6c2101239\") pod \"ovn-copy-data\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.622576 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkjdz\" (UniqueName: \"kubernetes.io/projected/682c1a69-619b-4f26-bfdf-ad71f13cecfc-kube-api-access-fkjdz\") pod \"ovn-copy-data\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.724831 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6781f54e-f208-4b55-828c-2ea6c2101239\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6781f54e-f208-4b55-828c-2ea6c2101239\") pod 
\"ovn-copy-data\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.724896 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkjdz\" (UniqueName: \"kubernetes.io/projected/682c1a69-619b-4f26-bfdf-ad71f13cecfc-kube-api-access-fkjdz\") pod \"ovn-copy-data\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.724938 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/682c1a69-619b-4f26-bfdf-ad71f13cecfc-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.727313 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rchhz"] Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.734540 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.734699 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6781f54e-f208-4b55-828c-2ea6c2101239\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6781f54e-f208-4b55-828c-2ea6c2101239\") pod \"ovn-copy-data\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c2657671e9c5b394f76e2abb1b7c92036e4879e497ebbbed19de49280364ddb2/globalmount\"" pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.735550 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.735762 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rchhz"] Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.745629 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/682c1a69-619b-4f26-bfdf-ad71f13cecfc-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.756877 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkjdz\" (UniqueName: \"kubernetes.io/projected/682c1a69-619b-4f26-bfdf-ad71f13cecfc-kube-api-access-fkjdz\") pod \"ovn-copy-data\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.784734 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6781f54e-f208-4b55-828c-2ea6c2101239\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6781f54e-f208-4b55-828c-2ea6c2101239\") pod \"ovn-copy-data\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.827025 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk5kb\" (UniqueName: \"kubernetes.io/projected/c2017e8f-1cff-4438-a426-6ee0701a457c-kube-api-access-pk5kb\") pod \"community-operators-rchhz\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.828623 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-catalog-content\") pod \"community-operators-rchhz\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.829012 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-utilities\") pod \"community-operators-rchhz\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.840606 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.931143 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-catalog-content\") pod \"community-operators-rchhz\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.931474 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-utilities\") pod \"community-operators-rchhz\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.931659 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk5kb\" (UniqueName: \"kubernetes.io/projected/c2017e8f-1cff-4438-a426-6ee0701a457c-kube-api-access-pk5kb\") pod \"community-operators-rchhz\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:10:49 
crc kubenswrapper[5033]: I0226 17:10:49.931737 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-catalog-content\") pod \"community-operators-rchhz\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.932133 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-utilities\") pod \"community-operators-rchhz\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:10:49 crc kubenswrapper[5033]: I0226 17:10:49.950377 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk5kb\" (UniqueName: \"kubernetes.io/projected/c2017e8f-1cff-4438-a426-6ee0701a457c-kube-api-access-pk5kb\") pod \"community-operators-rchhz\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:10:50 crc kubenswrapper[5033]: I0226 17:10:50.133759 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:10:50 crc kubenswrapper[5033]: I0226 17:10:50.403476 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 26 17:10:50 crc kubenswrapper[5033]: I0226 17:10:50.635345 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rchhz"] Feb 26 17:10:50 crc kubenswrapper[5033]: W0226 17:10:50.644063 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2017e8f_1cff_4438_a426_6ee0701a457c.slice/crio-fa744cc8c5531aab5cfa1baefa83d2a2b747f49c2d921dda06212e7ed8db76e6 WatchSource:0}: Error finding container fa744cc8c5531aab5cfa1baefa83d2a2b747f49c2d921dda06212e7ed8db76e6: Status 404 returned error can't find the container with id fa744cc8c5531aab5cfa1baefa83d2a2b747f49c2d921dda06212e7ed8db76e6 Feb 26 17:10:50 crc kubenswrapper[5033]: I0226 17:10:50.803562 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rchhz" event={"ID":"c2017e8f-1cff-4438-a426-6ee0701a457c","Type":"ContainerStarted","Data":"fa744cc8c5531aab5cfa1baefa83d2a2b747f49c2d921dda06212e7ed8db76e6"} Feb 26 17:10:50 crc kubenswrapper[5033]: I0226 17:10:50.810569 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"682c1a69-619b-4f26-bfdf-ad71f13cecfc","Type":"ContainerStarted","Data":"833d7444db08b6840f10a7b02e1ed045b641e83d121de9a99e3e283ef03c933c"} Feb 26 17:10:50 crc kubenswrapper[5033]: I0226 17:10:50.810620 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"682c1a69-619b-4f26-bfdf-ad71f13cecfc","Type":"ContainerStarted","Data":"4909492c03c7dfa660b98d4596f10df2918238057bd48f185b5daa27b1423c5c"} Feb 26 17:10:50 crc kubenswrapper[5033]: I0226 17:10:50.831377 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-copy-data" podStartSLOduration=2.831347348 podStartE2EDuration="2.831347348s" podCreationTimestamp="2026-02-26 17:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:10:50.826813077 +0000 UTC m=+5378.039411767" watchObservedRunningTime="2026-02-26 17:10:50.831347348 +0000 UTC m=+5378.043946008" Feb 26 17:10:51 crc kubenswrapper[5033]: I0226 17:10:51.830981 5033 generic.go:334] "Generic (PLEG): container finished" podID="c2017e8f-1cff-4438-a426-6ee0701a457c" containerID="043090374da5f965b38c12f0800e596422637955c0fc34243da0c2cf01f846f1" exitCode=0 Feb 26 17:10:51 crc kubenswrapper[5033]: I0226 17:10:51.831309 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rchhz" event={"ID":"c2017e8f-1cff-4438-a426-6ee0701a457c","Type":"ContainerDied","Data":"043090374da5f965b38c12f0800e596422637955c0fc34243da0c2cf01f846f1"} Feb 26 17:10:51 crc kubenswrapper[5033]: I0226 17:10:51.834955 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 17:10:53 crc kubenswrapper[5033]: I0226 17:10:53.848744 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rchhz" event={"ID":"c2017e8f-1cff-4438-a426-6ee0701a457c","Type":"ContainerStarted","Data":"93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18"} Feb 26 17:10:54 crc kubenswrapper[5033]: I0226 17:10:54.858178 5033 generic.go:334] "Generic (PLEG): container finished" podID="c2017e8f-1cff-4438-a426-6ee0701a457c" containerID="93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18" exitCode=0 Feb 26 17:10:54 crc kubenswrapper[5033]: I0226 17:10:54.858273 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rchhz" 
event={"ID":"c2017e8f-1cff-4438-a426-6ee0701a457c","Type":"ContainerDied","Data":"93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18"} Feb 26 17:10:55 crc kubenswrapper[5033]: I0226 17:10:55.868969 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rchhz" event={"ID":"c2017e8f-1cff-4438-a426-6ee0701a457c","Type":"ContainerStarted","Data":"bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75"} Feb 26 17:10:55 crc kubenswrapper[5033]: I0226 17:10:55.895425 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rchhz" podStartSLOduration=3.340032682 podStartE2EDuration="6.895399576s" podCreationTimestamp="2026-02-26 17:10:49 +0000 UTC" firstStartedPulling="2026-02-26 17:10:51.834170477 +0000 UTC m=+5379.046769167" lastFinishedPulling="2026-02-26 17:10:55.389537391 +0000 UTC m=+5382.602136061" observedRunningTime="2026-02-26 17:10:55.892790092 +0000 UTC m=+5383.105388762" watchObservedRunningTime="2026-02-26 17:10:55.895399576 +0000 UTC m=+5383.107998256" Feb 26 17:10:55 crc kubenswrapper[5033]: I0226 17:10:55.907298 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" Feb 26 17:10:55 crc kubenswrapper[5033]: I0226 17:10:55.976443 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-bls56"] Feb 26 17:10:55 crc kubenswrapper[5033]: I0226 17:10:55.978142 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" podUID="77140609-456c-4eeb-84ef-59b661196b87" containerName="dnsmasq-dns" containerID="cri-o://2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09" gracePeriod=10 Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.434564 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 26 17:10:56 crc kubenswrapper[5033]: 
I0226 17:10:56.436239 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.439871 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-w2wlb" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.440076 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.440192 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.492309 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.549732 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.558778 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-scripts\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.558844 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2gk8\" (UniqueName: \"kubernetes.io/projected/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-kube-api-access-x2gk8\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.558894 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-config\") pod 
\"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.558922 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.558987 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.660326 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-config\") pod \"77140609-456c-4eeb-84ef-59b661196b87\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.660437 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhhw6\" (UniqueName: \"kubernetes.io/projected/77140609-456c-4eeb-84ef-59b661196b87-kube-api-access-zhhw6\") pod \"77140609-456c-4eeb-84ef-59b661196b87\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.660471 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-dns-svc\") pod \"77140609-456c-4eeb-84ef-59b661196b87\" (UID: \"77140609-456c-4eeb-84ef-59b661196b87\") " Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.660705 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.660771 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.660834 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-scripts\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.660858 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2gk8\" (UniqueName: \"kubernetes.io/projected/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-kube-api-access-x2gk8\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.660889 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-config\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.661721 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-config\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " 
pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.662621 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-scripts\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.662912 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.672228 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77140609-456c-4eeb-84ef-59b661196b87-kube-api-access-zhhw6" (OuterVolumeSpecName: "kube-api-access-zhhw6") pod "77140609-456c-4eeb-84ef-59b661196b87" (UID: "77140609-456c-4eeb-84ef-59b661196b87"). InnerVolumeSpecName "kube-api-access-zhhw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.677515 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.679665 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2gk8\" (UniqueName: \"kubernetes.io/projected/fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc-kube-api-access-x2gk8\") pod \"ovn-northd-0\" (UID: \"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc\") " pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.713578 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-config" (OuterVolumeSpecName: "config") pod "77140609-456c-4eeb-84ef-59b661196b87" (UID: "77140609-456c-4eeb-84ef-59b661196b87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.731110 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77140609-456c-4eeb-84ef-59b661196b87" (UID: "77140609-456c-4eeb-84ef-59b661196b87"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.762760 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.763506 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhhw6\" (UniqueName: \"kubernetes.io/projected/77140609-456c-4eeb-84ef-59b661196b87-kube-api-access-zhhw6\") on node \"crc\" DevicePath \"\"" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.763519 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77140609-456c-4eeb-84ef-59b661196b87-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.842376 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.883377 5033 generic.go:334] "Generic (PLEG): container finished" podID="77140609-456c-4eeb-84ef-59b661196b87" containerID="2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09" exitCode=0 Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.884318 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.884379 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" event={"ID":"77140609-456c-4eeb-84ef-59b661196b87","Type":"ContainerDied","Data":"2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09"} Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.884417 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-bls56" event={"ID":"77140609-456c-4eeb-84ef-59b661196b87","Type":"ContainerDied","Data":"7274f91ba01629937ce1f5065dd453920575c4d0b80a6c2149af62a45bba6999"} Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.884435 5033 scope.go:117] "RemoveContainer" containerID="2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.910518 5033 scope.go:117] "RemoveContainer" containerID="7a333d76fe60a294f332b68687bd7752972996681ee5f8842ab30d8931825488" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.933635 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-bls56"] Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.954029 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-bls56"] Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.963059 5033 scope.go:117] "RemoveContainer" containerID="2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09" Feb 26 17:10:56 crc kubenswrapper[5033]: E0226 17:10:56.971389 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09\": container with ID starting with 2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09 not found: ID does not exist" 
containerID="2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.971429 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09"} err="failed to get container status \"2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09\": rpc error: code = NotFound desc = could not find container \"2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09\": container with ID starting with 2d1b6d7d95be6a908516126e6c307df10c6d00744a1ebac7d22b0ed9cca08b09 not found: ID does not exist" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.971456 5033 scope.go:117] "RemoveContainer" containerID="7a333d76fe60a294f332b68687bd7752972996681ee5f8842ab30d8931825488" Feb 26 17:10:56 crc kubenswrapper[5033]: E0226 17:10:56.974100 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a333d76fe60a294f332b68687bd7752972996681ee5f8842ab30d8931825488\": container with ID starting with 7a333d76fe60a294f332b68687bd7752972996681ee5f8842ab30d8931825488 not found: ID does not exist" containerID="7a333d76fe60a294f332b68687bd7752972996681ee5f8842ab30d8931825488" Feb 26 17:10:56 crc kubenswrapper[5033]: I0226 17:10:56.974151 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a333d76fe60a294f332b68687bd7752972996681ee5f8842ab30d8931825488"} err="failed to get container status \"7a333d76fe60a294f332b68687bd7752972996681ee5f8842ab30d8931825488\": rpc error: code = NotFound desc = could not find container \"7a333d76fe60a294f332b68687bd7752972996681ee5f8842ab30d8931825488\": container with ID starting with 7a333d76fe60a294f332b68687bd7752972996681ee5f8842ab30d8931825488 not found: ID does not exist" Feb 26 17:10:57 crc kubenswrapper[5033]: I0226 17:10:57.325168 5033 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 17:10:57 crc kubenswrapper[5033]: I0226 17:10:57.637242 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77140609-456c-4eeb-84ef-59b661196b87" path="/var/lib/kubelet/pods/77140609-456c-4eeb-84ef-59b661196b87/volumes" Feb 26 17:10:57 crc kubenswrapper[5033]: I0226 17:10:57.896105 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc","Type":"ContainerStarted","Data":"70ec7ddede6ac16d3af884bc52637ff4fc0e1b487651376d1b46f6b5e362d3f1"} Feb 26 17:10:57 crc kubenswrapper[5033]: I0226 17:10:57.896148 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc","Type":"ContainerStarted","Data":"870b77b9ef82fd682424f7909486c45dccfb016a9b667929e47e230390e96f74"} Feb 26 17:10:57 crc kubenswrapper[5033]: I0226 17:10:57.896161 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc","Type":"ContainerStarted","Data":"a7d2c846ca4675c0a07fb35bd9c4667393ffbef3d3d597c6c4a6d1c2345114f8"} Feb 26 17:10:57 crc kubenswrapper[5033]: I0226 17:10:57.896269 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 26 17:10:57 crc kubenswrapper[5033]: I0226 17:10:57.917156 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.917134888 podStartE2EDuration="1.917134888s" podCreationTimestamp="2026-02-26 17:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:10:57.913808096 +0000 UTC m=+5385.126406786" watchObservedRunningTime="2026-02-26 17:10:57.917134888 +0000 UTC m=+5385.129733558" Feb 26 17:11:00 crc kubenswrapper[5033]: I0226 17:11:00.135527 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:11:00 crc kubenswrapper[5033]: I0226 17:11:00.136091 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:11:00 crc kubenswrapper[5033]: I0226 17:11:00.191552 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:11:00 crc kubenswrapper[5033]: I0226 17:11:00.965192 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.041613 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rchhz"] Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.366220 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-pjksv"] Feb 26 17:11:01 crc kubenswrapper[5033]: E0226 17:11:01.366626 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77140609-456c-4eeb-84ef-59b661196b87" containerName="dnsmasq-dns" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.366643 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="77140609-456c-4eeb-84ef-59b661196b87" containerName="dnsmasq-dns" Feb 26 17:11:01 crc kubenswrapper[5033]: E0226 17:11:01.366654 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77140609-456c-4eeb-84ef-59b661196b87" containerName="init" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.366662 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="77140609-456c-4eeb-84ef-59b661196b87" containerName="init" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.366927 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="77140609-456c-4eeb-84ef-59b661196b87" containerName="dnsmasq-dns" Feb 26 
17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.367570 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pjksv" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.373747 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pjksv"] Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.446727 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p65pg\" (UniqueName: \"kubernetes.io/projected/c5b951c3-accd-47c9-aa64-efe6365d3022-kube-api-access-p65pg\") pod \"keystone-db-create-pjksv\" (UID: \"c5b951c3-accd-47c9-aa64-efe6365d3022\") " pod="openstack/keystone-db-create-pjksv" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.446778 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5b951c3-accd-47c9-aa64-efe6365d3022-operator-scripts\") pod \"keystone-db-create-pjksv\" (UID: \"c5b951c3-accd-47c9-aa64-efe6365d3022\") " pod="openstack/keystone-db-create-pjksv" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.475787 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-af50-account-create-update-gjwkt"] Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.477035 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-af50-account-create-update-gjwkt" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.480728 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.487913 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-af50-account-create-update-gjwkt"] Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.548077 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p65pg\" (UniqueName: \"kubernetes.io/projected/c5b951c3-accd-47c9-aa64-efe6365d3022-kube-api-access-p65pg\") pod \"keystone-db-create-pjksv\" (UID: \"c5b951c3-accd-47c9-aa64-efe6365d3022\") " pod="openstack/keystone-db-create-pjksv" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.548126 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5b951c3-accd-47c9-aa64-efe6365d3022-operator-scripts\") pod \"keystone-db-create-pjksv\" (UID: \"c5b951c3-accd-47c9-aa64-efe6365d3022\") " pod="openstack/keystone-db-create-pjksv" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.548171 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75798cbd-85b7-4424-83fc-3f542dcc8820-operator-scripts\") pod \"keystone-af50-account-create-update-gjwkt\" (UID: \"75798cbd-85b7-4424-83fc-3f542dcc8820\") " pod="openstack/keystone-af50-account-create-update-gjwkt" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.548281 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6zrp\" (UniqueName: \"kubernetes.io/projected/75798cbd-85b7-4424-83fc-3f542dcc8820-kube-api-access-q6zrp\") pod \"keystone-af50-account-create-update-gjwkt\" (UID: 
\"75798cbd-85b7-4424-83fc-3f542dcc8820\") " pod="openstack/keystone-af50-account-create-update-gjwkt" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.549444 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5b951c3-accd-47c9-aa64-efe6365d3022-operator-scripts\") pod \"keystone-db-create-pjksv\" (UID: \"c5b951c3-accd-47c9-aa64-efe6365d3022\") " pod="openstack/keystone-db-create-pjksv" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.589460 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p65pg\" (UniqueName: \"kubernetes.io/projected/c5b951c3-accd-47c9-aa64-efe6365d3022-kube-api-access-p65pg\") pod \"keystone-db-create-pjksv\" (UID: \"c5b951c3-accd-47c9-aa64-efe6365d3022\") " pod="openstack/keystone-db-create-pjksv" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.650007 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6zrp\" (UniqueName: \"kubernetes.io/projected/75798cbd-85b7-4424-83fc-3f542dcc8820-kube-api-access-q6zrp\") pod \"keystone-af50-account-create-update-gjwkt\" (UID: \"75798cbd-85b7-4424-83fc-3f542dcc8820\") " pod="openstack/keystone-af50-account-create-update-gjwkt" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.650088 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75798cbd-85b7-4424-83fc-3f542dcc8820-operator-scripts\") pod \"keystone-af50-account-create-update-gjwkt\" (UID: \"75798cbd-85b7-4424-83fc-3f542dcc8820\") " pod="openstack/keystone-af50-account-create-update-gjwkt" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.650772 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75798cbd-85b7-4424-83fc-3f542dcc8820-operator-scripts\") pod 
\"keystone-af50-account-create-update-gjwkt\" (UID: \"75798cbd-85b7-4424-83fc-3f542dcc8820\") " pod="openstack/keystone-af50-account-create-update-gjwkt" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.666148 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6zrp\" (UniqueName: \"kubernetes.io/projected/75798cbd-85b7-4424-83fc-3f542dcc8820-kube-api-access-q6zrp\") pod \"keystone-af50-account-create-update-gjwkt\" (UID: \"75798cbd-85b7-4424-83fc-3f542dcc8820\") " pod="openstack/keystone-af50-account-create-update-gjwkt" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.696529 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pjksv" Feb 26 17:11:01 crc kubenswrapper[5033]: I0226 17:11:01.847207 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-af50-account-create-update-gjwkt" Feb 26 17:11:02 crc kubenswrapper[5033]: I0226 17:11:02.137107 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pjksv"] Feb 26 17:11:02 crc kubenswrapper[5033]: W0226 17:11:02.139374 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5b951c3_accd_47c9_aa64_efe6365d3022.slice/crio-3fe7ef66c9329aaa2bdc9be95d66ee38869c602017cedca9d7a67694149d3852 WatchSource:0}: Error finding container 3fe7ef66c9329aaa2bdc9be95d66ee38869c602017cedca9d7a67694149d3852: Status 404 returned error can't find the container with id 3fe7ef66c9329aaa2bdc9be95d66ee38869c602017cedca9d7a67694149d3852 Feb 26 17:11:02 crc kubenswrapper[5033]: I0226 17:11:02.258584 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-af50-account-create-update-gjwkt"] Feb 26 17:11:02 crc kubenswrapper[5033]: W0226 17:11:02.262892 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75798cbd_85b7_4424_83fc_3f542dcc8820.slice/crio-6d9034190462ba59293ace390e4456dfab077998ed217e438e1e934ff905c3df WatchSource:0}: Error finding container 6d9034190462ba59293ace390e4456dfab077998ed217e438e1e934ff905c3df: Status 404 returned error can't find the container with id 6d9034190462ba59293ace390e4456dfab077998ed217e438e1e934ff905c3df Feb 26 17:11:02 crc kubenswrapper[5033]: I0226 17:11:02.946089 5033 generic.go:334] "Generic (PLEG): container finished" podID="c5b951c3-accd-47c9-aa64-efe6365d3022" containerID="48acbde38630df76c413870a1a1f323e727866ffa6748253f75a81fe1a11634d" exitCode=0 Feb 26 17:11:02 crc kubenswrapper[5033]: I0226 17:11:02.946210 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pjksv" event={"ID":"c5b951c3-accd-47c9-aa64-efe6365d3022","Type":"ContainerDied","Data":"48acbde38630df76c413870a1a1f323e727866ffa6748253f75a81fe1a11634d"} Feb 26 17:11:02 crc kubenswrapper[5033]: I0226 17:11:02.946515 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pjksv" event={"ID":"c5b951c3-accd-47c9-aa64-efe6365d3022","Type":"ContainerStarted","Data":"3fe7ef66c9329aaa2bdc9be95d66ee38869c602017cedca9d7a67694149d3852"} Feb 26 17:11:02 crc kubenswrapper[5033]: I0226 17:11:02.949394 5033 generic.go:334] "Generic (PLEG): container finished" podID="75798cbd-85b7-4424-83fc-3f542dcc8820" containerID="73e44ea2f1b4db87278f792b5a38f0a66ba4bf65d2e04987773723f0d2854c39" exitCode=0 Feb 26 17:11:02 crc kubenswrapper[5033]: I0226 17:11:02.949443 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-af50-account-create-update-gjwkt" event={"ID":"75798cbd-85b7-4424-83fc-3f542dcc8820","Type":"ContainerDied","Data":"73e44ea2f1b4db87278f792b5a38f0a66ba4bf65d2e04987773723f0d2854c39"} Feb 26 17:11:02 crc kubenswrapper[5033]: I0226 17:11:02.949489 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-af50-account-create-update-gjwkt" event={"ID":"75798cbd-85b7-4424-83fc-3f542dcc8820","Type":"ContainerStarted","Data":"6d9034190462ba59293ace390e4456dfab077998ed217e438e1e934ff905c3df"} Feb 26 17:11:02 crc kubenswrapper[5033]: I0226 17:11:02.949694 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rchhz" podUID="c2017e8f-1cff-4438-a426-6ee0701a457c" containerName="registry-server" containerID="cri-o://bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75" gracePeriod=2 Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.429654 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.477561 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-catalog-content\") pod \"c2017e8f-1cff-4438-a426-6ee0701a457c\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.477683 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk5kb\" (UniqueName: \"kubernetes.io/projected/c2017e8f-1cff-4438-a426-6ee0701a457c-kube-api-access-pk5kb\") pod \"c2017e8f-1cff-4438-a426-6ee0701a457c\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.477733 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-utilities\") pod \"c2017e8f-1cff-4438-a426-6ee0701a457c\" (UID: \"c2017e8f-1cff-4438-a426-6ee0701a457c\") " Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.479359 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-utilities" (OuterVolumeSpecName: "utilities") pod "c2017e8f-1cff-4438-a426-6ee0701a457c" (UID: "c2017e8f-1cff-4438-a426-6ee0701a457c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.488538 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2017e8f-1cff-4438-a426-6ee0701a457c-kube-api-access-pk5kb" (OuterVolumeSpecName: "kube-api-access-pk5kb") pod "c2017e8f-1cff-4438-a426-6ee0701a457c" (UID: "c2017e8f-1cff-4438-a426-6ee0701a457c"). InnerVolumeSpecName "kube-api-access-pk5kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.548064 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2017e8f-1cff-4438-a426-6ee0701a457c" (UID: "c2017e8f-1cff-4438-a426-6ee0701a457c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.580051 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.580125 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk5kb\" (UniqueName: \"kubernetes.io/projected/c2017e8f-1cff-4438-a426-6ee0701a457c-kube-api-access-pk5kb\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.580146 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2017e8f-1cff-4438-a426-6ee0701a457c-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.977218 5033 generic.go:334] "Generic (PLEG): container finished" podID="c2017e8f-1cff-4438-a426-6ee0701a457c" containerID="bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75" exitCode=0 Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.977299 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rchhz" event={"ID":"c2017e8f-1cff-4438-a426-6ee0701a457c","Type":"ContainerDied","Data":"bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75"} Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.977392 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rchhz" event={"ID":"c2017e8f-1cff-4438-a426-6ee0701a457c","Type":"ContainerDied","Data":"fa744cc8c5531aab5cfa1baefa83d2a2b747f49c2d921dda06212e7ed8db76e6"} Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.977404 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rchhz" Feb 26 17:11:03 crc kubenswrapper[5033]: I0226 17:11:03.977434 5033 scope.go:117] "RemoveContainer" containerID="bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.016243 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rchhz"] Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.017596 5033 scope.go:117] "RemoveContainer" containerID="93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.024444 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rchhz"] Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.065472 5033 scope.go:117] "RemoveContainer" containerID="043090374da5f965b38c12f0800e596422637955c0fc34243da0c2cf01f846f1" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.084912 5033 scope.go:117] "RemoveContainer" containerID="bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75" Feb 26 17:11:04 crc kubenswrapper[5033]: E0226 17:11:04.092402 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75\": container with ID starting with bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75 not found: ID does not exist" containerID="bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.092467 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75"} err="failed to get container status \"bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75\": rpc error: code = NotFound desc = could not find 
container \"bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75\": container with ID starting with bebd5b68228aa38649032fd097aa1dd53b2ed2c18a50db8b2cb7eb8bfea98b75 not found: ID does not exist" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.092501 5033 scope.go:117] "RemoveContainer" containerID="93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18" Feb 26 17:11:04 crc kubenswrapper[5033]: E0226 17:11:04.096093 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18\": container with ID starting with 93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18 not found: ID does not exist" containerID="93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.096157 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18"} err="failed to get container status \"93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18\": rpc error: code = NotFound desc = could not find container \"93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18\": container with ID starting with 93d4d51ff8cdd1563fd815e38c9c38a8fd104647105b3758115b5d96f296db18 not found: ID does not exist" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.096219 5033 scope.go:117] "RemoveContainer" containerID="043090374da5f965b38c12f0800e596422637955c0fc34243da0c2cf01f846f1" Feb 26 17:11:04 crc kubenswrapper[5033]: E0226 17:11:04.107529 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"043090374da5f965b38c12f0800e596422637955c0fc34243da0c2cf01f846f1\": container with ID starting with 043090374da5f965b38c12f0800e596422637955c0fc34243da0c2cf01f846f1 not found: ID does 
not exist" containerID="043090374da5f965b38c12f0800e596422637955c0fc34243da0c2cf01f846f1" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.107578 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"043090374da5f965b38c12f0800e596422637955c0fc34243da0c2cf01f846f1"} err="failed to get container status \"043090374da5f965b38c12f0800e596422637955c0fc34243da0c2cf01f846f1\": rpc error: code = NotFound desc = could not find container \"043090374da5f965b38c12f0800e596422637955c0fc34243da0c2cf01f846f1\": container with ID starting with 043090374da5f965b38c12f0800e596422637955c0fc34243da0c2cf01f846f1 not found: ID does not exist" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.391071 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pjksv" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.397344 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-af50-account-create-update-gjwkt" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.495649 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5b951c3-accd-47c9-aa64-efe6365d3022-operator-scripts\") pod \"c5b951c3-accd-47c9-aa64-efe6365d3022\" (UID: \"c5b951c3-accd-47c9-aa64-efe6365d3022\") " Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.495847 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75798cbd-85b7-4424-83fc-3f542dcc8820-operator-scripts\") pod \"75798cbd-85b7-4424-83fc-3f542dcc8820\" (UID: \"75798cbd-85b7-4424-83fc-3f542dcc8820\") " Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.495902 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6zrp\" (UniqueName: 
\"kubernetes.io/projected/75798cbd-85b7-4424-83fc-3f542dcc8820-kube-api-access-q6zrp\") pod \"75798cbd-85b7-4424-83fc-3f542dcc8820\" (UID: \"75798cbd-85b7-4424-83fc-3f542dcc8820\") " Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.495947 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p65pg\" (UniqueName: \"kubernetes.io/projected/c5b951c3-accd-47c9-aa64-efe6365d3022-kube-api-access-p65pg\") pod \"c5b951c3-accd-47c9-aa64-efe6365d3022\" (UID: \"c5b951c3-accd-47c9-aa64-efe6365d3022\") " Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.496320 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75798cbd-85b7-4424-83fc-3f542dcc8820-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75798cbd-85b7-4424-83fc-3f542dcc8820" (UID: "75798cbd-85b7-4424-83fc-3f542dcc8820"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.496319 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5b951c3-accd-47c9-aa64-efe6365d3022-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5b951c3-accd-47c9-aa64-efe6365d3022" (UID: "c5b951c3-accd-47c9-aa64-efe6365d3022"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.496476 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5b951c3-accd-47c9-aa64-efe6365d3022-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.496503 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75798cbd-85b7-4424-83fc-3f542dcc8820-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.500131 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b951c3-accd-47c9-aa64-efe6365d3022-kube-api-access-p65pg" (OuterVolumeSpecName: "kube-api-access-p65pg") pod "c5b951c3-accd-47c9-aa64-efe6365d3022" (UID: "c5b951c3-accd-47c9-aa64-efe6365d3022"). InnerVolumeSpecName "kube-api-access-p65pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.503504 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75798cbd-85b7-4424-83fc-3f542dcc8820-kube-api-access-q6zrp" (OuterVolumeSpecName: "kube-api-access-q6zrp") pod "75798cbd-85b7-4424-83fc-3f542dcc8820" (UID: "75798cbd-85b7-4424-83fc-3f542dcc8820"). InnerVolumeSpecName "kube-api-access-q6zrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.597427 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6zrp\" (UniqueName: \"kubernetes.io/projected/75798cbd-85b7-4424-83fc-3f542dcc8820-kube-api-access-q6zrp\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.597545 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p65pg\" (UniqueName: \"kubernetes.io/projected/c5b951c3-accd-47c9-aa64-efe6365d3022-kube-api-access-p65pg\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.988197 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pjksv" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.989606 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pjksv" event={"ID":"c5b951c3-accd-47c9-aa64-efe6365d3022","Type":"ContainerDied","Data":"3fe7ef66c9329aaa2bdc9be95d66ee38869c602017cedca9d7a67694149d3852"} Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.989689 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fe7ef66c9329aaa2bdc9be95d66ee38869c602017cedca9d7a67694149d3852" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.990475 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-af50-account-create-update-gjwkt" Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.990486 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-af50-account-create-update-gjwkt" event={"ID":"75798cbd-85b7-4424-83fc-3f542dcc8820","Type":"ContainerDied","Data":"6d9034190462ba59293ace390e4456dfab077998ed217e438e1e934ff905c3df"} Feb 26 17:11:04 crc kubenswrapper[5033]: I0226 17:11:04.990654 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d9034190462ba59293ace390e4456dfab077998ed217e438e1e934ff905c3df" Feb 26 17:11:05 crc kubenswrapper[5033]: I0226 17:11:05.645696 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2017e8f-1cff-4438-a426-6ee0701a457c" path="/var/lib/kubelet/pods/c2017e8f-1cff-4438-a426-6ee0701a457c/volumes" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.008493 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4zlph"] Feb 26 17:11:07 crc kubenswrapper[5033]: E0226 17:11:07.008885 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2017e8f-1cff-4438-a426-6ee0701a457c" containerName="extract-utilities" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.008921 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2017e8f-1cff-4438-a426-6ee0701a457c" containerName="extract-utilities" Feb 26 17:11:07 crc kubenswrapper[5033]: E0226 17:11:07.008937 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2017e8f-1cff-4438-a426-6ee0701a457c" containerName="registry-server" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.008943 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2017e8f-1cff-4438-a426-6ee0701a457c" containerName="registry-server" Feb 26 17:11:07 crc kubenswrapper[5033]: E0226 17:11:07.008956 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c5b951c3-accd-47c9-aa64-efe6365d3022" containerName="mariadb-database-create" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.008963 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b951c3-accd-47c9-aa64-efe6365d3022" containerName="mariadb-database-create" Feb 26 17:11:07 crc kubenswrapper[5033]: E0226 17:11:07.008970 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75798cbd-85b7-4424-83fc-3f542dcc8820" containerName="mariadb-account-create-update" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.008975 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="75798cbd-85b7-4424-83fc-3f542dcc8820" containerName="mariadb-account-create-update" Feb 26 17:11:07 crc kubenswrapper[5033]: E0226 17:11:07.008995 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2017e8f-1cff-4438-a426-6ee0701a457c" containerName="extract-content" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.009001 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2017e8f-1cff-4438-a426-6ee0701a457c" containerName="extract-content" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.009142 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="75798cbd-85b7-4424-83fc-3f542dcc8820" containerName="mariadb-account-create-update" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.009153 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b951c3-accd-47c9-aa64-efe6365d3022" containerName="mariadb-database-create" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.009169 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2017e8f-1cff-4438-a426-6ee0701a457c" containerName="registry-server" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.010815 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.015164 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4kchl" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.015462 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.015711 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.016199 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.022878 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4zlph"] Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.043441 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-config-data\") pod \"keystone-db-sync-4zlph\" (UID: \"7e138959-75c1-4a17-b437-07e9cb38b285\") " pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.043488 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q24fh\" (UniqueName: \"kubernetes.io/projected/7e138959-75c1-4a17-b437-07e9cb38b285-kube-api-access-q24fh\") pod \"keystone-db-sync-4zlph\" (UID: \"7e138959-75c1-4a17-b437-07e9cb38b285\") " pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.043514 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-combined-ca-bundle\") pod \"keystone-db-sync-4zlph\" (UID: 
\"7e138959-75c1-4a17-b437-07e9cb38b285\") " pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.145855 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-config-data\") pod \"keystone-db-sync-4zlph\" (UID: \"7e138959-75c1-4a17-b437-07e9cb38b285\") " pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.145928 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q24fh\" (UniqueName: \"kubernetes.io/projected/7e138959-75c1-4a17-b437-07e9cb38b285-kube-api-access-q24fh\") pod \"keystone-db-sync-4zlph\" (UID: \"7e138959-75c1-4a17-b437-07e9cb38b285\") " pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.145959 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-combined-ca-bundle\") pod \"keystone-db-sync-4zlph\" (UID: \"7e138959-75c1-4a17-b437-07e9cb38b285\") " pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.152490 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-config-data\") pod \"keystone-db-sync-4zlph\" (UID: \"7e138959-75c1-4a17-b437-07e9cb38b285\") " pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.152750 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-combined-ca-bundle\") pod \"keystone-db-sync-4zlph\" (UID: \"7e138959-75c1-4a17-b437-07e9cb38b285\") " pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:07 crc kubenswrapper[5033]: 
I0226 17:11:07.169868 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q24fh\" (UniqueName: \"kubernetes.io/projected/7e138959-75c1-4a17-b437-07e9cb38b285-kube-api-access-q24fh\") pod \"keystone-db-sync-4zlph\" (UID: \"7e138959-75c1-4a17-b437-07e9cb38b285\") " pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.345888 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:07 crc kubenswrapper[5033]: I0226 17:11:07.777003 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4zlph"] Feb 26 17:11:08 crc kubenswrapper[5033]: I0226 17:11:08.021474 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4zlph" event={"ID":"7e138959-75c1-4a17-b437-07e9cb38b285","Type":"ContainerStarted","Data":"948f88cdce6c89de4e1fcfdd26aabe5d8fe6cc4b39a68d2d1be7951286421496"} Feb 26 17:11:08 crc kubenswrapper[5033]: I0226 17:11:08.021532 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4zlph" event={"ID":"7e138959-75c1-4a17-b437-07e9cb38b285","Type":"ContainerStarted","Data":"c885e9137de82c85ed5c9bac1f12141813c4848b970b43f9db7d7d2ef259c310"} Feb 26 17:11:08 crc kubenswrapper[5033]: I0226 17:11:08.044075 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4zlph" podStartSLOduration=2.044056865 podStartE2EDuration="2.044056865s" podCreationTimestamp="2026-02-26 17:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:11:08.041058632 +0000 UTC m=+5395.253657292" watchObservedRunningTime="2026-02-26 17:11:08.044056865 +0000 UTC m=+5395.256655535" Feb 26 17:11:10 crc kubenswrapper[5033]: I0226 17:11:10.045229 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="7e138959-75c1-4a17-b437-07e9cb38b285" containerID="948f88cdce6c89de4e1fcfdd26aabe5d8fe6cc4b39a68d2d1be7951286421496" exitCode=0 Feb 26 17:11:10 crc kubenswrapper[5033]: I0226 17:11:10.045349 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4zlph" event={"ID":"7e138959-75c1-4a17-b437-07e9cb38b285","Type":"ContainerDied","Data":"948f88cdce6c89de4e1fcfdd26aabe5d8fe6cc4b39a68d2d1be7951286421496"} Feb 26 17:11:11 crc kubenswrapper[5033]: I0226 17:11:11.365640 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:11 crc kubenswrapper[5033]: I0226 17:11:11.443681 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-combined-ca-bundle\") pod \"7e138959-75c1-4a17-b437-07e9cb38b285\" (UID: \"7e138959-75c1-4a17-b437-07e9cb38b285\") " Feb 26 17:11:11 crc kubenswrapper[5033]: I0226 17:11:11.443792 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-config-data\") pod \"7e138959-75c1-4a17-b437-07e9cb38b285\" (UID: \"7e138959-75c1-4a17-b437-07e9cb38b285\") " Feb 26 17:11:11 crc kubenswrapper[5033]: I0226 17:11:11.443836 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q24fh\" (UniqueName: \"kubernetes.io/projected/7e138959-75c1-4a17-b437-07e9cb38b285-kube-api-access-q24fh\") pod \"7e138959-75c1-4a17-b437-07e9cb38b285\" (UID: \"7e138959-75c1-4a17-b437-07e9cb38b285\") " Feb 26 17:11:11 crc kubenswrapper[5033]: I0226 17:11:11.454025 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e138959-75c1-4a17-b437-07e9cb38b285-kube-api-access-q24fh" (OuterVolumeSpecName: "kube-api-access-q24fh") pod 
"7e138959-75c1-4a17-b437-07e9cb38b285" (UID: "7e138959-75c1-4a17-b437-07e9cb38b285"). InnerVolumeSpecName "kube-api-access-q24fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:11:11 crc kubenswrapper[5033]: I0226 17:11:11.469242 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e138959-75c1-4a17-b437-07e9cb38b285" (UID: "7e138959-75c1-4a17-b437-07e9cb38b285"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:11:11 crc kubenswrapper[5033]: I0226 17:11:11.542780 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-config-data" (OuterVolumeSpecName: "config-data") pod "7e138959-75c1-4a17-b437-07e9cb38b285" (UID: "7e138959-75c1-4a17-b437-07e9cb38b285"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:11:11 crc kubenswrapper[5033]: I0226 17:11:11.545190 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:11 crc kubenswrapper[5033]: I0226 17:11:11.545476 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q24fh\" (UniqueName: \"kubernetes.io/projected/7e138959-75c1-4a17-b437-07e9cb38b285-kube-api-access-q24fh\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:11 crc kubenswrapper[5033]: I0226 17:11:11.545555 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e138959-75c1-4a17-b437-07e9cb38b285-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.069602 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4zlph" event={"ID":"7e138959-75c1-4a17-b437-07e9cb38b285","Type":"ContainerDied","Data":"c885e9137de82c85ed5c9bac1f12141813c4848b970b43f9db7d7d2ef259c310"} Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.069645 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c885e9137de82c85ed5c9bac1f12141813c4848b970b43f9db7d7d2ef259c310" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.069706 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4zlph" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.334681 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-647b68ccbf-5xftw"] Feb 26 17:11:12 crc kubenswrapper[5033]: E0226 17:11:12.335101 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e138959-75c1-4a17-b437-07e9cb38b285" containerName="keystone-db-sync" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.335131 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e138959-75c1-4a17-b437-07e9cb38b285" containerName="keystone-db-sync" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.335395 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e138959-75c1-4a17-b437-07e9cb38b285" containerName="keystone-db-sync" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.336383 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.366190 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2w42q"] Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.368135 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.371076 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.372544 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4kchl" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.372788 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.373557 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.375360 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.376447 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647b68ccbf-5xftw"] Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.397614 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2w42q"] Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.465404 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgwlp\" (UniqueName: \"kubernetes.io/projected/48c4a420-af29-4c0f-978b-67725d55d368-kube-api-access-cgwlp\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.465454 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-config\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " 
pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.465525 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-dns-svc\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.465546 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-fernet-keys\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.465568 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-nb\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.465645 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-combined-ca-bundle\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.465692 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-credential-keys\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") 
" pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.465708 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-config-data\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.465723 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-sb\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.465739 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjfz8\" (UniqueName: \"kubernetes.io/projected/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-kube-api-access-rjfz8\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.465782 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-scripts\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.567058 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-scripts\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " 
pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.567186 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgwlp\" (UniqueName: \"kubernetes.io/projected/48c4a420-af29-4c0f-978b-67725d55d368-kube-api-access-cgwlp\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.567217 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-config\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.567243 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-dns-svc\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.568062 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-config\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.568123 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-fernet-keys\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.568149 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-nb\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.568152 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-dns-svc\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.568213 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-combined-ca-bundle\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.568617 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-credential-keys\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.568643 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-config-data\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.568662 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-sb\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.568681 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjfz8\" (UniqueName: \"kubernetes.io/projected/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-kube-api-access-rjfz8\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.568751 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-nb\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.569684 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-sb\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.576184 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-credential-keys\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.576273 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-fernet-keys\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.579548 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-combined-ca-bundle\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.582394 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-config-data\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.585215 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-scripts\") pod \"keystone-bootstrap-2w42q\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.586363 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjfz8\" (UniqueName: \"kubernetes.io/projected/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-kube-api-access-rjfz8\") pod \"dnsmasq-dns-647b68ccbf-5xftw\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.588228 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgwlp\" (UniqueName: \"kubernetes.io/projected/48c4a420-af29-4c0f-978b-67725d55d368-kube-api-access-cgwlp\") pod \"keystone-bootstrap-2w42q\" (UID: 
\"48c4a420-af29-4c0f-978b-67725d55d368\") " pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.671151 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:12 crc kubenswrapper[5033]: I0226 17:11:12.686530 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:13 crc kubenswrapper[5033]: I0226 17:11:13.098868 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647b68ccbf-5xftw"] Feb 26 17:11:13 crc kubenswrapper[5033]: W0226 17:11:13.103140 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b3c1c68_f56c_473f_ae9c_030cf0c2d2a8.slice/crio-09822fe9428a5617d197cb1bd2b8090904a1a55b78acd11e6c4d2d2b8912562a WatchSource:0}: Error finding container 09822fe9428a5617d197cb1bd2b8090904a1a55b78acd11e6c4d2d2b8912562a: Status 404 returned error can't find the container with id 09822fe9428a5617d197cb1bd2b8090904a1a55b78acd11e6c4d2d2b8912562a Feb 26 17:11:13 crc kubenswrapper[5033]: I0226 17:11:13.184473 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2w42q"] Feb 26 17:11:13 crc kubenswrapper[5033]: W0226 17:11:13.188832 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c4a420_af29_4c0f_978b_67725d55d368.slice/crio-63d48a9114e82418165e54ae776583e32accfd20b8e7fd5b0ad0512e554a03e0 WatchSource:0}: Error finding container 63d48a9114e82418165e54ae776583e32accfd20b8e7fd5b0ad0512e554a03e0: Status 404 returned error can't find the container with id 63d48a9114e82418165e54ae776583e32accfd20b8e7fd5b0ad0512e554a03e0 Feb 26 17:11:14 crc kubenswrapper[5033]: I0226 17:11:14.092490 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-2w42q" event={"ID":"48c4a420-af29-4c0f-978b-67725d55d368","Type":"ContainerStarted","Data":"31b83a0f2fc391d0bcb0acf8cf9cba8009549a3c5146061368132d0b65153876"} Feb 26 17:11:14 crc kubenswrapper[5033]: I0226 17:11:14.093334 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2w42q" event={"ID":"48c4a420-af29-4c0f-978b-67725d55d368","Type":"ContainerStarted","Data":"63d48a9114e82418165e54ae776583e32accfd20b8e7fd5b0ad0512e554a03e0"} Feb 26 17:11:14 crc kubenswrapper[5033]: I0226 17:11:14.096882 5033 generic.go:334] "Generic (PLEG): container finished" podID="9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" containerID="a455958c138f14cc7be8ece1a6ab8f87964501bd5b2e1457108f2edebaf43968" exitCode=0 Feb 26 17:11:14 crc kubenswrapper[5033]: I0226 17:11:14.096935 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" event={"ID":"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8","Type":"ContainerDied","Data":"a455958c138f14cc7be8ece1a6ab8f87964501bd5b2e1457108f2edebaf43968"} Feb 26 17:11:14 crc kubenswrapper[5033]: I0226 17:11:14.096965 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" event={"ID":"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8","Type":"ContainerStarted","Data":"09822fe9428a5617d197cb1bd2b8090904a1a55b78acd11e6c4d2d2b8912562a"} Feb 26 17:11:14 crc kubenswrapper[5033]: I0226 17:11:14.145955 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2w42q" podStartSLOduration=2.145922628 podStartE2EDuration="2.145922628s" podCreationTimestamp="2026-02-26 17:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:11:14.131009294 +0000 UTC m=+5401.343607984" watchObservedRunningTime="2026-02-26 17:11:14.145922628 +0000 UTC m=+5401.358521318" Feb 26 17:11:15 crc kubenswrapper[5033]: I0226 
17:11:15.105866 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" event={"ID":"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8","Type":"ContainerStarted","Data":"bac8c43e92d3b8e89466bc84597f06b7de238ba0bd5a44559a76496c9e18bba5"} Feb 26 17:11:15 crc kubenswrapper[5033]: I0226 17:11:15.133061 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" podStartSLOduration=3.133024025 podStartE2EDuration="3.133024025s" podCreationTimestamp="2026-02-26 17:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:11:15.127879429 +0000 UTC m=+5402.340478099" watchObservedRunningTime="2026-02-26 17:11:15.133024025 +0000 UTC m=+5402.345622685" Feb 26 17:11:16 crc kubenswrapper[5033]: I0226 17:11:16.117340 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:11:16 crc kubenswrapper[5033]: I0226 17:11:16.921526 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 26 17:11:17 crc kubenswrapper[5033]: I0226 17:11:17.132833 5033 generic.go:334] "Generic (PLEG): container finished" podID="48c4a420-af29-4c0f-978b-67725d55d368" containerID="31b83a0f2fc391d0bcb0acf8cf9cba8009549a3c5146061368132d0b65153876" exitCode=0 Feb 26 17:11:17 crc kubenswrapper[5033]: I0226 17:11:17.132907 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2w42q" event={"ID":"48c4a420-af29-4c0f-978b-67725d55d368","Type":"ContainerDied","Data":"31b83a0f2fc391d0bcb0acf8cf9cba8009549a3c5146061368132d0b65153876"} Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.533780 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2w42q" Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.581759 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-scripts\") pod \"48c4a420-af29-4c0f-978b-67725d55d368\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.581853 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-credential-keys\") pod \"48c4a420-af29-4c0f-978b-67725d55d368\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.581902 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-config-data\") pod \"48c4a420-af29-4c0f-978b-67725d55d368\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.581938 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgwlp\" (UniqueName: \"kubernetes.io/projected/48c4a420-af29-4c0f-978b-67725d55d368-kube-api-access-cgwlp\") pod \"48c4a420-af29-4c0f-978b-67725d55d368\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.581985 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-combined-ca-bundle\") pod \"48c4a420-af29-4c0f-978b-67725d55d368\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.582007 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-fernet-keys\") pod \"48c4a420-af29-4c0f-978b-67725d55d368\" (UID: \"48c4a420-af29-4c0f-978b-67725d55d368\") " Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.587710 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-scripts" (OuterVolumeSpecName: "scripts") pod "48c4a420-af29-4c0f-978b-67725d55d368" (UID: "48c4a420-af29-4c0f-978b-67725d55d368"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.588231 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "48c4a420-af29-4c0f-978b-67725d55d368" (UID: "48c4a420-af29-4c0f-978b-67725d55d368"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.588411 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c4a420-af29-4c0f-978b-67725d55d368-kube-api-access-cgwlp" (OuterVolumeSpecName: "kube-api-access-cgwlp") pod "48c4a420-af29-4c0f-978b-67725d55d368" (UID: "48c4a420-af29-4c0f-978b-67725d55d368"). InnerVolumeSpecName "kube-api-access-cgwlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.592108 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "48c4a420-af29-4c0f-978b-67725d55d368" (UID: "48c4a420-af29-4c0f-978b-67725d55d368"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.604061 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48c4a420-af29-4c0f-978b-67725d55d368" (UID: "48c4a420-af29-4c0f-978b-67725d55d368"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.608225 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-config-data" (OuterVolumeSpecName: "config-data") pod "48c4a420-af29-4c0f-978b-67725d55d368" (UID: "48c4a420-af29-4c0f-978b-67725d55d368"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.684490 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.684842 5033 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.684857 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.684871 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgwlp\" (UniqueName: \"kubernetes.io/projected/48c4a420-af29-4c0f-978b-67725d55d368-kube-api-access-cgwlp\") on node \"crc\" DevicePath \"\"" Feb 26 
17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.684882 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:18 crc kubenswrapper[5033]: I0226 17:11:18.684893 5033 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48c4a420-af29-4c0f-978b-67725d55d368-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.158919 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2w42q" event={"ID":"48c4a420-af29-4c0f-978b-67725d55d368","Type":"ContainerDied","Data":"63d48a9114e82418165e54ae776583e32accfd20b8e7fd5b0ad0512e554a03e0"} Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.158975 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63d48a9114e82418165e54ae776583e32accfd20b8e7fd5b0ad0512e554a03e0" Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.159415 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2w42q"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.239755 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2w42q"]
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.249819 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2w42q"]
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.330113 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mpzk7"]
Feb 26 17:11:19 crc kubenswrapper[5033]: E0226 17:11:19.330603 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c4a420-af29-4c0f-978b-67725d55d368" containerName="keystone-bootstrap"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.330628 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c4a420-af29-4c0f-978b-67725d55d368" containerName="keystone-bootstrap"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.330873 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c4a420-af29-4c0f-978b-67725d55d368" containerName="keystone-bootstrap"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.331656 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.334647 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.335012 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.335372 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.335717 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.336161 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4kchl"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.337126 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mpzk7"]
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.399758 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-config-data\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.399830 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-credential-keys\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.400043 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-scripts\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.400121 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-fernet-keys\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.400152 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-combined-ca-bundle\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.400205 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pscjx\" (UniqueName: \"kubernetes.io/projected/520a1008-f450-4296-9c63-b8d869095196-kube-api-access-pscjx\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.501865 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-config-data\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.501912 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-credential-keys\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.501968 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-scripts\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.501995 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-fernet-keys\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.502011 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-combined-ca-bundle\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.502038 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pscjx\" (UniqueName: \"kubernetes.io/projected/520a1008-f450-4296-9c63-b8d869095196-kube-api-access-pscjx\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.506387 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-config-data\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.506447 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-scripts\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.507108 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-fernet-keys\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.507639 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-combined-ca-bundle\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.511698 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-credential-keys\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.518087 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pscjx\" (UniqueName: \"kubernetes.io/projected/520a1008-f450-4296-9c63-b8d869095196-kube-api-access-pscjx\") pod \"keystone-bootstrap-mpzk7\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") " pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.644416 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c4a420-af29-4c0f-978b-67725d55d368" path="/var/lib/kubelet/pods/48c4a420-af29-4c0f-978b-67725d55d368/volumes"
Feb 26 17:11:19 crc kubenswrapper[5033]: I0226 17:11:19.660365 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:20 crc kubenswrapper[5033]: I0226 17:11:20.088901 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mpzk7"]
Feb 26 17:11:20 crc kubenswrapper[5033]: I0226 17:11:20.165358 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mpzk7" event={"ID":"520a1008-f450-4296-9c63-b8d869095196","Type":"ContainerStarted","Data":"5a0282b998b7185023e6f944f3d9244a776a7eb3c49bda2f973b241eedf4d777"}
Feb 26 17:11:21 crc kubenswrapper[5033]: I0226 17:11:21.176712 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mpzk7" event={"ID":"520a1008-f450-4296-9c63-b8d869095196","Type":"ContainerStarted","Data":"733b7580735c4191096036e3b8ee1b966877551975a71f346439b63072f4eeb9"}
Feb 26 17:11:22 crc kubenswrapper[5033]: I0226 17:11:22.673098 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw"
Feb 26 17:11:22 crc kubenswrapper[5033]: I0226 17:11:22.700616 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mpzk7" podStartSLOduration=3.70058866 podStartE2EDuration="3.70058866s" podCreationTimestamp="2026-02-26 17:11:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:11:21.202139617 +0000 UTC m=+5408.414738307" watchObservedRunningTime="2026-02-26 17:11:22.70058866 +0000 UTC m=+5409.913187330"
Feb 26 17:11:22 crc kubenswrapper[5033]: I0226 17:11:22.728961 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b8b7f4989-ncbwh"]
Feb 26 17:11:22 crc kubenswrapper[5033]: I0226 17:11:22.729321 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" podUID="0f1e21d2-14f1-48f4-8214-d982806765f6" containerName="dnsmasq-dns" containerID="cri-o://5f97e74232cb55a2c43d9da41b47f8c95029bd2d73e2e132057ca31d32a39d79" gracePeriod=10
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.194926 5033 generic.go:334] "Generic (PLEG): container finished" podID="0f1e21d2-14f1-48f4-8214-d982806765f6" containerID="5f97e74232cb55a2c43d9da41b47f8c95029bd2d73e2e132057ca31d32a39d79" exitCode=0
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.195227 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" event={"ID":"0f1e21d2-14f1-48f4-8214-d982806765f6","Type":"ContainerDied","Data":"5f97e74232cb55a2c43d9da41b47f8c95029bd2d73e2e132057ca31d32a39d79"}
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.284644 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh"
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.374872 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-config\") pod \"0f1e21d2-14f1-48f4-8214-d982806765f6\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") "
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.375007 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-nb\") pod \"0f1e21d2-14f1-48f4-8214-d982806765f6\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") "
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.375071 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-sb\") pod \"0f1e21d2-14f1-48f4-8214-d982806765f6\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") "
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.375100 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-dns-svc\") pod \"0f1e21d2-14f1-48f4-8214-d982806765f6\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") "
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.375120 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmzrq\" (UniqueName: \"kubernetes.io/projected/0f1e21d2-14f1-48f4-8214-d982806765f6-kube-api-access-bmzrq\") pod \"0f1e21d2-14f1-48f4-8214-d982806765f6\" (UID: \"0f1e21d2-14f1-48f4-8214-d982806765f6\") "
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.384020 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f1e21d2-14f1-48f4-8214-d982806765f6-kube-api-access-bmzrq" (OuterVolumeSpecName: "kube-api-access-bmzrq") pod "0f1e21d2-14f1-48f4-8214-d982806765f6" (UID: "0f1e21d2-14f1-48f4-8214-d982806765f6"). InnerVolumeSpecName "kube-api-access-bmzrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.413078 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f1e21d2-14f1-48f4-8214-d982806765f6" (UID: "0f1e21d2-14f1-48f4-8214-d982806765f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.413139 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-config" (OuterVolumeSpecName: "config") pod "0f1e21d2-14f1-48f4-8214-d982806765f6" (UID: "0f1e21d2-14f1-48f4-8214-d982806765f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.415287 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f1e21d2-14f1-48f4-8214-d982806765f6" (UID: "0f1e21d2-14f1-48f4-8214-d982806765f6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.417404 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f1e21d2-14f1-48f4-8214-d982806765f6" (UID: "0f1e21d2-14f1-48f4-8214-d982806765f6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.476592 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.476637 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.476654 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.476667 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmzrq\" (UniqueName: \"kubernetes.io/projected/0f1e21d2-14f1-48f4-8214-d982806765f6-kube-api-access-bmzrq\") on node \"crc\" DevicePath \"\""
Feb 26 17:11:23 crc kubenswrapper[5033]: I0226 17:11:23.476679 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f1e21d2-14f1-48f4-8214-d982806765f6-config\") on node \"crc\" DevicePath \"\""
Feb 26 17:11:24 crc kubenswrapper[5033]: I0226 17:11:24.208183 5033 generic.go:334] "Generic (PLEG): container finished" podID="520a1008-f450-4296-9c63-b8d869095196" containerID="733b7580735c4191096036e3b8ee1b966877551975a71f346439b63072f4eeb9" exitCode=0
Feb 26 17:11:24 crc kubenswrapper[5033]: I0226 17:11:24.208231 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mpzk7" event={"ID":"520a1008-f450-4296-9c63-b8d869095196","Type":"ContainerDied","Data":"733b7580735c4191096036e3b8ee1b966877551975a71f346439b63072f4eeb9"}
Feb 26 17:11:24 crc kubenswrapper[5033]: I0226 17:11:24.211837 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh" event={"ID":"0f1e21d2-14f1-48f4-8214-d982806765f6","Type":"ContainerDied","Data":"0186fde1c3caf06229b75bbc51e663403c0ecad93072463dec5c10edd6e3914d"}
Feb 26 17:11:24 crc kubenswrapper[5033]: I0226 17:11:24.211908 5033 scope.go:117] "RemoveContainer" containerID="5f97e74232cb55a2c43d9da41b47f8c95029bd2d73e2e132057ca31d32a39d79"
Feb 26 17:11:24 crc kubenswrapper[5033]: I0226 17:11:24.211969 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b8b7f4989-ncbwh"
Feb 26 17:11:24 crc kubenswrapper[5033]: I0226 17:11:24.246707 5033 scope.go:117] "RemoveContainer" containerID="95d38db2d4d2cd187362be2c171a06f194190007408682bd8ad61c19c75202b7"
Feb 26 17:11:24 crc kubenswrapper[5033]: I0226 17:11:24.265819 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b8b7f4989-ncbwh"]
Feb 26 17:11:24 crc kubenswrapper[5033]: I0226 17:11:24.273502 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b8b7f4989-ncbwh"]
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.617136 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.647430 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f1e21d2-14f1-48f4-8214-d982806765f6" path="/var/lib/kubelet/pods/0f1e21d2-14f1-48f4-8214-d982806765f6/volumes"
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.716395 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-fernet-keys\") pod \"520a1008-f450-4296-9c63-b8d869095196\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") "
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.716440 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-credential-keys\") pod \"520a1008-f450-4296-9c63-b8d869095196\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") "
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.716518 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-combined-ca-bundle\") pod \"520a1008-f450-4296-9c63-b8d869095196\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") "
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.716566 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-scripts\") pod \"520a1008-f450-4296-9c63-b8d869095196\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") "
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.716595 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pscjx\" (UniqueName: \"kubernetes.io/projected/520a1008-f450-4296-9c63-b8d869095196-kube-api-access-pscjx\") pod \"520a1008-f450-4296-9c63-b8d869095196\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") "
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.716654 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-config-data\") pod \"520a1008-f450-4296-9c63-b8d869095196\" (UID: \"520a1008-f450-4296-9c63-b8d869095196\") "
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.722437 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-scripts" (OuterVolumeSpecName: "scripts") pod "520a1008-f450-4296-9c63-b8d869095196" (UID: "520a1008-f450-4296-9c63-b8d869095196"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.722489 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "520a1008-f450-4296-9c63-b8d869095196" (UID: "520a1008-f450-4296-9c63-b8d869095196"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.725901 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/520a1008-f450-4296-9c63-b8d869095196-kube-api-access-pscjx" (OuterVolumeSpecName: "kube-api-access-pscjx") pod "520a1008-f450-4296-9c63-b8d869095196" (UID: "520a1008-f450-4296-9c63-b8d869095196"). InnerVolumeSpecName "kube-api-access-pscjx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.726323 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "520a1008-f450-4296-9c63-b8d869095196" (UID: "520a1008-f450-4296-9c63-b8d869095196"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.744920 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "520a1008-f450-4296-9c63-b8d869095196" (UID: "520a1008-f450-4296-9c63-b8d869095196"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.746304 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-config-data" (OuterVolumeSpecName: "config-data") pod "520a1008-f450-4296-9c63-b8d869095196" (UID: "520a1008-f450-4296-9c63-b8d869095196"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.818930 5033 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.818961 5033 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.818972 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.818981 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.818989 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pscjx\" (UniqueName: \"kubernetes.io/projected/520a1008-f450-4296-9c63-b8d869095196-kube-api-access-pscjx\") on node \"crc\" DevicePath \"\""
Feb 26 17:11:25 crc kubenswrapper[5033]: I0226 17:11:25.818997 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520a1008-f450-4296-9c63-b8d869095196-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.234271 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mpzk7" event={"ID":"520a1008-f450-4296-9c63-b8d869095196","Type":"ContainerDied","Data":"5a0282b998b7185023e6f944f3d9244a776a7eb3c49bda2f973b241eedf4d777"}
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.234312 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a0282b998b7185023e6f944f3d9244a776a7eb3c49bda2f973b241eedf4d777"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.234318 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mpzk7"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.311333 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7b6d888c87-76db4"]
Feb 26 17:11:26 crc kubenswrapper[5033]: E0226 17:11:26.311650 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f1e21d2-14f1-48f4-8214-d982806765f6" containerName="init"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.311666 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1e21d2-14f1-48f4-8214-d982806765f6" containerName="init"
Feb 26 17:11:26 crc kubenswrapper[5033]: E0226 17:11:26.311684 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520a1008-f450-4296-9c63-b8d869095196" containerName="keystone-bootstrap"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.311692 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="520a1008-f450-4296-9c63-b8d869095196" containerName="keystone-bootstrap"
Feb 26 17:11:26 crc kubenswrapper[5033]: E0226 17:11:26.311730 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f1e21d2-14f1-48f4-8214-d982806765f6" containerName="dnsmasq-dns"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.311737 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1e21d2-14f1-48f4-8214-d982806765f6" containerName="dnsmasq-dns"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.311873 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="520a1008-f450-4296-9c63-b8d869095196" containerName="keystone-bootstrap"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.311905 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f1e21d2-14f1-48f4-8214-d982806765f6" containerName="dnsmasq-dns"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.312448 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.314131 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4kchl"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.314779 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.314827 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.324501 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b6d888c87-76db4"]
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.325084 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.428506 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-combined-ca-bundle\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.428578 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-credential-keys\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.428721 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-scripts\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.428834 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-config-data\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.428884 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-fernet-keys\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.428924 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q646v\" (UniqueName: \"kubernetes.io/projected/1398b95c-4410-4eb1-a84c-d9df58c4aad3-kube-api-access-q646v\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.530780 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-credential-keys\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.531519 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-scripts\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.531577 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-config-data\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.531624 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-fernet-keys\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.531648 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q646v\" (UniqueName: \"kubernetes.io/projected/1398b95c-4410-4eb1-a84c-d9df58c4aad3-kube-api-access-q646v\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.531706 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-combined-ca-bundle\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.534948 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-credential-keys\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.535910 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-config-data\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.536766 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-scripts\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.536983 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-fernet-keys\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.545987 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q646v\" (UniqueName: \"kubernetes.io/projected/1398b95c-4410-4eb1-a84c-d9df58c4aad3-kube-api-access-q646v\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.546437 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1398b95c-4410-4eb1-a84c-d9df58c4aad3-combined-ca-bundle\") pod \"keystone-7b6d888c87-76db4\" (UID: \"1398b95c-4410-4eb1-a84c-d9df58c4aad3\") " pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:26 crc kubenswrapper[5033]: I0226 17:11:26.638887 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:27 crc kubenswrapper[5033]: I0226 17:11:27.158558 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b6d888c87-76db4"]
Feb 26 17:11:27 crc kubenswrapper[5033]: I0226 17:11:27.246981 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b6d888c87-76db4" event={"ID":"1398b95c-4410-4eb1-a84c-d9df58c4aad3","Type":"ContainerStarted","Data":"ccbeb73be07f14a67aa5edcf633df15249d93dfc02a897883a77248526f9db22"}
Feb 26 17:11:28 crc kubenswrapper[5033]: I0226 17:11:28.264599 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b6d888c87-76db4" event={"ID":"1398b95c-4410-4eb1-a84c-d9df58c4aad3","Type":"ContainerStarted","Data":"4bec9dbd2ba2051a96b9a2aefec46d07fa2e3b02ce760dff93af70e0a11c38d2"}
Feb 26 17:11:28 crc kubenswrapper[5033]: I0226 17:11:28.265239 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:11:28 crc kubenswrapper[5033]: I0226 17:11:28.287141 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7b6d888c87-76db4" podStartSLOduration=2.287120427 podStartE2EDuration="2.287120427s" podCreationTimestamp="2026-02-26 17:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:11:28.283369146 +0000 UTC m=+5415.495967816" watchObservedRunningTime="2026-02-26 17:11:28.287120427 +0000 UTC m=+5415.499719087"
Feb 26 17:11:58 crc kubenswrapper[5033]: I0226 17:11:58.135395 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7b6d888c87-76db4"
Feb 26 17:12:00 crc kubenswrapper[5033]: I0226 17:12:00.146005 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535432-6jv8g"]
Feb 26 17:12:00 crc kubenswrapper[5033]: I0226 17:12:00.147863 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535432-6jv8g"
Feb 26 17:12:00 crc kubenswrapper[5033]: I0226 17:12:00.151899 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 17:12:00 crc kubenswrapper[5033]: I0226 17:12:00.151985 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 17:12:00 crc kubenswrapper[5033]: I0226 17:12:00.156711 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 17:12:00 crc kubenswrapper[5033]: I0226 17:12:00.167571 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535432-6jv8g"]
Feb 26 17:12:00 crc kubenswrapper[5033]: I0226 17:12:00.287815 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k6d6\" (UniqueName: \"kubernetes.io/projected/d6741c25-d626-4b56-9b5c-7018756a17c7-kube-api-access-5k6d6\") pod \"auto-csr-approver-29535432-6jv8g\" (UID: \"d6741c25-d626-4b56-9b5c-7018756a17c7\") " pod="openshift-infra/auto-csr-approver-29535432-6jv8g"
Feb 26 17:12:00 crc kubenswrapper[5033]: I0226 17:12:00.389916 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k6d6\" (UniqueName: \"kubernetes.io/projected/d6741c25-d626-4b56-9b5c-7018756a17c7-kube-api-access-5k6d6\") pod \"auto-csr-approver-29535432-6jv8g\" (UID: \"d6741c25-d626-4b56-9b5c-7018756a17c7\") " pod="openshift-infra/auto-csr-approver-29535432-6jv8g"
Feb 26 17:12:00 crc kubenswrapper[5033]: I0226 17:12:00.426064 5033 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"kube-api-access-5k6d6\" (UniqueName: \"kubernetes.io/projected/d6741c25-d626-4b56-9b5c-7018756a17c7-kube-api-access-5k6d6\") pod \"auto-csr-approver-29535432-6jv8g\" (UID: \"d6741c25-d626-4b56-9b5c-7018756a17c7\") " pod="openshift-infra/auto-csr-approver-29535432-6jv8g" Feb 26 17:12:00 crc kubenswrapper[5033]: I0226 17:12:00.478243 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535432-6jv8g" Feb 26 17:12:00 crc kubenswrapper[5033]: I0226 17:12:00.963167 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535432-6jv8g"] Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.357086 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.358682 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.360556 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.361561 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.366130 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-xfdz2" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.368668 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.506443 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config\") pod \"openstackclient\" (UID: 
\"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " pod="openstack/openstackclient" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.506581 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config-secret\") pod \"openstackclient\" (UID: \"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " pod="openstack/openstackclient" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.506627 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82c7x\" (UniqueName: \"kubernetes.io/projected/01250dc7-efa2-45f2-87d6-5d1a865369c2-kube-api-access-82c7x\") pod \"openstackclient\" (UID: \"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " pod="openstack/openstackclient" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.586678 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535432-6jv8g" event={"ID":"d6741c25-d626-4b56-9b5c-7018756a17c7","Type":"ContainerStarted","Data":"ebcabcf6419fdc3ec78564e710564bcb6be1a1d1b21dd0ef60248ea88e02daa5"} Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.607670 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config-secret\") pod \"openstackclient\" (UID: \"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " pod="openstack/openstackclient" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.607997 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82c7x\" (UniqueName: \"kubernetes.io/projected/01250dc7-efa2-45f2-87d6-5d1a865369c2-kube-api-access-82c7x\") pod \"openstackclient\" (UID: \"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " pod="openstack/openstackclient" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 
17:12:01.608161 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config\") pod \"openstackclient\" (UID: \"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " pod="openstack/openstackclient" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.610064 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config\") pod \"openstackclient\" (UID: \"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " pod="openstack/openstackclient" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.615841 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config-secret\") pod \"openstackclient\" (UID: \"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " pod="openstack/openstackclient" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.629223 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82c7x\" (UniqueName: \"kubernetes.io/projected/01250dc7-efa2-45f2-87d6-5d1a865369c2-kube-api-access-82c7x\") pod \"openstackclient\" (UID: \"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " pod="openstack/openstackclient" Feb 26 17:12:01 crc kubenswrapper[5033]: I0226 17:12:01.698656 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 26 17:12:02 crc kubenswrapper[5033]: I0226 17:12:02.176736 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 17:12:02 crc kubenswrapper[5033]: W0226 17:12:02.186406 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01250dc7_efa2_45f2_87d6_5d1a865369c2.slice/crio-e6b6cbe7bdd88c8d1c7c0449719fb40ac2ceba3197b2b7377290dbd54ac26786 WatchSource:0}: Error finding container e6b6cbe7bdd88c8d1c7c0449719fb40ac2ceba3197b2b7377290dbd54ac26786: Status 404 returned error can't find the container with id e6b6cbe7bdd88c8d1c7c0449719fb40ac2ceba3197b2b7377290dbd54ac26786 Feb 26 17:12:02 crc kubenswrapper[5033]: I0226 17:12:02.596894 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"01250dc7-efa2-45f2-87d6-5d1a865369c2","Type":"ContainerStarted","Data":"8d0b492ffb7fcea538d8e31052eeb122a7fee3cbee1a0ee552cf46261b91ab82"} Feb 26 17:12:02 crc kubenswrapper[5033]: I0226 17:12:02.597167 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"01250dc7-efa2-45f2-87d6-5d1a865369c2","Type":"ContainerStarted","Data":"e6b6cbe7bdd88c8d1c7c0449719fb40ac2ceba3197b2b7377290dbd54ac26786"} Feb 26 17:12:02 crc kubenswrapper[5033]: I0226 17:12:02.601185 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535432-6jv8g" event={"ID":"d6741c25-d626-4b56-9b5c-7018756a17c7","Type":"ContainerStarted","Data":"820baf1ca6542c49ede7e348e7648f6b27e8a1cd6cef94820ed817fb47c87abc"} Feb 26 17:12:02 crc kubenswrapper[5033]: I0226 17:12:02.615160 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.615139118 podStartE2EDuration="1.615139118s" podCreationTimestamp="2026-02-26 17:12:01 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:12:02.610059163 +0000 UTC m=+5449.822657833" watchObservedRunningTime="2026-02-26 17:12:02.615139118 +0000 UTC m=+5449.827737768" Feb 26 17:12:02 crc kubenswrapper[5033]: I0226 17:12:02.628017 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535432-6jv8g" podStartSLOduration=1.3585969869999999 podStartE2EDuration="2.627994141s" podCreationTimestamp="2026-02-26 17:12:00 +0000 UTC" firstStartedPulling="2026-02-26 17:12:00.969899562 +0000 UTC m=+5448.182498232" lastFinishedPulling="2026-02-26 17:12:02.239296736 +0000 UTC m=+5449.451895386" observedRunningTime="2026-02-26 17:12:02.623795149 +0000 UTC m=+5449.836393799" watchObservedRunningTime="2026-02-26 17:12:02.627994141 +0000 UTC m=+5449.840592791" Feb 26 17:12:03 crc kubenswrapper[5033]: I0226 17:12:03.613382 5033 generic.go:334] "Generic (PLEG): container finished" podID="d6741c25-d626-4b56-9b5c-7018756a17c7" containerID="820baf1ca6542c49ede7e348e7648f6b27e8a1cd6cef94820ed817fb47c87abc" exitCode=0 Feb 26 17:12:03 crc kubenswrapper[5033]: I0226 17:12:03.613478 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535432-6jv8g" event={"ID":"d6741c25-d626-4b56-9b5c-7018756a17c7","Type":"ContainerDied","Data":"820baf1ca6542c49ede7e348e7648f6b27e8a1cd6cef94820ed817fb47c87abc"} Feb 26 17:12:05 crc kubenswrapper[5033]: I0226 17:12:05.024276 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535432-6jv8g" Feb 26 17:12:05 crc kubenswrapper[5033]: I0226 17:12:05.193765 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k6d6\" (UniqueName: \"kubernetes.io/projected/d6741c25-d626-4b56-9b5c-7018756a17c7-kube-api-access-5k6d6\") pod \"d6741c25-d626-4b56-9b5c-7018756a17c7\" (UID: \"d6741c25-d626-4b56-9b5c-7018756a17c7\") " Feb 26 17:12:05 crc kubenswrapper[5033]: I0226 17:12:05.200362 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6741c25-d626-4b56-9b5c-7018756a17c7-kube-api-access-5k6d6" (OuterVolumeSpecName: "kube-api-access-5k6d6") pod "d6741c25-d626-4b56-9b5c-7018756a17c7" (UID: "d6741c25-d626-4b56-9b5c-7018756a17c7"). InnerVolumeSpecName "kube-api-access-5k6d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:12:05 crc kubenswrapper[5033]: I0226 17:12:05.296443 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k6d6\" (UniqueName: \"kubernetes.io/projected/d6741c25-d626-4b56-9b5c-7018756a17c7-kube-api-access-5k6d6\") on node \"crc\" DevicePath \"\"" Feb 26 17:12:05 crc kubenswrapper[5033]: I0226 17:12:05.676949 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535432-6jv8g" event={"ID":"d6741c25-d626-4b56-9b5c-7018756a17c7","Type":"ContainerDied","Data":"ebcabcf6419fdc3ec78564e710564bcb6be1a1d1b21dd0ef60248ea88e02daa5"} Feb 26 17:12:05 crc kubenswrapper[5033]: I0226 17:12:05.677000 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebcabcf6419fdc3ec78564e710564bcb6be1a1d1b21dd0ef60248ea88e02daa5" Feb 26 17:12:05 crc kubenswrapper[5033]: I0226 17:12:05.677073 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535432-6jv8g" Feb 26 17:12:05 crc kubenswrapper[5033]: I0226 17:12:05.711774 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535426-j47rw"] Feb 26 17:12:05 crc kubenswrapper[5033]: I0226 17:12:05.730445 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535426-j47rw"] Feb 26 17:12:07 crc kubenswrapper[5033]: I0226 17:12:07.644789 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abbf1d0f-7a41-4cc6-8607-16b42059a949" path="/var/lib/kubelet/pods/abbf1d0f-7a41-4cc6-8607-16b42059a949/volumes" Feb 26 17:12:33 crc kubenswrapper[5033]: I0226 17:12:33.596400 5033 scope.go:117] "RemoveContainer" containerID="8811b4cc8dc71d12c9f80c1ddae913550b1d050cf23cef00480eb8a4ac21f472" Feb 26 17:12:50 crc kubenswrapper[5033]: I0226 17:12:50.189623 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:12:50 crc kubenswrapper[5033]: I0226 17:12:50.190506 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:13:20 crc kubenswrapper[5033]: I0226 17:13:20.188887 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:13:20 crc kubenswrapper[5033]: 
I0226 17:13:20.189571 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:13:25 crc kubenswrapper[5033]: I0226 17:13:25.075684 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-sskqf"] Feb 26 17:13:25 crc kubenswrapper[5033]: I0226 17:13:25.084782 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-sskqf"] Feb 26 17:13:25 crc kubenswrapper[5033]: I0226 17:13:25.637880 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bcf0cce-fae8-4d24-b107-fff2c4787a42" path="/var/lib/kubelet/pods/1bcf0cce-fae8-4d24-b107-fff2c4787a42/volumes" Feb 26 17:13:33 crc kubenswrapper[5033]: I0226 17:13:33.701813 5033 scope.go:117] "RemoveContainer" containerID="4c26a5e4b2f6dcd406405fd65a5893444fc444113bda9573269d48ce80b09161" Feb 26 17:13:43 crc kubenswrapper[5033]: I0226 17:13:43.952709 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-g4txj"] Feb 26 17:13:43 crc kubenswrapper[5033]: E0226 17:13:43.954179 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6741c25-d626-4b56-9b5c-7018756a17c7" containerName="oc" Feb 26 17:13:43 crc kubenswrapper[5033]: I0226 17:13:43.954205 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6741c25-d626-4b56-9b5c-7018756a17c7" containerName="oc" Feb 26 17:13:43 crc kubenswrapper[5033]: I0226 17:13:43.954507 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6741c25-d626-4b56-9b5c-7018756a17c7" containerName="oc" Feb 26 17:13:43 crc kubenswrapper[5033]: I0226 17:13:43.955456 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-g4txj" Feb 26 17:13:43 crc kubenswrapper[5033]: I0226 17:13:43.959934 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-eeae-account-create-update-z7z4m"] Feb 26 17:13:43 crc kubenswrapper[5033]: I0226 17:13:43.961322 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-eeae-account-create-update-z7z4m" Feb 26 17:13:43 crc kubenswrapper[5033]: I0226 17:13:43.962954 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 26 17:13:43 crc kubenswrapper[5033]: I0226 17:13:43.974788 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-eeae-account-create-update-z7z4m"] Feb 26 17:13:43 crc kubenswrapper[5033]: I0226 17:13:43.981961 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-g4txj"] Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.147644 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39bf9167-60ba-4040-aa99-a636b88e6a0b-operator-scripts\") pod \"barbican-db-create-g4txj\" (UID: \"39bf9167-60ba-4040-aa99-a636b88e6a0b\") " pod="openstack/barbican-db-create-g4txj" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.147730 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-operator-scripts\") pod \"barbican-eeae-account-create-update-z7z4m\" (UID: \"bac90d32-0947-4b73-9a7d-c1e6c16bdb85\") " pod="openstack/barbican-eeae-account-create-update-z7z4m" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.147759 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xbsd\" (UniqueName: 
\"kubernetes.io/projected/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-kube-api-access-4xbsd\") pod \"barbican-eeae-account-create-update-z7z4m\" (UID: \"bac90d32-0947-4b73-9a7d-c1e6c16bdb85\") " pod="openstack/barbican-eeae-account-create-update-z7z4m" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.147905 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdpdk\" (UniqueName: \"kubernetes.io/projected/39bf9167-60ba-4040-aa99-a636b88e6a0b-kube-api-access-jdpdk\") pod \"barbican-db-create-g4txj\" (UID: \"39bf9167-60ba-4040-aa99-a636b88e6a0b\") " pod="openstack/barbican-db-create-g4txj" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.249248 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-operator-scripts\") pod \"barbican-eeae-account-create-update-z7z4m\" (UID: \"bac90d32-0947-4b73-9a7d-c1e6c16bdb85\") " pod="openstack/barbican-eeae-account-create-update-z7z4m" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.249323 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbsd\" (UniqueName: \"kubernetes.io/projected/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-kube-api-access-4xbsd\") pod \"barbican-eeae-account-create-update-z7z4m\" (UID: \"bac90d32-0947-4b73-9a7d-c1e6c16bdb85\") " pod="openstack/barbican-eeae-account-create-update-z7z4m" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.249386 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdpdk\" (UniqueName: \"kubernetes.io/projected/39bf9167-60ba-4040-aa99-a636b88e6a0b-kube-api-access-jdpdk\") pod \"barbican-db-create-g4txj\" (UID: \"39bf9167-60ba-4040-aa99-a636b88e6a0b\") " pod="openstack/barbican-db-create-g4txj" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.249432 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39bf9167-60ba-4040-aa99-a636b88e6a0b-operator-scripts\") pod \"barbican-db-create-g4txj\" (UID: \"39bf9167-60ba-4040-aa99-a636b88e6a0b\") " pod="openstack/barbican-db-create-g4txj" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.250704 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39bf9167-60ba-4040-aa99-a636b88e6a0b-operator-scripts\") pod \"barbican-db-create-g4txj\" (UID: \"39bf9167-60ba-4040-aa99-a636b88e6a0b\") " pod="openstack/barbican-db-create-g4txj" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.250844 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-operator-scripts\") pod \"barbican-eeae-account-create-update-z7z4m\" (UID: \"bac90d32-0947-4b73-9a7d-c1e6c16bdb85\") " pod="openstack/barbican-eeae-account-create-update-z7z4m" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.277462 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xbsd\" (UniqueName: \"kubernetes.io/projected/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-kube-api-access-4xbsd\") pod \"barbican-eeae-account-create-update-z7z4m\" (UID: \"bac90d32-0947-4b73-9a7d-c1e6c16bdb85\") " pod="openstack/barbican-eeae-account-create-update-z7z4m" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.281763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdpdk\" (UniqueName: \"kubernetes.io/projected/39bf9167-60ba-4040-aa99-a636b88e6a0b-kube-api-access-jdpdk\") pod \"barbican-db-create-g4txj\" (UID: \"39bf9167-60ba-4040-aa99-a636b88e6a0b\") " pod="openstack/barbican-db-create-g4txj" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.283068 5033 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-db-create-g4txj" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.295232 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-eeae-account-create-update-z7z4m" Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.758462 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-eeae-account-create-update-z7z4m"] Feb 26 17:13:44 crc kubenswrapper[5033]: I0226 17:13:44.841701 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-g4txj"] Feb 26 17:13:45 crc kubenswrapper[5033]: I0226 17:13:45.569347 5033 generic.go:334] "Generic (PLEG): container finished" podID="bac90d32-0947-4b73-9a7d-c1e6c16bdb85" containerID="f9bf7194a0d7d166dd88cbf6a26439db02c4bf0a3587748b84b35f4ab5d18827" exitCode=0 Feb 26 17:13:45 crc kubenswrapper[5033]: I0226 17:13:45.569455 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-eeae-account-create-update-z7z4m" event={"ID":"bac90d32-0947-4b73-9a7d-c1e6c16bdb85","Type":"ContainerDied","Data":"f9bf7194a0d7d166dd88cbf6a26439db02c4bf0a3587748b84b35f4ab5d18827"} Feb 26 17:13:45 crc kubenswrapper[5033]: I0226 17:13:45.571477 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-eeae-account-create-update-z7z4m" event={"ID":"bac90d32-0947-4b73-9a7d-c1e6c16bdb85","Type":"ContainerStarted","Data":"ab1c6b5ad4594675d30492b95dc7b28ded5b7279b1762e10d28dbd0512e7434c"} Feb 26 17:13:45 crc kubenswrapper[5033]: I0226 17:13:45.575673 5033 generic.go:334] "Generic (PLEG): container finished" podID="39bf9167-60ba-4040-aa99-a636b88e6a0b" containerID="0b6fe038fb10d3f42b68369112bd2a627f278d8d71427b0ec76eb5a5b917e43c" exitCode=0 Feb 26 17:13:45 crc kubenswrapper[5033]: I0226 17:13:45.575762 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g4txj" 
event={"ID":"39bf9167-60ba-4040-aa99-a636b88e6a0b","Type":"ContainerDied","Data":"0b6fe038fb10d3f42b68369112bd2a627f278d8d71427b0ec76eb5a5b917e43c"} Feb 26 17:13:45 crc kubenswrapper[5033]: I0226 17:13:45.575798 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g4txj" event={"ID":"39bf9167-60ba-4040-aa99-a636b88e6a0b","Type":"ContainerStarted","Data":"bb8e1a084ab8ba0c5838cb1e9df3a2503773ac3778d417b22aaf7212a0f77692"} Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.006001 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-eeae-account-create-update-z7z4m" Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.012740 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g4txj" Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.117006 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xbsd\" (UniqueName: \"kubernetes.io/projected/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-kube-api-access-4xbsd\") pod \"bac90d32-0947-4b73-9a7d-c1e6c16bdb85\" (UID: \"bac90d32-0947-4b73-9a7d-c1e6c16bdb85\") " Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.117113 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-operator-scripts\") pod \"bac90d32-0947-4b73-9a7d-c1e6c16bdb85\" (UID: \"bac90d32-0947-4b73-9a7d-c1e6c16bdb85\") " Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.117199 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdpdk\" (UniqueName: \"kubernetes.io/projected/39bf9167-60ba-4040-aa99-a636b88e6a0b-kube-api-access-jdpdk\") pod \"39bf9167-60ba-4040-aa99-a636b88e6a0b\" (UID: \"39bf9167-60ba-4040-aa99-a636b88e6a0b\") " Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 
17:13:47.117352 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39bf9167-60ba-4040-aa99-a636b88e6a0b-operator-scripts\") pod \"39bf9167-60ba-4040-aa99-a636b88e6a0b\" (UID: \"39bf9167-60ba-4040-aa99-a636b88e6a0b\") "
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.118183 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bac90d32-0947-4b73-9a7d-c1e6c16bdb85" (UID: "bac90d32-0947-4b73-9a7d-c1e6c16bdb85"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.118654 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.118899 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39bf9167-60ba-4040-aa99-a636b88e6a0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39bf9167-60ba-4040-aa99-a636b88e6a0b" (UID: "39bf9167-60ba-4040-aa99-a636b88e6a0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.124070 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39bf9167-60ba-4040-aa99-a636b88e6a0b-kube-api-access-jdpdk" (OuterVolumeSpecName: "kube-api-access-jdpdk") pod "39bf9167-60ba-4040-aa99-a636b88e6a0b" (UID: "39bf9167-60ba-4040-aa99-a636b88e6a0b"). InnerVolumeSpecName "kube-api-access-jdpdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.126393 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-kube-api-access-4xbsd" (OuterVolumeSpecName: "kube-api-access-4xbsd") pod "bac90d32-0947-4b73-9a7d-c1e6c16bdb85" (UID: "bac90d32-0947-4b73-9a7d-c1e6c16bdb85"). InnerVolumeSpecName "kube-api-access-4xbsd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.220810 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xbsd\" (UniqueName: \"kubernetes.io/projected/bac90d32-0947-4b73-9a7d-c1e6c16bdb85-kube-api-access-4xbsd\") on node \"crc\" DevicePath \"\""
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.220869 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdpdk\" (UniqueName: \"kubernetes.io/projected/39bf9167-60ba-4040-aa99-a636b88e6a0b-kube-api-access-jdpdk\") on node \"crc\" DevicePath \"\""
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.220891 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39bf9167-60ba-4040-aa99-a636b88e6a0b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.599058 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-eeae-account-create-update-z7z4m"
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.599060 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-eeae-account-create-update-z7z4m" event={"ID":"bac90d32-0947-4b73-9a7d-c1e6c16bdb85","Type":"ContainerDied","Data":"ab1c6b5ad4594675d30492b95dc7b28ded5b7279b1762e10d28dbd0512e7434c"}
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.599334 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab1c6b5ad4594675d30492b95dc7b28ded5b7279b1762e10d28dbd0512e7434c"
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.601774 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g4txj" event={"ID":"39bf9167-60ba-4040-aa99-a636b88e6a0b","Type":"ContainerDied","Data":"bb8e1a084ab8ba0c5838cb1e9df3a2503773ac3778d417b22aaf7212a0f77692"}
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.601852 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb8e1a084ab8ba0c5838cb1e9df3a2503773ac3778d417b22aaf7212a0f77692"
Feb 26 17:13:47 crc kubenswrapper[5033]: I0226 17:13:47.601935 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g4txj"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.273635 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-8q9fh"]
Feb 26 17:13:49 crc kubenswrapper[5033]: E0226 17:13:49.276852 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bac90d32-0947-4b73-9a7d-c1e6c16bdb85" containerName="mariadb-account-create-update"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.276884 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac90d32-0947-4b73-9a7d-c1e6c16bdb85" containerName="mariadb-account-create-update"
Feb 26 17:13:49 crc kubenswrapper[5033]: E0226 17:13:49.276916 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39bf9167-60ba-4040-aa99-a636b88e6a0b" containerName="mariadb-database-create"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.276926 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="39bf9167-60ba-4040-aa99-a636b88e6a0b" containerName="mariadb-database-create"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.277159 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bac90d32-0947-4b73-9a7d-c1e6c16bdb85" containerName="mariadb-account-create-update"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.277185 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="39bf9167-60ba-4040-aa99-a636b88e6a0b" containerName="mariadb-database-create"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.278037 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.290628 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mkttr"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.292501 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.295374 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8q9fh"]
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.460743 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98s6k\" (UniqueName: \"kubernetes.io/projected/fe38b338-9cef-49ae-a3f5-356667495c68-kube-api-access-98s6k\") pod \"barbican-db-sync-8q9fh\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") " pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.460920 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-db-sync-config-data\") pod \"barbican-db-sync-8q9fh\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") " pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.460981 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-combined-ca-bundle\") pod \"barbican-db-sync-8q9fh\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") " pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.562937 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98s6k\" (UniqueName: \"kubernetes.io/projected/fe38b338-9cef-49ae-a3f5-356667495c68-kube-api-access-98s6k\") pod \"barbican-db-sync-8q9fh\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") " pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.563073 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-db-sync-config-data\") pod \"barbican-db-sync-8q9fh\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") " pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.563114 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-combined-ca-bundle\") pod \"barbican-db-sync-8q9fh\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") " pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.568911 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-combined-ca-bundle\") pod \"barbican-db-sync-8q9fh\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") " pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.578947 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-db-sync-config-data\") pod \"barbican-db-sync-8q9fh\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") " pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.586009 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98s6k\" (UniqueName: \"kubernetes.io/projected/fe38b338-9cef-49ae-a3f5-356667495c68-kube-api-access-98s6k\") pod \"barbican-db-sync-8q9fh\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") " pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:49 crc kubenswrapper[5033]: I0226 17:13:49.617161 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.079479 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8q9fh"]
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.189425 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.189896 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.189965 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5"
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.191477 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e12b311251615971bfc352b62c31819aa1843ba8a719abffbebfc499b24d9582"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.191583 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://e12b311251615971bfc352b62c31819aa1843ba8a719abffbebfc499b24d9582" gracePeriod=600
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.623380 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8q9fh" event={"ID":"fe38b338-9cef-49ae-a3f5-356667495c68","Type":"ContainerStarted","Data":"70c8726ee065ee26976877ae88ccb44e6a3d523959c097924c8c9d7b80f3f227"}
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.623750 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8q9fh" event={"ID":"fe38b338-9cef-49ae-a3f5-356667495c68","Type":"ContainerStarted","Data":"84b5b797757f60fd4612fab634857cd69b56c6b639181a9e047ab3482d6154c1"}
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.626364 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="e12b311251615971bfc352b62c31819aa1843ba8a719abffbebfc499b24d9582" exitCode=0
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.626411 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"e12b311251615971bfc352b62c31819aa1843ba8a719abffbebfc499b24d9582"}
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.626445 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"}
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.626465 5033 scope.go:117] "RemoveContainer" containerID="bdb1f2f5268959f0a7ed7bf58d4555bdf41679f7649b8b749b458e0f074f9bc4"
Feb 26 17:13:50 crc kubenswrapper[5033]: I0226 17:13:50.662332 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-8q9fh" podStartSLOduration=1.662316066 podStartE2EDuration="1.662316066s" podCreationTimestamp="2026-02-26 17:13:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:13:50.641839436 +0000 UTC m=+5557.854438106" watchObservedRunningTime="2026-02-26 17:13:50.662316066 +0000 UTC m=+5557.874914716"
Feb 26 17:13:51 crc kubenswrapper[5033]: I0226 17:13:51.637325 5033 generic.go:334] "Generic (PLEG): container finished" podID="fe38b338-9cef-49ae-a3f5-356667495c68" containerID="70c8726ee065ee26976877ae88ccb44e6a3d523959c097924c8c9d7b80f3f227" exitCode=0
Feb 26 17:13:51 crc kubenswrapper[5033]: I0226 17:13:51.637612 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8q9fh" event={"ID":"fe38b338-9cef-49ae-a3f5-356667495c68","Type":"ContainerDied","Data":"70c8726ee065ee26976877ae88ccb44e6a3d523959c097924c8c9d7b80f3f227"}
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.019558 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.116789 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-db-sync-config-data\") pod \"fe38b338-9cef-49ae-a3f5-356667495c68\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") "
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.116883 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-combined-ca-bundle\") pod \"fe38b338-9cef-49ae-a3f5-356667495c68\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") "
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.116931 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98s6k\" (UniqueName: \"kubernetes.io/projected/fe38b338-9cef-49ae-a3f5-356667495c68-kube-api-access-98s6k\") pod \"fe38b338-9cef-49ae-a3f5-356667495c68\" (UID: \"fe38b338-9cef-49ae-a3f5-356667495c68\") "
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.121955 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe38b338-9cef-49ae-a3f5-356667495c68-kube-api-access-98s6k" (OuterVolumeSpecName: "kube-api-access-98s6k") pod "fe38b338-9cef-49ae-a3f5-356667495c68" (UID: "fe38b338-9cef-49ae-a3f5-356667495c68"). InnerVolumeSpecName "kube-api-access-98s6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.122060 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fe38b338-9cef-49ae-a3f5-356667495c68" (UID: "fe38b338-9cef-49ae-a3f5-356667495c68"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.144001 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe38b338-9cef-49ae-a3f5-356667495c68" (UID: "fe38b338-9cef-49ae-a3f5-356667495c68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.219997 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.220060 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98s6k\" (UniqueName: \"kubernetes.io/projected/fe38b338-9cef-49ae-a3f5-356667495c68-kube-api-access-98s6k\") on node \"crc\" DevicePath \"\""
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.220080 5033 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe38b338-9cef-49ae-a3f5-356667495c68-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.655109 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8q9fh" event={"ID":"fe38b338-9cef-49ae-a3f5-356667495c68","Type":"ContainerDied","Data":"84b5b797757f60fd4612fab634857cd69b56c6b639181a9e047ab3482d6154c1"}
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.655156 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84b5b797757f60fd4612fab634857cd69b56c6b639181a9e047ab3482d6154c1"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.655216 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8q9fh"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.885679 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-84bd445555-qg8pl"]
Feb 26 17:13:53 crc kubenswrapper[5033]: E0226 17:13:53.886337 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe38b338-9cef-49ae-a3f5-356667495c68" containerName="barbican-db-sync"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.886356 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe38b338-9cef-49ae-a3f5-356667495c68" containerName="barbican-db-sync"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.886517 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe38b338-9cef-49ae-a3f5-356667495c68" containerName="barbican-db-sync"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.887356 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.890736 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.890912 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mkttr"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.891626 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.898544 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6798749c56-qfcf4"]
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.899798 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.908470 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.920146 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-84bd445555-qg8pl"]
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.933758 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da8187b8-d72f-4471-9ebc-6846bf6a03f8-config-data-custom\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.933804 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnw5f\" (UniqueName: \"kubernetes.io/projected/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-kube-api-access-jnw5f\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.933852 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-logs\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.933879 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8187b8-d72f-4471-9ebc-6846bf6a03f8-combined-ca-bundle\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.933896 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-config-data\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.933932 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8187b8-d72f-4471-9ebc-6846bf6a03f8-config-data\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.933962 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8187b8-d72f-4471-9ebc-6846bf6a03f8-logs\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.933979 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pszc\" (UniqueName: \"kubernetes.io/projected/da8187b8-d72f-4471-9ebc-6846bf6a03f8-kube-api-access-7pszc\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.934014 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-combined-ca-bundle\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.934031 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-config-data-custom\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:53 crc kubenswrapper[5033]: I0226 17:13:53.945772 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6798749c56-qfcf4"]
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.036781 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8187b8-d72f-4471-9ebc-6846bf6a03f8-config-data\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.036867 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8187b8-d72f-4471-9ebc-6846bf6a03f8-logs\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.036903 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pszc\" (UniqueName: \"kubernetes.io/projected/da8187b8-d72f-4471-9ebc-6846bf6a03f8-kube-api-access-7pszc\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.036961 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-combined-ca-bundle\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.036994 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-config-data-custom\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.037041 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da8187b8-d72f-4471-9ebc-6846bf6a03f8-config-data-custom\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.037074 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnw5f\" (UniqueName: \"kubernetes.io/projected/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-kube-api-access-jnw5f\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.037113 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-logs\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.037167 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8187b8-d72f-4471-9ebc-6846bf6a03f8-combined-ca-bundle\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.037197 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-config-data\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.041740 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8187b8-d72f-4471-9ebc-6846bf6a03f8-logs\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.041991 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-logs\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.054664 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-combined-ca-bundle\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.064760 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnw5f\" (UniqueName: \"kubernetes.io/projected/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-kube-api-access-jnw5f\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.071514 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"]
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.072980 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.073407 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8187b8-d72f-4471-9ebc-6846bf6a03f8-config-data\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.074284 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-config-data-custom\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.074352 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da8187b8-d72f-4471-9ebc-6846bf6a03f8-config-data-custom\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.074489 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8187b8-d72f-4471-9ebc-6846bf6a03f8-combined-ca-bundle\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.079465 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pszc\" (UniqueName: \"kubernetes.io/projected/da8187b8-d72f-4471-9ebc-6846bf6a03f8-kube-api-access-7pszc\") pod \"barbican-keystone-listener-6798749c56-qfcf4\" (UID: \"da8187b8-d72f-4471-9ebc-6846bf6a03f8\") " pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.091853 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a4dbc71-6e95-4ec4-90f1-7867b5492ef3-config-data\") pod \"barbican-worker-84bd445555-qg8pl\" (UID: \"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3\") " pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.097347 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"]
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.106519 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84b69f6954-ldnp5"]
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.107823 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84b69f6954-ldnp5"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.110169 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.134811 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84b69f6954-ldnp5"]
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.225345 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-84bd445555-qg8pl"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.239277 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-sb\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.239335 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-nb\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.239363 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34475404-fb32-4053-978e-02a89a738365-config-data-custom\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.239403 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34475404-fb32-4053-978e-02a89a738365-logs\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.239447 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-dns-svc\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.239505 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34475404-fb32-4053-978e-02a89a738365-combined-ca-bundle\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.239632 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zps25\" (UniqueName: \"kubernetes.io/projected/90c2225d-302a-464e-b874-c5fc02a75c5f-kube-api-access-zps25\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.239687 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx9w8\" (UniqueName: \"kubernetes.io/projected/34475404-fb32-4053-978e-02a89a738365-kube-api-access-mx9w8\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.240201 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-config\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.240344 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34475404-fb32-4053-978e-02a89a738365-config-data\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.257607 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6798749c56-qfcf4"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.341627 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-sb\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.341680 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-nb\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"
Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.341711 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34475404-fb32-4053-978e-02a89a738365-config-data-custom\") pod \"barbican-api-84b69f6954-ldnp5\" (UID:
\"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.341752 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34475404-fb32-4053-978e-02a89a738365-logs\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.341787 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-dns-svc\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.341855 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34475404-fb32-4053-978e-02a89a738365-combined-ca-bundle\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.341906 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zps25\" (UniqueName: \"kubernetes.io/projected/90c2225d-302a-464e-b874-c5fc02a75c5f-kube-api-access-zps25\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.341928 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9w8\" (UniqueName: \"kubernetes.io/projected/34475404-fb32-4053-978e-02a89a738365-kube-api-access-mx9w8\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " 
pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.341955 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-config\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.341994 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34475404-fb32-4053-978e-02a89a738365-config-data\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.342546 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-sb\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.342744 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-nb\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.342857 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34475404-fb32-4053-978e-02a89a738365-logs\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.343438 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-config\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.343613 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-dns-svc\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.345517 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34475404-fb32-4053-978e-02a89a738365-config-data-custom\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.346009 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34475404-fb32-4053-978e-02a89a738365-config-data\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.349807 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34475404-fb32-4053-978e-02a89a738365-combined-ca-bundle\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.361148 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zps25\" (UniqueName: 
\"kubernetes.io/projected/90c2225d-302a-464e-b874-c5fc02a75c5f-kube-api-access-zps25\") pod \"dnsmasq-dns-6f75c6ccc5-c7dhg\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.365246 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx9w8\" (UniqueName: \"kubernetes.io/projected/34475404-fb32-4053-978e-02a89a738365-kube-api-access-mx9w8\") pod \"barbican-api-84b69f6954-ldnp5\" (UID: \"34475404-fb32-4053-978e-02a89a738365\") " pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.455495 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.464730 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.746005 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6798749c56-qfcf4"] Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.775879 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-84bd445555-qg8pl"] Feb 26 17:13:54 crc kubenswrapper[5033]: W0226 17:13:54.779243 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a4dbc71_6e95_4ec4_90f1_7867b5492ef3.slice/crio-906aa1c893429aca291356f991e4a430c5e64c7b685642901c27e2e06892b95a WatchSource:0}: Error finding container 906aa1c893429aca291356f991e4a430c5e64c7b685642901c27e2e06892b95a: Status 404 returned error can't find the container with id 906aa1c893429aca291356f991e4a430c5e64c7b685642901c27e2e06892b95a Feb 26 17:13:54 crc kubenswrapper[5033]: I0226 17:13:54.925092 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-api-84b69f6954-ldnp5"] Feb 26 17:13:54 crc kubenswrapper[5033]: W0226 17:13:54.930062 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34475404_fb32_4053_978e_02a89a738365.slice/crio-47e1d26354d070b9ffd64631e6905ba624e318e5bc5dee9b9369bca960fee5b4 WatchSource:0}: Error finding container 47e1d26354d070b9ffd64631e6905ba624e318e5bc5dee9b9369bca960fee5b4: Status 404 returned error can't find the container with id 47e1d26354d070b9ffd64631e6905ba624e318e5bc5dee9b9369bca960fee5b4 Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.000440 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"] Feb 26 17:13:55 crc kubenswrapper[5033]: W0226 17:13:55.021889 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90c2225d_302a_464e_b874_c5fc02a75c5f.slice/crio-e33d9815e97a7300753ba95ddc8a118140569d18b920023323234c3257d8ca39 WatchSource:0}: Error finding container e33d9815e97a7300753ba95ddc8a118140569d18b920023323234c3257d8ca39: Status 404 returned error can't find the container with id e33d9815e97a7300753ba95ddc8a118140569d18b920023323234c3257d8ca39 Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.672739 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b69f6954-ldnp5" event={"ID":"34475404-fb32-4053-978e-02a89a738365","Type":"ContainerStarted","Data":"d6cbbc8998addcee10ccfc8433a0b52d49ebe8dad29e1cf1a2086d81a29bc418"} Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.673099 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b69f6954-ldnp5" event={"ID":"34475404-fb32-4053-978e-02a89a738365","Type":"ContainerStarted","Data":"994e11a57b35ad8cbd572ccf7307833681a602defde32e60d9d1ec38f6d335a4"} Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.673114 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84b69f6954-ldnp5" event={"ID":"34475404-fb32-4053-978e-02a89a738365","Type":"ContainerStarted","Data":"47e1d26354d070b9ffd64631e6905ba624e318e5bc5dee9b9369bca960fee5b4"} Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.673348 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.673387 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84b69f6954-ldnp5" Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.699709 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84bd445555-qg8pl" event={"ID":"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3","Type":"ContainerStarted","Data":"47d8074ee7317742ac06dbbb0d6a99dabec82d0abcaf665cf1c57a1a3f7bd0b7"} Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.699762 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84bd445555-qg8pl" event={"ID":"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3","Type":"ContainerStarted","Data":"462907e93e9bb82bca72e7d4a3cdf14258ecd34865b6c6f430fc4225d69afd9e"} Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.699780 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84bd445555-qg8pl" event={"ID":"8a4dbc71-6e95-4ec4-90f1-7867b5492ef3","Type":"ContainerStarted","Data":"906aa1c893429aca291356f991e4a430c5e64c7b685642901c27e2e06892b95a"} Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.706611 5033 generic.go:334] "Generic (PLEG): container finished" podID="90c2225d-302a-464e-b874-c5fc02a75c5f" containerID="c6d1452462e5ed9277df5f98ff191701698282d3f6116c1388cf3eb193d8b7a6" exitCode=0 Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.706706 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" 
event={"ID":"90c2225d-302a-464e-b874-c5fc02a75c5f","Type":"ContainerDied","Data":"c6d1452462e5ed9277df5f98ff191701698282d3f6116c1388cf3eb193d8b7a6"} Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.706731 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" event={"ID":"90c2225d-302a-464e-b874-c5fc02a75c5f","Type":"ContainerStarted","Data":"e33d9815e97a7300753ba95ddc8a118140569d18b920023323234c3257d8ca39"} Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.707446 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84b69f6954-ldnp5" podStartSLOduration=1.707409002 podStartE2EDuration="1.707409002s" podCreationTimestamp="2026-02-26 17:13:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:13:55.694003545 +0000 UTC m=+5562.906602225" watchObservedRunningTime="2026-02-26 17:13:55.707409002 +0000 UTC m=+5562.920007642" Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.710856 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6798749c56-qfcf4" event={"ID":"da8187b8-d72f-4471-9ebc-6846bf6a03f8","Type":"ContainerStarted","Data":"f8698dc14a9455f37930e38bec5adba8c0986fd3f654bf4ffb74e8f2c0f87c86"} Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.710897 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6798749c56-qfcf4" event={"ID":"da8187b8-d72f-4471-9ebc-6846bf6a03f8","Type":"ContainerStarted","Data":"27e770ec4c7fc955c636167e12b935f81fbc58beb049e4485ebf08ff8bf6a7c6"} Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.710907 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6798749c56-qfcf4" 
event={"ID":"da8187b8-d72f-4471-9ebc-6846bf6a03f8","Type":"ContainerStarted","Data":"10dd32fcc3b141af621473ca400e64b39f8dc6eddf48632412fbc260f2183690"} Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.727708 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-84bd445555-qg8pl" podStartSLOduration=2.7276882159999998 podStartE2EDuration="2.727688216s" podCreationTimestamp="2026-02-26 17:13:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:13:55.719989789 +0000 UTC m=+5562.932588459" watchObservedRunningTime="2026-02-26 17:13:55.727688216 +0000 UTC m=+5562.940286866" Feb 26 17:13:55 crc kubenswrapper[5033]: I0226 17:13:55.746214 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6798749c56-qfcf4" podStartSLOduration=2.746197129 podStartE2EDuration="2.746197129s" podCreationTimestamp="2026-02-26 17:13:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:13:55.73518708 +0000 UTC m=+5562.947785730" watchObservedRunningTime="2026-02-26 17:13:55.746197129 +0000 UTC m=+5562.958795779" Feb 26 17:13:56 crc kubenswrapper[5033]: I0226 17:13:56.725124 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" event={"ID":"90c2225d-302a-464e-b874-c5fc02a75c5f","Type":"ContainerStarted","Data":"4d2b27cfb9ae183393f028ecf4bb352b8dcbd987e5ff68fca403308041e8fcf0"} Feb 26 17:13:56 crc kubenswrapper[5033]: I0226 17:13:56.726592 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:13:56 crc kubenswrapper[5033]: I0226 17:13:56.763658 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" 
podStartSLOduration=3.763641485 podStartE2EDuration="3.763641485s" podCreationTimestamp="2026-02-26 17:13:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:13:56.748037094 +0000 UTC m=+5563.960635744" watchObservedRunningTime="2026-02-26 17:13:56.763641485 +0000 UTC m=+5563.976240135" Feb 26 17:14:00 crc kubenswrapper[5033]: I0226 17:14:00.126671 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535434-fl7rv"] Feb 26 17:14:00 crc kubenswrapper[5033]: I0226 17:14:00.135422 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535434-fl7rv"] Feb 26 17:14:00 crc kubenswrapper[5033]: I0226 17:14:00.135532 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535434-fl7rv" Feb 26 17:14:00 crc kubenswrapper[5033]: I0226 17:14:00.138055 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:14:00 crc kubenswrapper[5033]: I0226 17:14:00.138322 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:14:00 crc kubenswrapper[5033]: I0226 17:14:00.139484 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:14:00 crc kubenswrapper[5033]: I0226 17:14:00.249474 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8t92\" (UniqueName: \"kubernetes.io/projected/bb28b135-15da-4647-91f8-f4b506248277-kube-api-access-r8t92\") pod \"auto-csr-approver-29535434-fl7rv\" (UID: \"bb28b135-15da-4647-91f8-f4b506248277\") " pod="openshift-infra/auto-csr-approver-29535434-fl7rv" Feb 26 17:14:00 crc kubenswrapper[5033]: I0226 17:14:00.351640 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r8t92\" (UniqueName: \"kubernetes.io/projected/bb28b135-15da-4647-91f8-f4b506248277-kube-api-access-r8t92\") pod \"auto-csr-approver-29535434-fl7rv\" (UID: \"bb28b135-15da-4647-91f8-f4b506248277\") " pod="openshift-infra/auto-csr-approver-29535434-fl7rv" Feb 26 17:14:00 crc kubenswrapper[5033]: I0226 17:14:00.388780 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8t92\" (UniqueName: \"kubernetes.io/projected/bb28b135-15da-4647-91f8-f4b506248277-kube-api-access-r8t92\") pod \"auto-csr-approver-29535434-fl7rv\" (UID: \"bb28b135-15da-4647-91f8-f4b506248277\") " pod="openshift-infra/auto-csr-approver-29535434-fl7rv" Feb 26 17:14:00 crc kubenswrapper[5033]: I0226 17:14:00.467019 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535434-fl7rv" Feb 26 17:14:00 crc kubenswrapper[5033]: I0226 17:14:00.927012 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535434-fl7rv"] Feb 26 17:14:01 crc kubenswrapper[5033]: I0226 17:14:01.772359 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535434-fl7rv" event={"ID":"bb28b135-15da-4647-91f8-f4b506248277","Type":"ContainerStarted","Data":"2a493f4734548a4fc4af308e3e827a6e00161b4064e38eb7a706564e5fdf963e"} Feb 26 17:14:02 crc kubenswrapper[5033]: I0226 17:14:02.782345 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535434-fl7rv" event={"ID":"bb28b135-15da-4647-91f8-f4b506248277","Type":"ContainerStarted","Data":"0a12391a322497c8a26f19e7ef6f88f74a3800349201679c3156620f0bd217b9"} Feb 26 17:14:02 crc kubenswrapper[5033]: I0226 17:14:02.806207 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535434-fl7rv" podStartSLOduration=1.621256086 
podStartE2EDuration="2.80618059s" podCreationTimestamp="2026-02-26 17:14:00 +0000 UTC" firstStartedPulling="2026-02-26 17:14:00.951634767 +0000 UTC m=+5568.164233407" lastFinishedPulling="2026-02-26 17:14:02.136559261 +0000 UTC m=+5569.349157911" observedRunningTime="2026-02-26 17:14:02.79964449 +0000 UTC m=+5570.012243150" watchObservedRunningTime="2026-02-26 17:14:02.80618059 +0000 UTC m=+5570.018779240" Feb 26 17:14:03 crc kubenswrapper[5033]: I0226 17:14:03.793047 5033 generic.go:334] "Generic (PLEG): container finished" podID="bb28b135-15da-4647-91f8-f4b506248277" containerID="0a12391a322497c8a26f19e7ef6f88f74a3800349201679c3156620f0bd217b9" exitCode=0 Feb 26 17:14:03 crc kubenswrapper[5033]: I0226 17:14:03.793091 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535434-fl7rv" event={"ID":"bb28b135-15da-4647-91f8-f4b506248277","Type":"ContainerDied","Data":"0a12391a322497c8a26f19e7ef6f88f74a3800349201679c3156620f0bd217b9"} Feb 26 17:14:04 crc kubenswrapper[5033]: I0226 17:14:04.457415 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:14:04 crc kubenswrapper[5033]: I0226 17:14:04.533826 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-647b68ccbf-5xftw"] Feb 26 17:14:04 crc kubenswrapper[5033]: I0226 17:14:04.534060 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" podUID="9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" containerName="dnsmasq-dns" containerID="cri-o://bac8c43e92d3b8e89466bc84597f06b7de238ba0bd5a44559a76496c9e18bba5" gracePeriod=10 Feb 26 17:14:04 crc kubenswrapper[5033]: I0226 17:14:04.805296 5033 generic.go:334] "Generic (PLEG): container finished" podID="9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" containerID="bac8c43e92d3b8e89466bc84597f06b7de238ba0bd5a44559a76496c9e18bba5" exitCode=0 Feb 26 17:14:04 crc kubenswrapper[5033]: I0226 
17:14:04.805461 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" event={"ID":"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8","Type":"ContainerDied","Data":"bac8c43e92d3b8e89466bc84597f06b7de238ba0bd5a44559a76496c9e18bba5"} Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.047317 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.153056 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-dns-svc\") pod \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.153140 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-config\") pod \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.153176 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-sb\") pod \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.153227 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjfz8\" (UniqueName: \"kubernetes.io/projected/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-kube-api-access-rjfz8\") pod \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.153276 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-nb\") pod \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\" (UID: \"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8\") " Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.158296 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-kube-api-access-rjfz8" (OuterVolumeSpecName: "kube-api-access-rjfz8") pod "9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" (UID: "9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8"). InnerVolumeSpecName "kube-api-access-rjfz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.180984 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535434-fl7rv" Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.194101 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-config" (OuterVolumeSpecName: "config") pod "9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" (UID: "9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.195924 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" (UID: "9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.196942 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" (UID: "9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.220230 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" (UID: "9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.254973 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8t92\" (UniqueName: \"kubernetes.io/projected/bb28b135-15da-4647-91f8-f4b506248277-kube-api-access-r8t92\") pod \"bb28b135-15da-4647-91f8-f4b506248277\" (UID: \"bb28b135-15da-4647-91f8-f4b506248277\") " Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.255400 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.255420 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.255452 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjfz8\" (UniqueName: 
\"kubernetes.io/projected/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-kube-api-access-rjfz8\") on node \"crc\" DevicePath \"\""
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.255463 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.255472 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.257672 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb28b135-15da-4647-91f8-f4b506248277-kube-api-access-r8t92" (OuterVolumeSpecName: "kube-api-access-r8t92") pod "bb28b135-15da-4647-91f8-f4b506248277" (UID: "bb28b135-15da-4647-91f8-f4b506248277"). InnerVolumeSpecName "kube-api-access-r8t92". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.357084 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8t92\" (UniqueName: \"kubernetes.io/projected/bb28b135-15da-4647-91f8-f4b506248277-kube-api-access-r8t92\") on node \"crc\" DevicePath \"\""
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.816205 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw" event={"ID":"9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8","Type":"ContainerDied","Data":"09822fe9428a5617d197cb1bd2b8090904a1a55b78acd11e6c4d2d2b8912562a"}
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.816343 5033 scope.go:117] "RemoveContainer" containerID="bac8c43e92d3b8e89466bc84597f06b7de238ba0bd5a44559a76496c9e18bba5"
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.816340 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647b68ccbf-5xftw"
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.824802 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535434-fl7rv" event={"ID":"bb28b135-15da-4647-91f8-f4b506248277","Type":"ContainerDied","Data":"2a493f4734548a4fc4af308e3e827a6e00161b4064e38eb7a706564e5fdf963e"}
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.824846 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a493f4734548a4fc4af308e3e827a6e00161b4064e38eb7a706564e5fdf963e"
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.824873 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535434-fl7rv"
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.859120 5033 scope.go:117] "RemoveContainer" containerID="a455958c138f14cc7be8ece1a6ab8f87964501bd5b2e1457108f2edebaf43968"
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.904926 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-647b68ccbf-5xftw"]
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.912976 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-647b68ccbf-5xftw"]
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.921730 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535428-xt8nk"]
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.929084 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84b69f6954-ldnp5"
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.930352 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535428-xt8nk"]
Feb 26 17:14:05 crc kubenswrapper[5033]: I0226 17:14:05.991638 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84b69f6954-ldnp5"
Feb 26 17:14:07 crc kubenswrapper[5033]: I0226 17:14:07.637082 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" path="/var/lib/kubelet/pods/9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8/volumes"
Feb 26 17:14:07 crc kubenswrapper[5033]: I0226 17:14:07.638521 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf9eeef4-19bc-42d0-b77a-9ad03a4d674f" path="/var/lib/kubelet/pods/cf9eeef4-19bc-42d0-b77a-9ad03a4d674f/volumes"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.055669 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2cdrs"]
Feb 26 17:14:18 crc kubenswrapper[5033]: E0226 17:14:18.057734 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" containerName="dnsmasq-dns"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.057837 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" containerName="dnsmasq-dns"
Feb 26 17:14:18 crc kubenswrapper[5033]: E0226 17:14:18.057932 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb28b135-15da-4647-91f8-f4b506248277" containerName="oc"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.058007 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb28b135-15da-4647-91f8-f4b506248277" containerName="oc"
Feb 26 17:14:18 crc kubenswrapper[5033]: E0226 17:14:18.058092 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" containerName="init"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.058164 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" containerName="init"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.058452 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3c1c68-f56c-473f-ae9c-030cf0c2d2a8" containerName="dnsmasq-dns"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.058561 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb28b135-15da-4647-91f8-f4b506248277" containerName="oc"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.059341 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2cdrs"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.065940 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2cdrs"]
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.153848 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-90ef-account-create-update-qqnl6"]
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.155019 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-90ef-account-create-update-qqnl6"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.156910 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.192903 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-90ef-account-create-update-qqnl6"]
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.197383 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56674e29-05ce-4200-94bf-26173801be40-operator-scripts\") pod \"neutron-db-create-2cdrs\" (UID: \"56674e29-05ce-4200-94bf-26173801be40\") " pod="openstack/neutron-db-create-2cdrs"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.197438 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmjj8\" (UniqueName: \"kubernetes.io/projected/56674e29-05ce-4200-94bf-26173801be40-kube-api-access-dmjj8\") pod \"neutron-db-create-2cdrs\" (UID: \"56674e29-05ce-4200-94bf-26173801be40\") " pod="openstack/neutron-db-create-2cdrs"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.298937 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56674e29-05ce-4200-94bf-26173801be40-operator-scripts\") pod \"neutron-db-create-2cdrs\" (UID: \"56674e29-05ce-4200-94bf-26173801be40\") " pod="openstack/neutron-db-create-2cdrs"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.298987 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmjj8\" (UniqueName: \"kubernetes.io/projected/56674e29-05ce-4200-94bf-26173801be40-kube-api-access-dmjj8\") pod \"neutron-db-create-2cdrs\" (UID: \"56674e29-05ce-4200-94bf-26173801be40\") " pod="openstack/neutron-db-create-2cdrs"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.299021 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlgxk\" (UniqueName: \"kubernetes.io/projected/88dbb18e-b5df-435b-899c-36247cb9b20c-kube-api-access-hlgxk\") pod \"neutron-90ef-account-create-update-qqnl6\" (UID: \"88dbb18e-b5df-435b-899c-36247cb9b20c\") " pod="openstack/neutron-90ef-account-create-update-qqnl6"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.299084 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88dbb18e-b5df-435b-899c-36247cb9b20c-operator-scripts\") pod \"neutron-90ef-account-create-update-qqnl6\" (UID: \"88dbb18e-b5df-435b-899c-36247cb9b20c\") " pod="openstack/neutron-90ef-account-create-update-qqnl6"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.300080 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56674e29-05ce-4200-94bf-26173801be40-operator-scripts\") pod \"neutron-db-create-2cdrs\" (UID: \"56674e29-05ce-4200-94bf-26173801be40\") " pod="openstack/neutron-db-create-2cdrs"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.327863 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmjj8\" (UniqueName: \"kubernetes.io/projected/56674e29-05ce-4200-94bf-26173801be40-kube-api-access-dmjj8\") pod \"neutron-db-create-2cdrs\" (UID: \"56674e29-05ce-4200-94bf-26173801be40\") " pod="openstack/neutron-db-create-2cdrs"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.392369 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2cdrs"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.400679 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88dbb18e-b5df-435b-899c-36247cb9b20c-operator-scripts\") pod \"neutron-90ef-account-create-update-qqnl6\" (UID: \"88dbb18e-b5df-435b-899c-36247cb9b20c\") " pod="openstack/neutron-90ef-account-create-update-qqnl6"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.400885 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlgxk\" (UniqueName: \"kubernetes.io/projected/88dbb18e-b5df-435b-899c-36247cb9b20c-kube-api-access-hlgxk\") pod \"neutron-90ef-account-create-update-qqnl6\" (UID: \"88dbb18e-b5df-435b-899c-36247cb9b20c\") " pod="openstack/neutron-90ef-account-create-update-qqnl6"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.401565 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88dbb18e-b5df-435b-899c-36247cb9b20c-operator-scripts\") pod \"neutron-90ef-account-create-update-qqnl6\" (UID: \"88dbb18e-b5df-435b-899c-36247cb9b20c\") " pod="openstack/neutron-90ef-account-create-update-qqnl6"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.422757 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlgxk\" (UniqueName: \"kubernetes.io/projected/88dbb18e-b5df-435b-899c-36247cb9b20c-kube-api-access-hlgxk\") pod \"neutron-90ef-account-create-update-qqnl6\" (UID: \"88dbb18e-b5df-435b-899c-36247cb9b20c\") " pod="openstack/neutron-90ef-account-create-update-qqnl6"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.469648 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-90ef-account-create-update-qqnl6"
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.940106 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2cdrs"]
Feb 26 17:14:18 crc kubenswrapper[5033]: I0226 17:14:18.967146 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2cdrs" event={"ID":"56674e29-05ce-4200-94bf-26173801be40","Type":"ContainerStarted","Data":"e6a53976d813eb4d38cc171b904cbdcdebc9105331659657892cd2fa4588ab09"}
Feb 26 17:14:19 crc kubenswrapper[5033]: I0226 17:14:19.016552 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-90ef-account-create-update-qqnl6"]
Feb 26 17:14:19 crc kubenswrapper[5033]: W0226 17:14:19.022391 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88dbb18e_b5df_435b_899c_36247cb9b20c.slice/crio-84a7f6e1337f56674ece688e4164d4ffcbe49edbd8a51c3895132070a9b04ad5 WatchSource:0}: Error finding container 84a7f6e1337f56674ece688e4164d4ffcbe49edbd8a51c3895132070a9b04ad5: Status 404 returned error can't find the container with id 84a7f6e1337f56674ece688e4164d4ffcbe49edbd8a51c3895132070a9b04ad5
Feb 26 17:14:19 crc kubenswrapper[5033]: I0226 17:14:19.976412 5033 generic.go:334] "Generic (PLEG): container finished" podID="88dbb18e-b5df-435b-899c-36247cb9b20c" containerID="3a43d91c6aa1b8b4a760a1a3bd2b908d27c8715640bf845a07165f8a2cdae545" exitCode=0
Feb 26 17:14:19 crc kubenswrapper[5033]: I0226 17:14:19.976488 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-90ef-account-create-update-qqnl6" event={"ID":"88dbb18e-b5df-435b-899c-36247cb9b20c","Type":"ContainerDied","Data":"3a43d91c6aa1b8b4a760a1a3bd2b908d27c8715640bf845a07165f8a2cdae545"}
Feb 26 17:14:19 crc kubenswrapper[5033]: I0226 17:14:19.976514 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-90ef-account-create-update-qqnl6" event={"ID":"88dbb18e-b5df-435b-899c-36247cb9b20c","Type":"ContainerStarted","Data":"84a7f6e1337f56674ece688e4164d4ffcbe49edbd8a51c3895132070a9b04ad5"}
Feb 26 17:14:19 crc kubenswrapper[5033]: I0226 17:14:19.977884 5033 generic.go:334] "Generic (PLEG): container finished" podID="56674e29-05ce-4200-94bf-26173801be40" containerID="a4ad016ba9768939669d0306245875217ac01494167e2c66d6fce09e0484935e" exitCode=0
Feb 26 17:14:19 crc kubenswrapper[5033]: I0226 17:14:19.977915 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2cdrs" event={"ID":"56674e29-05ce-4200-94bf-26173801be40","Type":"ContainerDied","Data":"a4ad016ba9768939669d0306245875217ac01494167e2c66d6fce09e0484935e"}
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.384918 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2cdrs"
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.392769 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-90ef-account-create-update-qqnl6"
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.453873 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmjj8\" (UniqueName: \"kubernetes.io/projected/56674e29-05ce-4200-94bf-26173801be40-kube-api-access-dmjj8\") pod \"56674e29-05ce-4200-94bf-26173801be40\" (UID: \"56674e29-05ce-4200-94bf-26173801be40\") "
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.454616 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56674e29-05ce-4200-94bf-26173801be40-operator-scripts\") pod \"56674e29-05ce-4200-94bf-26173801be40\" (UID: \"56674e29-05ce-4200-94bf-26173801be40\") "
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.454738 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlgxk\" (UniqueName: \"kubernetes.io/projected/88dbb18e-b5df-435b-899c-36247cb9b20c-kube-api-access-hlgxk\") pod \"88dbb18e-b5df-435b-899c-36247cb9b20c\" (UID: \"88dbb18e-b5df-435b-899c-36247cb9b20c\") "
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.454894 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88dbb18e-b5df-435b-899c-36247cb9b20c-operator-scripts\") pod \"88dbb18e-b5df-435b-899c-36247cb9b20c\" (UID: \"88dbb18e-b5df-435b-899c-36247cb9b20c\") "
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.455354 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88dbb18e-b5df-435b-899c-36247cb9b20c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88dbb18e-b5df-435b-899c-36247cb9b20c" (UID: "88dbb18e-b5df-435b-899c-36247cb9b20c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.455628 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88dbb18e-b5df-435b-899c-36247cb9b20c-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.455638 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56674e29-05ce-4200-94bf-26173801be40-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56674e29-05ce-4200-94bf-26173801be40" (UID: "56674e29-05ce-4200-94bf-26173801be40"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.459763 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88dbb18e-b5df-435b-899c-36247cb9b20c-kube-api-access-hlgxk" (OuterVolumeSpecName: "kube-api-access-hlgxk") pod "88dbb18e-b5df-435b-899c-36247cb9b20c" (UID: "88dbb18e-b5df-435b-899c-36247cb9b20c"). InnerVolumeSpecName "kube-api-access-hlgxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.461856 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56674e29-05ce-4200-94bf-26173801be40-kube-api-access-dmjj8" (OuterVolumeSpecName: "kube-api-access-dmjj8") pod "56674e29-05ce-4200-94bf-26173801be40" (UID: "56674e29-05ce-4200-94bf-26173801be40"). InnerVolumeSpecName "kube-api-access-dmjj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.558062 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmjj8\" (UniqueName: \"kubernetes.io/projected/56674e29-05ce-4200-94bf-26173801be40-kube-api-access-dmjj8\") on node \"crc\" DevicePath \"\""
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.558130 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56674e29-05ce-4200-94bf-26173801be40-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.558145 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlgxk\" (UniqueName: \"kubernetes.io/projected/88dbb18e-b5df-435b-899c-36247cb9b20c-kube-api-access-hlgxk\") on node \"crc\" DevicePath \"\""
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.994141 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2cdrs" event={"ID":"56674e29-05ce-4200-94bf-26173801be40","Type":"ContainerDied","Data":"e6a53976d813eb4d38cc171b904cbdcdebc9105331659657892cd2fa4588ab09"}
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.994183 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6a53976d813eb4d38cc171b904cbdcdebc9105331659657892cd2fa4588ab09"
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.994309 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2cdrs"
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.995814 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-90ef-account-create-update-qqnl6" event={"ID":"88dbb18e-b5df-435b-899c-36247cb9b20c","Type":"ContainerDied","Data":"84a7f6e1337f56674ece688e4164d4ffcbe49edbd8a51c3895132070a9b04ad5"}
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.995854 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-90ef-account-create-update-qqnl6"
Feb 26 17:14:21 crc kubenswrapper[5033]: I0226 17:14:21.995863 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84a7f6e1337f56674ece688e4164d4ffcbe49edbd8a51c3895132070a9b04ad5"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.405412 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-s7bnh"]
Feb 26 17:14:23 crc kubenswrapper[5033]: E0226 17:14:23.406118 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88dbb18e-b5df-435b-899c-36247cb9b20c" containerName="mariadb-account-create-update"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.406135 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dbb18e-b5df-435b-899c-36247cb9b20c" containerName="mariadb-account-create-update"
Feb 26 17:14:23 crc kubenswrapper[5033]: E0226 17:14:23.406168 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56674e29-05ce-4200-94bf-26173801be40" containerName="mariadb-database-create"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.406177 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="56674e29-05ce-4200-94bf-26173801be40" containerName="mariadb-database-create"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.406437 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="88dbb18e-b5df-435b-899c-36247cb9b20c" containerName="mariadb-account-create-update"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.406487 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="56674e29-05ce-4200-94bf-26173801be40" containerName="mariadb-database-create"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.407313 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.413173 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.413192 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.413882 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-m6592"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.424415 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s7bnh"]
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.491313 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-config\") pod \"neutron-db-sync-s7bnh\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") " pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.491451 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-combined-ca-bundle\") pod \"neutron-db-sync-s7bnh\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") " pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.491487 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps2c5\" (UniqueName: \"kubernetes.io/projected/2496e558-4fc2-4065-8065-f08ee04e4ee2-kube-api-access-ps2c5\") pod \"neutron-db-sync-s7bnh\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") " pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.593151 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-combined-ca-bundle\") pod \"neutron-db-sync-s7bnh\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") " pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.593239 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps2c5\" (UniqueName: \"kubernetes.io/projected/2496e558-4fc2-4065-8065-f08ee04e4ee2-kube-api-access-ps2c5\") pod \"neutron-db-sync-s7bnh\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") " pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.593395 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-config\") pod \"neutron-db-sync-s7bnh\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") " pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.598957 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-config\") pod \"neutron-db-sync-s7bnh\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") " pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.600452 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-combined-ca-bundle\") pod \"neutron-db-sync-s7bnh\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") " pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.631087 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps2c5\" (UniqueName: \"kubernetes.io/projected/2496e558-4fc2-4065-8065-f08ee04e4ee2-kube-api-access-ps2c5\") pod \"neutron-db-sync-s7bnh\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") " pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:23 crc kubenswrapper[5033]: I0226 17:14:23.726173 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:24 crc kubenswrapper[5033]: I0226 17:14:24.213335 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s7bnh"]
Feb 26 17:14:25 crc kubenswrapper[5033]: I0226 17:14:25.028364 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s7bnh" event={"ID":"2496e558-4fc2-4065-8065-f08ee04e4ee2","Type":"ContainerStarted","Data":"16d2b353304092e7f233aaa326da63f3100bfd4b1eb8e11fa1712531b4ba4c0b"}
Feb 26 17:14:25 crc kubenswrapper[5033]: I0226 17:14:25.028747 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s7bnh" event={"ID":"2496e558-4fc2-4065-8065-f08ee04e4ee2","Type":"ContainerStarted","Data":"1676650f758904a482a5225951c4fb480b30f9afa2688cc11b60f54aa59269a8"}
Feb 26 17:14:25 crc kubenswrapper[5033]: I0226 17:14:25.068052 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-s7bnh" podStartSLOduration=2.068010674 podStartE2EDuration="2.068010674s" podCreationTimestamp="2026-02-26 17:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:14:25.062202052 +0000 UTC m=+5592.274800702" watchObservedRunningTime="2026-02-26 17:14:25.068010674 +0000 UTC m=+5592.280609324"
Feb 26 17:14:29 crc kubenswrapper[5033]: I0226 17:14:29.074331 5033 generic.go:334] "Generic (PLEG): container finished" podID="2496e558-4fc2-4065-8065-f08ee04e4ee2" containerID="16d2b353304092e7f233aaa326da63f3100bfd4b1eb8e11fa1712531b4ba4c0b" exitCode=0
Feb 26 17:14:29 crc kubenswrapper[5033]: I0226 17:14:29.074375 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s7bnh" event={"ID":"2496e558-4fc2-4065-8065-f08ee04e4ee2","Type":"ContainerDied","Data":"16d2b353304092e7f233aaa326da63f3100bfd4b1eb8e11fa1712531b4ba4c0b"}
Feb 26 17:14:30 crc kubenswrapper[5033]: I0226 17:14:30.451407 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:30 crc kubenswrapper[5033]: I0226 17:14:30.613983 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-combined-ca-bundle\") pod \"2496e558-4fc2-4065-8065-f08ee04e4ee2\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") "
Feb 26 17:14:30 crc kubenswrapper[5033]: I0226 17:14:30.614064 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps2c5\" (UniqueName: \"kubernetes.io/projected/2496e558-4fc2-4065-8065-f08ee04e4ee2-kube-api-access-ps2c5\") pod \"2496e558-4fc2-4065-8065-f08ee04e4ee2\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") "
Feb 26 17:14:30 crc kubenswrapper[5033]: I0226 17:14:30.614118 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-config\") pod \"2496e558-4fc2-4065-8065-f08ee04e4ee2\" (UID: \"2496e558-4fc2-4065-8065-f08ee04e4ee2\") "
Feb 26 17:14:30 crc kubenswrapper[5033]: I0226 17:14:30.626796 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2496e558-4fc2-4065-8065-f08ee04e4ee2-kube-api-access-ps2c5" (OuterVolumeSpecName: "kube-api-access-ps2c5") pod "2496e558-4fc2-4065-8065-f08ee04e4ee2" (UID: "2496e558-4fc2-4065-8065-f08ee04e4ee2"). InnerVolumeSpecName "kube-api-access-ps2c5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:14:30 crc kubenswrapper[5033]: I0226 17:14:30.654433 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-config" (OuterVolumeSpecName: "config") pod "2496e558-4fc2-4065-8065-f08ee04e4ee2" (UID: "2496e558-4fc2-4065-8065-f08ee04e4ee2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:14:30 crc kubenswrapper[5033]: I0226 17:14:30.658173 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2496e558-4fc2-4065-8065-f08ee04e4ee2" (UID: "2496e558-4fc2-4065-8065-f08ee04e4ee2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:14:30 crc kubenswrapper[5033]: I0226 17:14:30.716637 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:14:30 crc kubenswrapper[5033]: I0226 17:14:30.716676 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps2c5\" (UniqueName: \"kubernetes.io/projected/2496e558-4fc2-4065-8065-f08ee04e4ee2-kube-api-access-ps2c5\") on node \"crc\" DevicePath \"\""
Feb 26 17:14:30 crc kubenswrapper[5033]: I0226 17:14:30.716691 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2496e558-4fc2-4065-8065-f08ee04e4ee2-config\") on node \"crc\" DevicePath \"\""
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.095807 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s7bnh" event={"ID":"2496e558-4fc2-4065-8065-f08ee04e4ee2","Type":"ContainerDied","Data":"1676650f758904a482a5225951c4fb480b30f9afa2688cc11b60f54aa59269a8"}
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.096518 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1676650f758904a482a5225951c4fb480b30f9afa2688cc11b60f54aa59269a8"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.095881 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s7bnh"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.330400 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69599894df-tzflt"]
Feb 26 17:14:31 crc kubenswrapper[5033]: E0226 17:14:31.330694 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2496e558-4fc2-4065-8065-f08ee04e4ee2" containerName="neutron-db-sync"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.330709 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2496e558-4fc2-4065-8065-f08ee04e4ee2" containerName="neutron-db-sync"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.330874 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2496e558-4fc2-4065-8065-f08ee04e4ee2" containerName="neutron-db-sync"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.332921 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69599894df-tzflt"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.362852 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69599894df-tzflt"]
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.404897 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-69cc55c897-gnbn9"]
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.406232 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69cc55c897-gnbn9"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.412335 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.412482 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-m6592"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.412645 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.420351 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69cc55c897-gnbn9"]
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.433451 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-nb\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.433551 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-sb\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.433587 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-dns-svc\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt"
Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.433655
5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-config\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.433772 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4hwd\" (UniqueName: \"kubernetes.io/projected/17dbb140-32e8-4741-a910-462f0b725d67-kube-api-access-g4hwd\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.534780 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d965aba7-c944-4339-a3af-b65a14d65f85-combined-ca-bundle\") pod \"neutron-69cc55c897-gnbn9\" (UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.534858 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-nb\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.534928 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p7d6\" (UniqueName: \"kubernetes.io/projected/d965aba7-c944-4339-a3af-b65a14d65f85-kube-api-access-2p7d6\") pod \"neutron-69cc55c897-gnbn9\" (UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: 
I0226 17:14:31.535073 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-sb\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.535119 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-dns-svc\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.535197 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-config\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.535246 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d965aba7-c944-4339-a3af-b65a14d65f85-config\") pod \"neutron-69cc55c897-gnbn9\" (UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.535404 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d965aba7-c944-4339-a3af-b65a14d65f85-httpd-config\") pod \"neutron-69cc55c897-gnbn9\" (UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.535466 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-g4hwd\" (UniqueName: \"kubernetes.io/projected/17dbb140-32e8-4741-a910-462f0b725d67-kube-api-access-g4hwd\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.535729 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-nb\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.535851 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-dns-svc\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.535940 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-sb\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.536303 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-config\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.554115 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4hwd\" (UniqueName: 
\"kubernetes.io/projected/17dbb140-32e8-4741-a910-462f0b725d67-kube-api-access-g4hwd\") pod \"dnsmasq-dns-69599894df-tzflt\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.636855 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d965aba7-c944-4339-a3af-b65a14d65f85-httpd-config\") pod \"neutron-69cc55c897-gnbn9\" (UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.636920 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d965aba7-c944-4339-a3af-b65a14d65f85-combined-ca-bundle\") pod \"neutron-69cc55c897-gnbn9\" (UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.636948 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p7d6\" (UniqueName: \"kubernetes.io/projected/d965aba7-c944-4339-a3af-b65a14d65f85-kube-api-access-2p7d6\") pod \"neutron-69cc55c897-gnbn9\" (UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.637028 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d965aba7-c944-4339-a3af-b65a14d65f85-config\") pod \"neutron-69cc55c897-gnbn9\" (UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.640994 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d965aba7-c944-4339-a3af-b65a14d65f85-config\") pod \"neutron-69cc55c897-gnbn9\" 
(UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.641720 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d965aba7-c944-4339-a3af-b65a14d65f85-combined-ca-bundle\") pod \"neutron-69cc55c897-gnbn9\" (UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.642275 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d965aba7-c944-4339-a3af-b65a14d65f85-httpd-config\") pod \"neutron-69cc55c897-gnbn9\" (UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.654952 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p7d6\" (UniqueName: \"kubernetes.io/projected/d965aba7-c944-4339-a3af-b65a14d65f85-kube-api-access-2p7d6\") pod \"neutron-69cc55c897-gnbn9\" (UID: \"d965aba7-c944-4339-a3af-b65a14d65f85\") " pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.659208 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:31 crc kubenswrapper[5033]: I0226 17:14:31.728711 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:32 crc kubenswrapper[5033]: I0226 17:14:32.107817 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69cc55c897-gnbn9"] Feb 26 17:14:32 crc kubenswrapper[5033]: I0226 17:14:32.115955 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69599894df-tzflt"] Feb 26 17:14:32 crc kubenswrapper[5033]: W0226 17:14:32.117879 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17dbb140_32e8_4741_a910_462f0b725d67.slice/crio-6f64a52b2fee01271990a89c173536cb7ec53caafe6f326bfc3ece0c9768df98 WatchSource:0}: Error finding container 6f64a52b2fee01271990a89c173536cb7ec53caafe6f326bfc3ece0c9768df98: Status 404 returned error can't find the container with id 6f64a52b2fee01271990a89c173536cb7ec53caafe6f326bfc3ece0c9768df98 Feb 26 17:14:33 crc kubenswrapper[5033]: I0226 17:14:33.112754 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69cc55c897-gnbn9" event={"ID":"d965aba7-c944-4339-a3af-b65a14d65f85","Type":"ContainerStarted","Data":"11756f5c6d060292fb371702581a37049bef4fbf84379e1dfe8d3b603e18a213"} Feb 26 17:14:33 crc kubenswrapper[5033]: I0226 17:14:33.113185 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69cc55c897-gnbn9" event={"ID":"d965aba7-c944-4339-a3af-b65a14d65f85","Type":"ContainerStarted","Data":"7a25ecc03922ee526257c3041f0764ab9fbe4377384ac4398fe7a65a597ed601"} Feb 26 17:14:33 crc kubenswrapper[5033]: I0226 17:14:33.113195 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69cc55c897-gnbn9" event={"ID":"d965aba7-c944-4339-a3af-b65a14d65f85","Type":"ContainerStarted","Data":"bf68238463e2297c14e1501b6cbc130f657b7cd6ea330d11f33a24b54843b13c"} Feb 26 17:14:33 crc kubenswrapper[5033]: I0226 17:14:33.113548 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:14:33 crc kubenswrapper[5033]: I0226 17:14:33.117979 5033 generic.go:334] "Generic (PLEG): container finished" podID="17dbb140-32e8-4741-a910-462f0b725d67" containerID="edc4c1d88408b545e866f039fc6d8b7782062ad0015b713375f3348c9a476acf" exitCode=0 Feb 26 17:14:33 crc kubenswrapper[5033]: I0226 17:14:33.118048 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69599894df-tzflt" event={"ID":"17dbb140-32e8-4741-a910-462f0b725d67","Type":"ContainerDied","Data":"edc4c1d88408b545e866f039fc6d8b7782062ad0015b713375f3348c9a476acf"} Feb 26 17:14:33 crc kubenswrapper[5033]: I0226 17:14:33.118117 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69599894df-tzflt" event={"ID":"17dbb140-32e8-4741-a910-462f0b725d67","Type":"ContainerStarted","Data":"6f64a52b2fee01271990a89c173536cb7ec53caafe6f326bfc3ece0c9768df98"} Feb 26 17:14:33 crc kubenswrapper[5033]: I0226 17:14:33.152079 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-69cc55c897-gnbn9" podStartSLOduration=2.152057743 podStartE2EDuration="2.152057743s" podCreationTimestamp="2026-02-26 17:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:14:33.1351339 +0000 UTC m=+5600.347732540" watchObservedRunningTime="2026-02-26 17:14:33.152057743 +0000 UTC m=+5600.364656393" Feb 26 17:14:33 crc kubenswrapper[5033]: I0226 17:14:33.770317 5033 scope.go:117] "RemoveContainer" containerID="070e573458cf9b62bdf811b913b376cdafa2be63e19933dd1a95143737d75222" Feb 26 17:14:34 crc kubenswrapper[5033]: I0226 17:14:34.128134 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69599894df-tzflt" event={"ID":"17dbb140-32e8-4741-a910-462f0b725d67","Type":"ContainerStarted","Data":"f59d32c4cba61d1471666e03d7a62dcde01129f8519a25ceac421d86f7e99f57"} Feb 26 17:14:34 
crc kubenswrapper[5033]: I0226 17:14:34.128178 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:41 crc kubenswrapper[5033]: I0226 17:14:41.662165 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:14:41 crc kubenswrapper[5033]: I0226 17:14:41.683760 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69599894df-tzflt" podStartSLOduration=10.683742646 podStartE2EDuration="10.683742646s" podCreationTimestamp="2026-02-26 17:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:14:34.15147787 +0000 UTC m=+5601.364076520" watchObservedRunningTime="2026-02-26 17:14:41.683742646 +0000 UTC m=+5608.896341296" Feb 26 17:14:41 crc kubenswrapper[5033]: I0226 17:14:41.722223 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"] Feb 26 17:14:41 crc kubenswrapper[5033]: I0226 17:14:41.722555 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" podUID="90c2225d-302a-464e-b874-c5fc02a75c5f" containerName="dnsmasq-dns" containerID="cri-o://4d2b27cfb9ae183393f028ecf4bb352b8dcbd987e5ff68fca403308041e8fcf0" gracePeriod=10 Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.195606 5033 generic.go:334] "Generic (PLEG): container finished" podID="90c2225d-302a-464e-b874-c5fc02a75c5f" containerID="4d2b27cfb9ae183393f028ecf4bb352b8dcbd987e5ff68fca403308041e8fcf0" exitCode=0 Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.195944 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" 
event={"ID":"90c2225d-302a-464e-b874-c5fc02a75c5f","Type":"ContainerDied","Data":"4d2b27cfb9ae183393f028ecf4bb352b8dcbd987e5ff68fca403308041e8fcf0"} Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.308412 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.442149 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zps25\" (UniqueName: \"kubernetes.io/projected/90c2225d-302a-464e-b874-c5fc02a75c5f-kube-api-access-zps25\") pod \"90c2225d-302a-464e-b874-c5fc02a75c5f\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.442299 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-nb\") pod \"90c2225d-302a-464e-b874-c5fc02a75c5f\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.442372 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-config\") pod \"90c2225d-302a-464e-b874-c5fc02a75c5f\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.442411 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-sb\") pod \"90c2225d-302a-464e-b874-c5fc02a75c5f\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.442497 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-dns-svc\") 
pod \"90c2225d-302a-464e-b874-c5fc02a75c5f\" (UID: \"90c2225d-302a-464e-b874-c5fc02a75c5f\") " Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.462601 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90c2225d-302a-464e-b874-c5fc02a75c5f-kube-api-access-zps25" (OuterVolumeSpecName: "kube-api-access-zps25") pod "90c2225d-302a-464e-b874-c5fc02a75c5f" (UID: "90c2225d-302a-464e-b874-c5fc02a75c5f"). InnerVolumeSpecName "kube-api-access-zps25". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.501432 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90c2225d-302a-464e-b874-c5fc02a75c5f" (UID: "90c2225d-302a-464e-b874-c5fc02a75c5f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.514386 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90c2225d-302a-464e-b874-c5fc02a75c5f" (UID: "90c2225d-302a-464e-b874-c5fc02a75c5f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.518879 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-config" (OuterVolumeSpecName: "config") pod "90c2225d-302a-464e-b874-c5fc02a75c5f" (UID: "90c2225d-302a-464e-b874-c5fc02a75c5f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.538078 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90c2225d-302a-464e-b874-c5fc02a75c5f" (UID: "90c2225d-302a-464e-b874-c5fc02a75c5f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.547445 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.547487 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zps25\" (UniqueName: \"kubernetes.io/projected/90c2225d-302a-464e-b874-c5fc02a75c5f-kube-api-access-zps25\") on node \"crc\" DevicePath \"\"" Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.547502 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.547511 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:14:42 crc kubenswrapper[5033]: I0226 17:14:42.547520 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90c2225d-302a-464e-b874-c5fc02a75c5f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 17:14:43 crc kubenswrapper[5033]: I0226 17:14:43.207164 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" 
event={"ID":"90c2225d-302a-464e-b874-c5fc02a75c5f","Type":"ContainerDied","Data":"e33d9815e97a7300753ba95ddc8a118140569d18b920023323234c3257d8ca39"} Feb 26 17:14:43 crc kubenswrapper[5033]: I0226 17:14:43.207216 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f75c6ccc5-c7dhg" Feb 26 17:14:43 crc kubenswrapper[5033]: I0226 17:14:43.207236 5033 scope.go:117] "RemoveContainer" containerID="4d2b27cfb9ae183393f028ecf4bb352b8dcbd987e5ff68fca403308041e8fcf0" Feb 26 17:14:43 crc kubenswrapper[5033]: I0226 17:14:43.236829 5033 scope.go:117] "RemoveContainer" containerID="c6d1452462e5ed9277df5f98ff191701698282d3f6116c1388cf3eb193d8b7a6" Feb 26 17:14:43 crc kubenswrapper[5033]: I0226 17:14:43.252821 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"] Feb 26 17:14:43 crc kubenswrapper[5033]: I0226 17:14:43.263394 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f75c6ccc5-c7dhg"] Feb 26 17:14:43 crc kubenswrapper[5033]: I0226 17:14:43.638940 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90c2225d-302a-464e-b874-c5fc02a75c5f" path="/var/lib/kubelet/pods/90c2225d-302a-464e-b874-c5fc02a75c5f/volumes" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.181840 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66"] Feb 26 17:15:00 crc kubenswrapper[5033]: E0226 17:15:00.182805 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90c2225d-302a-464e-b874-c5fc02a75c5f" containerName="init" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.182820 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c2225d-302a-464e-b874-c5fc02a75c5f" containerName="init" Feb 26 17:15:00 crc kubenswrapper[5033]: E0226 17:15:00.182847 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="90c2225d-302a-464e-b874-c5fc02a75c5f" containerName="dnsmasq-dns" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.182855 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c2225d-302a-464e-b874-c5fc02a75c5f" containerName="dnsmasq-dns" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.183057 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="90c2225d-302a-464e-b874-c5fc02a75c5f" containerName="dnsmasq-dns" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.190777 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.194353 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.194932 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.208343 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e24a7aaa-2809-4c7e-8892-18229730d971-secret-volume\") pod \"collect-profiles-29535435-99h66\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.208726 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbt6n\" (UniqueName: \"kubernetes.io/projected/e24a7aaa-2809-4c7e-8892-18229730d971-kube-api-access-tbt6n\") pod \"collect-profiles-29535435-99h66\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:00 crc 
kubenswrapper[5033]: I0226 17:15:00.208972 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e24a7aaa-2809-4c7e-8892-18229730d971-config-volume\") pod \"collect-profiles-29535435-99h66\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.208394 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66"] Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.311996 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e24a7aaa-2809-4c7e-8892-18229730d971-secret-volume\") pod \"collect-profiles-29535435-99h66\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.312076 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbt6n\" (UniqueName: \"kubernetes.io/projected/e24a7aaa-2809-4c7e-8892-18229730d971-kube-api-access-tbt6n\") pod \"collect-profiles-29535435-99h66\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.312148 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e24a7aaa-2809-4c7e-8892-18229730d971-config-volume\") pod \"collect-profiles-29535435-99h66\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.313443 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e24a7aaa-2809-4c7e-8892-18229730d971-config-volume\") pod \"collect-profiles-29535435-99h66\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.328041 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e24a7aaa-2809-4c7e-8892-18229730d971-secret-volume\") pod \"collect-profiles-29535435-99h66\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.341989 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbt6n\" (UniqueName: \"kubernetes.io/projected/e24a7aaa-2809-4c7e-8892-18229730d971-kube-api-access-tbt6n\") pod \"collect-profiles-29535435-99h66\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.514664 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:00 crc kubenswrapper[5033]: I0226 17:15:00.942840 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66"] Feb 26 17:15:01 crc kubenswrapper[5033]: I0226 17:15:01.359468 5033 generic.go:334] "Generic (PLEG): container finished" podID="e24a7aaa-2809-4c7e-8892-18229730d971" containerID="77b4103b1fd596a6df56a0b1438bb60c5cbe857a0b840497c04fec4b2e8706eb" exitCode=0 Feb 26 17:15:01 crc kubenswrapper[5033]: I0226 17:15:01.359549 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" event={"ID":"e24a7aaa-2809-4c7e-8892-18229730d971","Type":"ContainerDied","Data":"77b4103b1fd596a6df56a0b1438bb60c5cbe857a0b840497c04fec4b2e8706eb"} Feb 26 17:15:01 crc kubenswrapper[5033]: I0226 17:15:01.359792 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" event={"ID":"e24a7aaa-2809-4c7e-8892-18229730d971","Type":"ContainerStarted","Data":"136af1e155ce3380214adf0faae7ac9bf047a0794c6f4d0debc06ecaa279848b"} Feb 26 17:15:01 crc kubenswrapper[5033]: I0226 17:15:01.740326 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-69cc55c897-gnbn9" Feb 26 17:15:02 crc kubenswrapper[5033]: I0226 17:15:02.670665 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:02 crc kubenswrapper[5033]: I0226 17:15:02.850249 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e24a7aaa-2809-4c7e-8892-18229730d971-config-volume\") pod \"e24a7aaa-2809-4c7e-8892-18229730d971\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " Feb 26 17:15:02 crc kubenswrapper[5033]: I0226 17:15:02.850369 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbt6n\" (UniqueName: \"kubernetes.io/projected/e24a7aaa-2809-4c7e-8892-18229730d971-kube-api-access-tbt6n\") pod \"e24a7aaa-2809-4c7e-8892-18229730d971\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " Feb 26 17:15:02 crc kubenswrapper[5033]: I0226 17:15:02.850428 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e24a7aaa-2809-4c7e-8892-18229730d971-secret-volume\") pod \"e24a7aaa-2809-4c7e-8892-18229730d971\" (UID: \"e24a7aaa-2809-4c7e-8892-18229730d971\") " Feb 26 17:15:02 crc kubenswrapper[5033]: I0226 17:15:02.850863 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e24a7aaa-2809-4c7e-8892-18229730d971-config-volume" (OuterVolumeSpecName: "config-volume") pod "e24a7aaa-2809-4c7e-8892-18229730d971" (UID: "e24a7aaa-2809-4c7e-8892-18229730d971"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:15:02 crc kubenswrapper[5033]: I0226 17:15:02.855600 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24a7aaa-2809-4c7e-8892-18229730d971-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e24a7aaa-2809-4c7e-8892-18229730d971" (UID: "e24a7aaa-2809-4c7e-8892-18229730d971"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:15:02 crc kubenswrapper[5033]: I0226 17:15:02.856957 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24a7aaa-2809-4c7e-8892-18229730d971-kube-api-access-tbt6n" (OuterVolumeSpecName: "kube-api-access-tbt6n") pod "e24a7aaa-2809-4c7e-8892-18229730d971" (UID: "e24a7aaa-2809-4c7e-8892-18229730d971"). InnerVolumeSpecName "kube-api-access-tbt6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:02 crc kubenswrapper[5033]: I0226 17:15:02.952099 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbt6n\" (UniqueName: \"kubernetes.io/projected/e24a7aaa-2809-4c7e-8892-18229730d971-kube-api-access-tbt6n\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:02 crc kubenswrapper[5033]: I0226 17:15:02.952132 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e24a7aaa-2809-4c7e-8892-18229730d971-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:02 crc kubenswrapper[5033]: I0226 17:15:02.952142 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e24a7aaa-2809-4c7e-8892-18229730d971-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:03 crc kubenswrapper[5033]: I0226 17:15:03.381343 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" event={"ID":"e24a7aaa-2809-4c7e-8892-18229730d971","Type":"ContainerDied","Data":"136af1e155ce3380214adf0faae7ac9bf047a0794c6f4d0debc06ecaa279848b"} Feb 26 17:15:03 crc kubenswrapper[5033]: I0226 17:15:03.381863 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="136af1e155ce3380214adf0faae7ac9bf047a0794c6f4d0debc06ecaa279848b" Feb 26 17:15:03 crc kubenswrapper[5033]: I0226 17:15:03.381387 5033 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66" Feb 26 17:15:03 crc kubenswrapper[5033]: I0226 17:15:03.767770 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"] Feb 26 17:15:03 crc kubenswrapper[5033]: I0226 17:15:03.778753 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535390-fg8wf"] Feb 26 17:15:05 crc kubenswrapper[5033]: I0226 17:15:05.637068 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bda0c14-3fc0-4f66-a3b1-f9359e21a587" path="/var/lib/kubelet/pods/3bda0c14-3fc0-4f66-a3b1-f9359e21a587/volumes" Feb 26 17:15:07 crc kubenswrapper[5033]: E0226 17:15:07.809164 5033 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.89:48606->38.102.83.89:39479: write tcp 38.102.83.89:48606->38.102.83.89:39479: write: broken pipe Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.262330 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2kp7z"] Feb 26 17:15:09 crc kubenswrapper[5033]: E0226 17:15:09.262840 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24a7aaa-2809-4c7e-8892-18229730d971" containerName="collect-profiles" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.262859 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24a7aaa-2809-4c7e-8892-18229730d971" containerName="collect-profiles" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.263075 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24a7aaa-2809-4c7e-8892-18229730d971" containerName="collect-profiles" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.263855 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2kp7z" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.288210 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2kp7z"] Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.347999 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6b14-account-create-update-h6ct9"] Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.349317 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6b14-account-create-update-h6ct9" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.350986 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.358792 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6b14-account-create-update-h6ct9"] Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.369418 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44db2df6-733f-49c7-84d0-6183262c9cc6-operator-scripts\") pod \"glance-db-create-2kp7z\" (UID: \"44db2df6-733f-49c7-84d0-6183262c9cc6\") " pod="openstack/glance-db-create-2kp7z" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.369494 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pj9m\" (UniqueName: \"kubernetes.io/projected/44db2df6-733f-49c7-84d0-6183262c9cc6-kube-api-access-7pj9m\") pod \"glance-db-create-2kp7z\" (UID: \"44db2df6-733f-49c7-84d0-6183262c9cc6\") " pod="openstack/glance-db-create-2kp7z" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.471171 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44db2df6-733f-49c7-84d0-6183262c9cc6-operator-scripts\") pod 
\"glance-db-create-2kp7z\" (UID: \"44db2df6-733f-49c7-84d0-6183262c9cc6\") " pod="openstack/glance-db-create-2kp7z" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.471276 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pj9m\" (UniqueName: \"kubernetes.io/projected/44db2df6-733f-49c7-84d0-6183262c9cc6-kube-api-access-7pj9m\") pod \"glance-db-create-2kp7z\" (UID: \"44db2df6-733f-49c7-84d0-6183262c9cc6\") " pod="openstack/glance-db-create-2kp7z" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.471346 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc7wm\" (UniqueName: \"kubernetes.io/projected/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-kube-api-access-tc7wm\") pod \"glance-6b14-account-create-update-h6ct9\" (UID: \"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2\") " pod="openstack/glance-6b14-account-create-update-h6ct9" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.471389 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-operator-scripts\") pod \"glance-6b14-account-create-update-h6ct9\" (UID: \"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2\") " pod="openstack/glance-6b14-account-create-update-h6ct9" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.472541 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44db2df6-733f-49c7-84d0-6183262c9cc6-operator-scripts\") pod \"glance-db-create-2kp7z\" (UID: \"44db2df6-733f-49c7-84d0-6183262c9cc6\") " pod="openstack/glance-db-create-2kp7z" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.493708 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pj9m\" (UniqueName: 
\"kubernetes.io/projected/44db2df6-733f-49c7-84d0-6183262c9cc6-kube-api-access-7pj9m\") pod \"glance-db-create-2kp7z\" (UID: \"44db2df6-733f-49c7-84d0-6183262c9cc6\") " pod="openstack/glance-db-create-2kp7z" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.573516 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc7wm\" (UniqueName: \"kubernetes.io/projected/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-kube-api-access-tc7wm\") pod \"glance-6b14-account-create-update-h6ct9\" (UID: \"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2\") " pod="openstack/glance-6b14-account-create-update-h6ct9" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.573590 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-operator-scripts\") pod \"glance-6b14-account-create-update-h6ct9\" (UID: \"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2\") " pod="openstack/glance-6b14-account-create-update-h6ct9" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.574559 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-operator-scripts\") pod \"glance-6b14-account-create-update-h6ct9\" (UID: \"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2\") " pod="openstack/glance-6b14-account-create-update-h6ct9" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.590690 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2kp7z" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.611800 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc7wm\" (UniqueName: \"kubernetes.io/projected/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-kube-api-access-tc7wm\") pod \"glance-6b14-account-create-update-h6ct9\" (UID: \"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2\") " pod="openstack/glance-6b14-account-create-update-h6ct9" Feb 26 17:15:09 crc kubenswrapper[5033]: I0226 17:15:09.675612 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6b14-account-create-update-h6ct9" Feb 26 17:15:10 crc kubenswrapper[5033]: I0226 17:15:10.143356 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2kp7z"] Feb 26 17:15:10 crc kubenswrapper[5033]: I0226 17:15:10.226876 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6b14-account-create-update-h6ct9"] Feb 26 17:15:10 crc kubenswrapper[5033]: W0226 17:15:10.244910 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd8d70ca_5b67_45fa_a174_e47c57b6cfb2.slice/crio-8855cb3aee985edcef00d52bd5f948e7841359ba50cf5d9ab4807b13234105f6 WatchSource:0}: Error finding container 8855cb3aee985edcef00d52bd5f948e7841359ba50cf5d9ab4807b13234105f6: Status 404 returned error can't find the container with id 8855cb3aee985edcef00d52bd5f948e7841359ba50cf5d9ab4807b13234105f6 Feb 26 17:15:10 crc kubenswrapper[5033]: I0226 17:15:10.442819 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6b14-account-create-update-h6ct9" event={"ID":"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2","Type":"ContainerStarted","Data":"39feeb0fd932078974be5d4f740c0c0c77216f5cb656b7291e4874dd5f696365"} Feb 26 17:15:10 crc kubenswrapper[5033]: I0226 17:15:10.442868 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-6b14-account-create-update-h6ct9" event={"ID":"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2","Type":"ContainerStarted","Data":"8855cb3aee985edcef00d52bd5f948e7841359ba50cf5d9ab4807b13234105f6"} Feb 26 17:15:10 crc kubenswrapper[5033]: I0226 17:15:10.445582 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2kp7z" event={"ID":"44db2df6-733f-49c7-84d0-6183262c9cc6","Type":"ContainerStarted","Data":"3f52b35551bda5b447877dc604a33e259f63fe02950a689c11f30c680f7a15cd"} Feb 26 17:15:10 crc kubenswrapper[5033]: I0226 17:15:10.445632 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2kp7z" event={"ID":"44db2df6-733f-49c7-84d0-6183262c9cc6","Type":"ContainerStarted","Data":"c8106ef103a591eba6205cb36037a5ed32d78c3a03b95494b34a40627e1c5407"} Feb 26 17:15:10 crc kubenswrapper[5033]: I0226 17:15:10.466549 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-6b14-account-create-update-h6ct9" podStartSLOduration=1.466527737 podStartE2EDuration="1.466527737s" podCreationTimestamp="2026-02-26 17:15:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:15:10.462476689 +0000 UTC m=+5637.675075359" watchObservedRunningTime="2026-02-26 17:15:10.466527737 +0000 UTC m=+5637.679126377" Feb 26 17:15:10 crc kubenswrapper[5033]: I0226 17:15:10.482561 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-2kp7z" podStartSLOduration=1.482538528 podStartE2EDuration="1.482538528s" podCreationTimestamp="2026-02-26 17:15:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:15:10.477588527 +0000 UTC m=+5637.690187177" watchObservedRunningTime="2026-02-26 17:15:10.482538528 +0000 UTC m=+5637.695137178" Feb 26 17:15:11 crc 
kubenswrapper[5033]: I0226 17:15:11.459221 5033 generic.go:334] "Generic (PLEG): container finished" podID="fd8d70ca-5b67-45fa-a174-e47c57b6cfb2" containerID="39feeb0fd932078974be5d4f740c0c0c77216f5cb656b7291e4874dd5f696365" exitCode=0 Feb 26 17:15:11 crc kubenswrapper[5033]: I0226 17:15:11.459337 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6b14-account-create-update-h6ct9" event={"ID":"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2","Type":"ContainerDied","Data":"39feeb0fd932078974be5d4f740c0c0c77216f5cb656b7291e4874dd5f696365"} Feb 26 17:15:11 crc kubenswrapper[5033]: I0226 17:15:11.461909 5033 generic.go:334] "Generic (PLEG): container finished" podID="44db2df6-733f-49c7-84d0-6183262c9cc6" containerID="3f52b35551bda5b447877dc604a33e259f63fe02950a689c11f30c680f7a15cd" exitCode=0 Feb 26 17:15:11 crc kubenswrapper[5033]: I0226 17:15:11.461948 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2kp7z" event={"ID":"44db2df6-733f-49c7-84d0-6183262c9cc6","Type":"ContainerDied","Data":"3f52b35551bda5b447877dc604a33e259f63fe02950a689c11f30c680f7a15cd"} Feb 26 17:15:12 crc kubenswrapper[5033]: I0226 17:15:12.859531 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2kp7z" Feb 26 17:15:12 crc kubenswrapper[5033]: I0226 17:15:12.866583 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6b14-account-create-update-h6ct9" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.041395 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44db2df6-733f-49c7-84d0-6183262c9cc6-operator-scripts\") pod \"44db2df6-733f-49c7-84d0-6183262c9cc6\" (UID: \"44db2df6-733f-49c7-84d0-6183262c9cc6\") " Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.041446 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc7wm\" (UniqueName: \"kubernetes.io/projected/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-kube-api-access-tc7wm\") pod \"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2\" (UID: \"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2\") " Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.041515 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pj9m\" (UniqueName: \"kubernetes.io/projected/44db2df6-733f-49c7-84d0-6183262c9cc6-kube-api-access-7pj9m\") pod \"44db2df6-733f-49c7-84d0-6183262c9cc6\" (UID: \"44db2df6-733f-49c7-84d0-6183262c9cc6\") " Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.041552 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-operator-scripts\") pod \"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2\" (UID: \"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2\") " Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.042477 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44db2df6-733f-49c7-84d0-6183262c9cc6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44db2df6-733f-49c7-84d0-6183262c9cc6" (UID: "44db2df6-733f-49c7-84d0-6183262c9cc6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.042776 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd8d70ca-5b67-45fa-a174-e47c57b6cfb2" (UID: "fd8d70ca-5b67-45fa-a174-e47c57b6cfb2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.059599 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44db2df6-733f-49c7-84d0-6183262c9cc6-kube-api-access-7pj9m" (OuterVolumeSpecName: "kube-api-access-7pj9m") pod "44db2df6-733f-49c7-84d0-6183262c9cc6" (UID: "44db2df6-733f-49c7-84d0-6183262c9cc6"). InnerVolumeSpecName "kube-api-access-7pj9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.059757 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-kube-api-access-tc7wm" (OuterVolumeSpecName: "kube-api-access-tc7wm") pod "fd8d70ca-5b67-45fa-a174-e47c57b6cfb2" (UID: "fd8d70ca-5b67-45fa-a174-e47c57b6cfb2"). InnerVolumeSpecName "kube-api-access-tc7wm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.143469 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44db2df6-733f-49c7-84d0-6183262c9cc6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.143509 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc7wm\" (UniqueName: \"kubernetes.io/projected/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-kube-api-access-tc7wm\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.143524 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pj9m\" (UniqueName: \"kubernetes.io/projected/44db2df6-733f-49c7-84d0-6183262c9cc6-kube-api-access-7pj9m\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.143535 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.495884 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6b14-account-create-update-h6ct9" event={"ID":"fd8d70ca-5b67-45fa-a174-e47c57b6cfb2","Type":"ContainerDied","Data":"8855cb3aee985edcef00d52bd5f948e7841359ba50cf5d9ab4807b13234105f6"} Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.496041 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8855cb3aee985edcef00d52bd5f948e7841359ba50cf5d9ab4807b13234105f6" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.496309 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6b14-account-create-update-h6ct9" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.506821 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2kp7z" event={"ID":"44db2df6-733f-49c7-84d0-6183262c9cc6","Type":"ContainerDied","Data":"c8106ef103a591eba6205cb36037a5ed32d78c3a03b95494b34a40627e1c5407"} Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.506888 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2kp7z" Feb 26 17:15:13 crc kubenswrapper[5033]: I0226 17:15:13.506906 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8106ef103a591eba6205cb36037a5ed32d78c3a03b95494b34a40627e1c5407" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.555400 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-qhct7"] Feb 26 17:15:14 crc kubenswrapper[5033]: E0226 17:15:14.558907 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44db2df6-733f-49c7-84d0-6183262c9cc6" containerName="mariadb-database-create" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.558935 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="44db2df6-733f-49c7-84d0-6183262c9cc6" containerName="mariadb-database-create" Feb 26 17:15:14 crc kubenswrapper[5033]: E0226 17:15:14.558964 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8d70ca-5b67-45fa-a174-e47c57b6cfb2" containerName="mariadb-account-create-update" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.558975 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8d70ca-5b67-45fa-a174-e47c57b6cfb2" containerName="mariadb-account-create-update" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.559201 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="44db2df6-733f-49c7-84d0-6183262c9cc6" containerName="mariadb-database-create" Feb 26 
17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.559232 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd8d70ca-5b67-45fa-a174-e47c57b6cfb2" containerName="mariadb-account-create-update" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.559930 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.562559 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.563179 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xlkhb" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.568222 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qhct7"] Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.673948 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-config-data\") pod \"glance-db-sync-qhct7\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.673999 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-db-sync-config-data\") pod \"glance-db-sync-qhct7\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.674071 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmzx\" (UniqueName: \"kubernetes.io/projected/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-kube-api-access-4vmzx\") pod \"glance-db-sync-qhct7\" (UID: 
\"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.674439 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-combined-ca-bundle\") pod \"glance-db-sync-qhct7\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.775812 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-config-data\") pod \"glance-db-sync-qhct7\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.775859 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-db-sync-config-data\") pod \"glance-db-sync-qhct7\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.775929 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmzx\" (UniqueName: \"kubernetes.io/projected/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-kube-api-access-4vmzx\") pod \"glance-db-sync-qhct7\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.775986 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-combined-ca-bundle\") pod \"glance-db-sync-qhct7\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 
17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.781069 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-combined-ca-bundle\") pod \"glance-db-sync-qhct7\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.783686 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-db-sync-config-data\") pod \"glance-db-sync-qhct7\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.784747 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-config-data\") pod \"glance-db-sync-qhct7\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.792709 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmzx\" (UniqueName: \"kubernetes.io/projected/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-kube-api-access-4vmzx\") pod \"glance-db-sync-qhct7\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") " pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:14 crc kubenswrapper[5033]: I0226 17:15:14.887254 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qhct7" Feb 26 17:15:15 crc kubenswrapper[5033]: I0226 17:15:15.405669 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qhct7"] Feb 26 17:15:15 crc kubenswrapper[5033]: W0226 17:15:15.412532 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef21f49e_cefa_4d4d_9675_1c9d30a78aaa.slice/crio-55ae1d2c239cd870120328fba678f436f24f82956ad4765c3098682e579b3667 WatchSource:0}: Error finding container 55ae1d2c239cd870120328fba678f436f24f82956ad4765c3098682e579b3667: Status 404 returned error can't find the container with id 55ae1d2c239cd870120328fba678f436f24f82956ad4765c3098682e579b3667 Feb 26 17:15:15 crc kubenswrapper[5033]: I0226 17:15:15.531117 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qhct7" event={"ID":"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa","Type":"ContainerStarted","Data":"55ae1d2c239cd870120328fba678f436f24f82956ad4765c3098682e579b3667"} Feb 26 17:15:16 crc kubenswrapper[5033]: I0226 17:15:16.554607 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qhct7" event={"ID":"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa","Type":"ContainerStarted","Data":"eb7a13789208c19e4a36e6654620cc293b75bf65afdfd02bd5b8d8baa9f1b6ce"} Feb 26 17:15:16 crc kubenswrapper[5033]: I0226 17:15:16.570593 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-qhct7" podStartSLOduration=2.570577044 podStartE2EDuration="2.570577044s" podCreationTimestamp="2026-02-26 17:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:15:16.566966695 +0000 UTC m=+5643.779565345" watchObservedRunningTime="2026-02-26 17:15:16.570577044 +0000 UTC m=+5643.783175694" Feb 26 17:15:19 crc kubenswrapper[5033]: I0226 17:15:19.604670 
5033 generic.go:334] "Generic (PLEG): container finished" podID="ef21f49e-cefa-4d4d-9675-1c9d30a78aaa" containerID="eb7a13789208c19e4a36e6654620cc293b75bf65afdfd02bd5b8d8baa9f1b6ce" exitCode=0
Feb 26 17:15:19 crc kubenswrapper[5033]: I0226 17:15:19.604751 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qhct7" event={"ID":"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa","Type":"ContainerDied","Data":"eb7a13789208c19e4a36e6654620cc293b75bf65afdfd02bd5b8d8baa9f1b6ce"}
Feb 26 17:15:20 crc kubenswrapper[5033]: I0226 17:15:20.995287 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qhct7"
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.086731 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-config-data\") pod \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") "
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.086843 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-combined-ca-bundle\") pod \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") "
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.086899 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-db-sync-config-data\") pod \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") "
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.086924 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vmzx\" (UniqueName: \"kubernetes.io/projected/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-kube-api-access-4vmzx\") pod \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\" (UID: \"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa\") "
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.093194 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ef21f49e-cefa-4d4d-9675-1c9d30a78aaa" (UID: "ef21f49e-cefa-4d4d-9675-1c9d30a78aaa"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.093838 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-kube-api-access-4vmzx" (OuterVolumeSpecName: "kube-api-access-4vmzx") pod "ef21f49e-cefa-4d4d-9675-1c9d30a78aaa" (UID: "ef21f49e-cefa-4d4d-9675-1c9d30a78aaa"). InnerVolumeSpecName "kube-api-access-4vmzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.139828 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef21f49e-cefa-4d4d-9675-1c9d30a78aaa" (UID: "ef21f49e-cefa-4d4d-9675-1c9d30a78aaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.145602 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-config-data" (OuterVolumeSpecName: "config-data") pod "ef21f49e-cefa-4d4d-9675-1c9d30a78aaa" (UID: "ef21f49e-cefa-4d4d-9675-1c9d30a78aaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.188840 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.188880 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.188890 5033 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.188900 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vmzx\" (UniqueName: \"kubernetes.io/projected/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa-kube-api-access-4vmzx\") on node \"crc\" DevicePath \"\""
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.631011 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qhct7"
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.648795 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qhct7" event={"ID":"ef21f49e-cefa-4d4d-9675-1c9d30a78aaa","Type":"ContainerDied","Data":"55ae1d2c239cd870120328fba678f436f24f82956ad4765c3098682e579b3667"}
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.648839 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55ae1d2c239cd870120328fba678f436f24f82956ad4765c3098682e579b3667"
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.926051 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75f5f55c85-4gs2z"]
Feb 26 17:15:21 crc kubenswrapper[5033]: E0226 17:15:21.928744 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef21f49e-cefa-4d4d-9675-1c9d30a78aaa" containerName="glance-db-sync"
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.928783 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef21f49e-cefa-4d4d-9675-1c9d30a78aaa" containerName="glance-db-sync"
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.929005 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef21f49e-cefa-4d4d-9675-1c9d30a78aaa" containerName="glance-db-sync"
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.930158 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.943037 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75f5f55c85-4gs2z"]
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.981425 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.982861 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.987740 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xlkhb"
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.988042 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.988053 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 26 17:15:21 crc kubenswrapper[5033]: I0226 17:15:21.988163 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.001357 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.090995 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.101291 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.103495 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwcsv\" (UniqueName: \"kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-kube-api-access-rwcsv\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.103725 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-config\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.103827 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-nb\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.103946 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-ceph\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.104082 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-config-data\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.104191 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-sb\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.106048 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-dns-svc\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.106216 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-scripts\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.106375 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.106640 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.106747 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-logs\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.106854 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n64v\" (UniqueName: \"kubernetes.io/projected/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-kube-api-access-9n64v\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.106769 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.142054 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208101 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-ceph\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208152 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-config-data\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208184 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-sb\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208206 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-dns-svc\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208236 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-scripts\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208287 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208349 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s29sh\" (UniqueName: \"kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-kube-api-access-s29sh\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208395 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208424 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208446 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-logs\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208472 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-logs\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208495 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n64v\" (UniqueName: \"kubernetes.io/projected/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-kube-api-access-9n64v\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208516 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208545 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208572 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208608 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwcsv\" (UniqueName: \"kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-kube-api-access-rwcsv\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208653 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208675 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-config\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.208696 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-nb\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.209435 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-logs\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.209547 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-nb\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.209888 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.210072 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-sb\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.210599 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-dns-svc\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.217471 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-config\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.217789 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-ceph\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.218027 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-scripts\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.221550 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.225417 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-config-data\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.242056 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n64v\" (UniqueName: \"kubernetes.io/projected/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-kube-api-access-9n64v\") pod \"dnsmasq-dns-75f5f55c85-4gs2z\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") " pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.254874 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwcsv\" (UniqueName: \"kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-kube-api-access-rwcsv\") pod \"glance-default-external-api-0\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.265736 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.306628 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.310020 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s29sh\" (UniqueName: \"kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-kube-api-access-s29sh\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.310071 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.310095 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-logs\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.310116 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.310136 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.310153 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.310206 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.311105 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-logs\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.311321 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.314245 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.315762 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.318243 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.320699 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.331318 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s29sh\" (UniqueName: \"kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-kube-api-access-s29sh\") pod \"glance-default-internal-api-0\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.427793 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.841669 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75f5f55c85-4gs2z"]
Feb 26 17:15:22 crc kubenswrapper[5033]: I0226 17:15:22.944049 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 17:15:23 crc kubenswrapper[5033]: I0226 17:15:23.024117 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 17:15:23 crc kubenswrapper[5033]: I0226 17:15:23.104764 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 17:15:23 crc kubenswrapper[5033]: W0226 17:15:23.156107 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55911bc3_f18d_4b68_a56a_0ca95d54e95c.slice/crio-0dd48ec707530437f1127a140ff9e3ad4035ec6c1680a41a1dc4ff9059788836 WatchSource:0}: Error finding container 0dd48ec707530437f1127a140ff9e3ad4035ec6c1680a41a1dc4ff9059788836: Status 404 returned error can't find the container with id 0dd48ec707530437f1127a140ff9e3ad4035ec6c1680a41a1dc4ff9059788836
Feb 26 17:15:23 crc kubenswrapper[5033]: I0226 17:15:23.665395 5033 generic.go:334] "Generic (PLEG): container finished" podID="5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" containerID="fa8abaf51a660c845e618f702f1ca52fb31e097a6b3b75a0f143e05b36e051ec" exitCode=0
Feb 26 17:15:23 crc kubenswrapper[5033]: I0226 17:15:23.665700 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z" event={"ID":"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403","Type":"ContainerDied","Data":"fa8abaf51a660c845e618f702f1ca52fb31e097a6b3b75a0f143e05b36e051ec"}
Feb 26 17:15:23 crc kubenswrapper[5033]: I0226 17:15:23.665729 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z" event={"ID":"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403","Type":"ContainerStarted","Data":"ad3be7cabc74bfe98fa2c9650fed7d763ba4909c94ce09a7b177b114ef2a8b1a"}
Feb 26 17:15:23 crc kubenswrapper[5033]: I0226 17:15:23.674935 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102","Type":"ContainerStarted","Data":"7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60"}
Feb 26 17:15:23 crc kubenswrapper[5033]: I0226 17:15:23.675014 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102","Type":"ContainerStarted","Data":"da185f06f06976491faafe1cea9a893922922b22a8f14449b06fd437d2091b21"}
Feb 26 17:15:23 crc kubenswrapper[5033]: I0226 17:15:23.680773 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"55911bc3-f18d-4b68-a56a-0ca95d54e95c","Type":"ContainerStarted","Data":"0dd48ec707530437f1127a140ff9e3ad4035ec6c1680a41a1dc4ff9059788836"}
Feb 26 17:15:24 crc kubenswrapper[5033]: I0226 17:15:24.699797 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102","Type":"ContainerStarted","Data":"62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d"}
Feb 26 17:15:24 crc kubenswrapper[5033]: I0226 17:15:24.699864 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" containerName="glance-log" containerID="cri-o://7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60" gracePeriod=30
Feb 26 17:15:24 crc kubenswrapper[5033]: I0226 17:15:24.699916 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" containerName="glance-httpd" containerID="cri-o://62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d" gracePeriod=30
Feb 26 17:15:24 crc kubenswrapper[5033]: I0226 17:15:24.704785 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"55911bc3-f18d-4b68-a56a-0ca95d54e95c","Type":"ContainerStarted","Data":"b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc"}
Feb 26 17:15:24 crc kubenswrapper[5033]: I0226 17:15:24.704828 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"55911bc3-f18d-4b68-a56a-0ca95d54e95c","Type":"ContainerStarted","Data":"0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c"}
Feb 26 17:15:24 crc kubenswrapper[5033]: I0226 17:15:24.710103 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z" event={"ID":"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403","Type":"ContainerStarted","Data":"fb7041a3ebeba7ccfda8b2483193ceb9cc30b7ec1e410e551260a8bd38c3f2cb"}
Feb 26 17:15:24 crc kubenswrapper[5033]: I0226 17:15:24.710250 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:15:24 crc kubenswrapper[5033]: I0226 17:15:24.732849 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.732802361 podStartE2EDuration="3.732802361s" podCreationTimestamp="2026-02-26 17:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:15:24.727241965 +0000 UTC m=+5651.939840615" watchObservedRunningTime="2026-02-26 17:15:24.732802361 +0000 UTC m=+5651.945401011"
Feb 26 17:15:24 crc kubenswrapper[5033]: I0226 17:15:24.758063 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 17:15:24 crc kubenswrapper[5033]: I0226 17:15:24.774824 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.7748031749999997 podStartE2EDuration="2.774803175s" podCreationTimestamp="2026-02-26 17:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:15:24.771414263 +0000 UTC m=+5651.984012933" watchObservedRunningTime="2026-02-26 17:15:24.774803175 +0000 UTC m=+5651.987401825"
Feb 26 17:15:24 crc kubenswrapper[5033]: I0226 17:15:24.795313 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z" podStartSLOduration=3.795231324 podStartE2EDuration="3.795231324s" podCreationTimestamp="2026-02-26 17:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:15:24.786427669 +0000 UTC m=+5651.999026329" watchObservedRunningTime="2026-02-26 17:15:24.795231324 +0000 UTC m=+5652.007829974"
Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.302116 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.367349 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwcsv\" (UniqueName: \"kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-kube-api-access-rwcsv\") pod \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") "
Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.367394 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-ceph\") pod \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") "
Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.367425 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-httpd-run\") pod \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") "
Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.367990 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" (UID: "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102"). InnerVolumeSpecName "httpd-run".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.368066 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-config-data\") pod \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.368406 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-combined-ca-bundle\") pod \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.368543 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-scripts\") pod \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.368567 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-logs\") pod \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\" (UID: \"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102\") " Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.368992 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.369296 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-logs" (OuterVolumeSpecName: "logs") pod "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" (UID: 
"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.373418 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-kube-api-access-rwcsv" (OuterVolumeSpecName: "kube-api-access-rwcsv") pod "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" (UID: "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102"). InnerVolumeSpecName "kube-api-access-rwcsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.373871 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-scripts" (OuterVolumeSpecName: "scripts") pod "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" (UID: "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.377390 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-ceph" (OuterVolumeSpecName: "ceph") pod "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" (UID: "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.390860 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" (UID: "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.409131 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-config-data" (OuterVolumeSpecName: "config-data") pod "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" (UID: "cfb27f23-26e7-4f8e-ac4b-6d82ef65a102"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.470503 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-logs\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.470535 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.470546 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwcsv\" (UniqueName: \"kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-kube-api-access-rwcsv\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.470557 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.470566 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.470576 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.724768 5033 generic.go:334] "Generic (PLEG): container finished" podID="cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" containerID="62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d" exitCode=0 Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.725123 5033 generic.go:334] "Generic (PLEG): container finished" podID="cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" containerID="7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60" exitCode=143 Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.726211 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.726362 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102","Type":"ContainerDied","Data":"62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d"} Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.726397 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102","Type":"ContainerDied","Data":"7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60"} Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.726409 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cfb27f23-26e7-4f8e-ac4b-6d82ef65a102","Type":"ContainerDied","Data":"da185f06f06976491faafe1cea9a893922922b22a8f14449b06fd437d2091b21"} Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.726424 5033 scope.go:117] "RemoveContainer" containerID="62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 
17:15:25.754591 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.763928 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.773674 5033 scope.go:117] "RemoveContainer" containerID="7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.794376 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 17:15:25 crc kubenswrapper[5033]: E0226 17:15:25.794857 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" containerName="glance-httpd" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.794882 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" containerName="glance-httpd" Feb 26 17:15:25 crc kubenswrapper[5033]: E0226 17:15:25.794895 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" containerName="glance-log" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.794903 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" containerName="glance-log" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.795090 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" containerName="glance-httpd" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.795110 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" containerName="glance-log" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.796272 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.799946 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.807421 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.808757 5033 scope.go:117] "RemoveContainer" containerID="62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d" Feb 26 17:15:25 crc kubenswrapper[5033]: E0226 17:15:25.811111 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d\": container with ID starting with 62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d not found: ID does not exist" containerID="62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.811171 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d"} err="failed to get container status \"62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d\": rpc error: code = NotFound desc = could not find container \"62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d\": container with ID starting with 62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d not found: ID does not exist" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.811206 5033 scope.go:117] "RemoveContainer" containerID="7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60" Feb 26 17:15:25 crc kubenswrapper[5033]: E0226 17:15:25.825522 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60\": container with ID starting with 7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60 not found: ID does not exist" containerID="7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.825564 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60"} err="failed to get container status \"7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60\": rpc error: code = NotFound desc = could not find container \"7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60\": container with ID starting with 7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60 not found: ID does not exist" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.825587 5033 scope.go:117] "RemoveContainer" containerID="62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.830128 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d"} err="failed to get container status \"62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d\": rpc error: code = NotFound desc = could not find container \"62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d\": container with ID starting with 62beb05a493f7d6d068094fddf6929cba0fbdd3b4ff33e2ba311a5b83cdc3e1d not found: ID does not exist" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.830167 5033 scope.go:117] "RemoveContainer" containerID="7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.831004 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60"} err="failed to get container status \"7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60\": rpc error: code = NotFound desc = could not find container \"7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60\": container with ID starting with 7bc296f0c7ef453e56ef5414e537502fa35f489b0cd696679e595dbd23678e60 not found: ID does not exist" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.876925 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-logs\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.876974 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-ceph\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.877002 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.877024 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.877058 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.877085 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv88l\" (UniqueName: \"kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-kube-api-access-fv88l\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.877117 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.978178 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.978321 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-logs\") pod \"glance-default-external-api-0\" (UID: 
\"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.978384 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-ceph\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.978409 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.978425 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.978458 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.978483 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv88l\" (UniqueName: \"kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-kube-api-access-fv88l\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " 
pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.979800 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-logs\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.979906 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.982822 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.983090 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.983860 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-ceph\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.993450 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-config-data\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:25 crc kubenswrapper[5033]: I0226 17:15:25.996340 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv88l\" (UniqueName: \"kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-kube-api-access-fv88l\") pod \"glance-default-external-api-0\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " pod="openstack/glance-default-external-api-0" Feb 26 17:15:26 crc kubenswrapper[5033]: I0226 17:15:26.130360 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 17:15:26 crc kubenswrapper[5033]: W0226 17:15:26.684799 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef6f86d8_6fcf_41cf_b941_975937a1107f.slice/crio-a7eb0a7208ae0c73ece38b5da2adb7cd97e3e026aa4a3f9e5cff6b75fcd46656 WatchSource:0}: Error finding container a7eb0a7208ae0c73ece38b5da2adb7cd97e3e026aa4a3f9e5cff6b75fcd46656: Status 404 returned error can't find the container with id a7eb0a7208ae0c73ece38b5da2adb7cd97e3e026aa4a3f9e5cff6b75fcd46656 Feb 26 17:15:26 crc kubenswrapper[5033]: I0226 17:15:26.689077 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 17:15:26 crc kubenswrapper[5033]: I0226 17:15:26.734315 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6f86d8-6fcf-41cf-b941-975937a1107f","Type":"ContainerStarted","Data":"a7eb0a7208ae0c73ece38b5da2adb7cd97e3e026aa4a3f9e5cff6b75fcd46656"} Feb 26 17:15:26 crc kubenswrapper[5033]: I0226 17:15:26.736128 5033 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/glance-default-internal-api-0" podUID="55911bc3-f18d-4b68-a56a-0ca95d54e95c" containerName="glance-log" containerID="cri-o://0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c" gracePeriod=30 Feb 26 17:15:26 crc kubenswrapper[5033]: I0226 17:15:26.736251 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="55911bc3-f18d-4b68-a56a-0ca95d54e95c" containerName="glance-httpd" containerID="cri-o://b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc" gracePeriod=30 Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.356040 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.419857 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-httpd-run\") pod \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.419906 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-config-data\") pod \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.419988 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-logs\") pod \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.420066 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-combined-ca-bundle\") pod \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.420124 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s29sh\" (UniqueName: \"kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-kube-api-access-s29sh\") pod \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.420152 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-scripts\") pod \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.420209 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-ceph\") pod \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\" (UID: \"55911bc3-f18d-4b68-a56a-0ca95d54e95c\") " Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.420472 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-logs" (OuterVolumeSpecName: "logs") pod "55911bc3-f18d-4b68-a56a-0ca95d54e95c" (UID: "55911bc3-f18d-4b68-a56a-0ca95d54e95c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.420509 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "55911bc3-f18d-4b68-a56a-0ca95d54e95c" (UID: "55911bc3-f18d-4b68-a56a-0ca95d54e95c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.424595 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-kube-api-access-s29sh" (OuterVolumeSpecName: "kube-api-access-s29sh") pod "55911bc3-f18d-4b68-a56a-0ca95d54e95c" (UID: "55911bc3-f18d-4b68-a56a-0ca95d54e95c"). InnerVolumeSpecName "kube-api-access-s29sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.438729 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-ceph" (OuterVolumeSpecName: "ceph") pod "55911bc3-f18d-4b68-a56a-0ca95d54e95c" (UID: "55911bc3-f18d-4b68-a56a-0ca95d54e95c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.439608 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-scripts" (OuterVolumeSpecName: "scripts") pod "55911bc3-f18d-4b68-a56a-0ca95d54e95c" (UID: "55911bc3-f18d-4b68-a56a-0ca95d54e95c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.448487 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55911bc3-f18d-4b68-a56a-0ca95d54e95c" (UID: "55911bc3-f18d-4b68-a56a-0ca95d54e95c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.476984 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-config-data" (OuterVolumeSpecName: "config-data") pod "55911bc3-f18d-4b68-a56a-0ca95d54e95c" (UID: "55911bc3-f18d-4b68-a56a-0ca95d54e95c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.522400 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-logs\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.522448 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.522463 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s29sh\" (UniqueName: \"kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-kube-api-access-s29sh\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.522474 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.522483 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/55911bc3-f18d-4b68-a56a-0ca95d54e95c-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.522491 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/55911bc3-f18d-4b68-a56a-0ca95d54e95c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.522499 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55911bc3-f18d-4b68-a56a-0ca95d54e95c-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.638767 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb27f23-26e7-4f8e-ac4b-6d82ef65a102" path="/var/lib/kubelet/pods/cfb27f23-26e7-4f8e-ac4b-6d82ef65a102/volumes" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.745427 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6f86d8-6fcf-41cf-b941-975937a1107f","Type":"ContainerStarted","Data":"3394bc8b51e86c92d9183a7166dc1a9e222a4f0e3d8239a1428859f05de2973d"} Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.748160 5033 generic.go:334] "Generic (PLEG): container finished" podID="55911bc3-f18d-4b68-a56a-0ca95d54e95c" containerID="b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc" exitCode=0 Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.748187 5033 generic.go:334] "Generic (PLEG): container finished" podID="55911bc3-f18d-4b68-a56a-0ca95d54e95c" containerID="0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c" exitCode=143 Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.748208 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"55911bc3-f18d-4b68-a56a-0ca95d54e95c","Type":"ContainerDied","Data":"b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc"} Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.748215 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.748234 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"55911bc3-f18d-4b68-a56a-0ca95d54e95c","Type":"ContainerDied","Data":"0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c"} Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.748246 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"55911bc3-f18d-4b68-a56a-0ca95d54e95c","Type":"ContainerDied","Data":"0dd48ec707530437f1127a140ff9e3ad4035ec6c1680a41a1dc4ff9059788836"} Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.748277 5033 scope.go:117] "RemoveContainer" containerID="b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.778155 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.789727 5033 scope.go:117] "RemoveContainer" containerID="0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.790756 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.798619 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 17:15:27 crc kubenswrapper[5033]: E0226 17:15:27.798952 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55911bc3-f18d-4b68-a56a-0ca95d54e95c" containerName="glance-httpd" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.798970 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="55911bc3-f18d-4b68-a56a-0ca95d54e95c" containerName="glance-httpd" Feb 26 17:15:27 crc kubenswrapper[5033]: E0226 
17:15:27.799003 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55911bc3-f18d-4b68-a56a-0ca95d54e95c" containerName="glance-log" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.799010 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="55911bc3-f18d-4b68-a56a-0ca95d54e95c" containerName="glance-log" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.799182 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="55911bc3-f18d-4b68-a56a-0ca95d54e95c" containerName="glance-log" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.799210 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="55911bc3-f18d-4b68-a56a-0ca95d54e95c" containerName="glance-httpd" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.800363 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.802181 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.823785 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.826599 5033 scope.go:117] "RemoveContainer" containerID="b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc" Feb 26 17:15:27 crc kubenswrapper[5033]: E0226 17:15:27.828664 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc\": container with ID starting with b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc not found: ID does not exist" containerID="b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.828795 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc"} err="failed to get container status \"b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc\": rpc error: code = NotFound desc = could not find container \"b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc\": container with ID starting with b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc not found: ID does not exist" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.828912 5033 scope.go:117] "RemoveContainer" containerID="0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c" Feb 26 17:15:27 crc kubenswrapper[5033]: E0226 17:15:27.829287 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c\": container with ID starting with 0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c not found: ID does not exist" containerID="0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.829328 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c"} err="failed to get container status \"0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c\": rpc error: code = NotFound desc = could not find container \"0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c\": container with ID starting with 0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c not found: ID does not exist" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.829357 5033 scope.go:117] "RemoveContainer" containerID="b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 
17:15:27.829629 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc"} err="failed to get container status \"b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc\": rpc error: code = NotFound desc = could not find container \"b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc\": container with ID starting with b43a702836ab8a42efa0c44b2ea203d70b9819e24a17ab2345c4db29312650fc not found: ID does not exist" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.829666 5033 scope.go:117] "RemoveContainer" containerID="0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.829999 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c"} err="failed to get container status \"0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c\": rpc error: code = NotFound desc = could not find container \"0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c\": container with ID starting with 0beb03a56808f597393936362021013e0be1ccc7d787f5e9a32651f681a10a9c not found: ID does not exist" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.931044 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.931317 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-ceph\") pod 
\"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.931425 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-logs\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.931621 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-config-data\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.931679 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s6d8\" (UniqueName: \"kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-kube-api-access-7s6d8\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.931744 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-scripts\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:27 crc kubenswrapper[5033]: I0226 17:15:27.931835 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.033406 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.033746 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-ceph\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.033849 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-logs\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.033957 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-config-data\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.034041 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s6d8\" (UniqueName: \"kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-kube-api-access-7s6d8\") pod 
\"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.034141 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-scripts\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.034283 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.035507 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.035636 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-logs\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.039532 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-scripts\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " 
pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.042345 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-config-data\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.043511 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.043864 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-ceph\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.055789 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s6d8\" (UniqueName: \"kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-kube-api-access-7s6d8\") pod \"glance-default-internal-api-0\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.127318 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.704147 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.763186 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6f86d8-6fcf-41cf-b941-975937a1107f","Type":"ContainerStarted","Data":"ede5c075a69ef5eafd8a11e1d92ef436d277ecd12d2dec6c885417a9024961db"} Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.766049 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"015270d2-adee-49f0-b82e-694fe6f99a02","Type":"ContainerStarted","Data":"8cc1f22b90e8d2e242afcfd15cf162ce110524b432857cab65abd17da37ba8a4"} Feb 26 17:15:28 crc kubenswrapper[5033]: I0226 17:15:28.790848 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.790828811 podStartE2EDuration="3.790828811s" podCreationTimestamp="2026-02-26 17:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:15:28.785102781 +0000 UTC m=+5655.997701441" watchObservedRunningTime="2026-02-26 17:15:28.790828811 +0000 UTC m=+5656.003427471" Feb 26 17:15:29 crc kubenswrapper[5033]: I0226 17:15:29.639153 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55911bc3-f18d-4b68-a56a-0ca95d54e95c" path="/var/lib/kubelet/pods/55911bc3-f18d-4b68-a56a-0ca95d54e95c/volumes" Feb 26 17:15:29 crc kubenswrapper[5033]: I0226 17:15:29.782570 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"015270d2-adee-49f0-b82e-694fe6f99a02","Type":"ContainerStarted","Data":"48829ba27463616e92bfb0c2f4536c0bf955bd61a3b84796c2b4335d01023841"} Feb 26 17:15:30 crc kubenswrapper[5033]: I0226 17:15:30.790815 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"015270d2-adee-49f0-b82e-694fe6f99a02","Type":"ContainerStarted","Data":"4b8827271304642b8b65add542278af58cfa41d160b1f880cf6058c58ad6ab2a"} Feb 26 17:15:30 crc kubenswrapper[5033]: I0226 17:15:30.827975 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.827956239 podStartE2EDuration="3.827956239s" podCreationTimestamp="2026-02-26 17:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:15:30.824641808 +0000 UTC m=+5658.037240458" watchObservedRunningTime="2026-02-26 17:15:30.827956239 +0000 UTC m=+5658.040554899" Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.267802 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z" Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.350236 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69599894df-tzflt"] Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.350637 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69599894df-tzflt" podUID="17dbb140-32e8-4741-a910-462f0b725d67" containerName="dnsmasq-dns" containerID="cri-o://f59d32c4cba61d1471666e03d7a62dcde01129f8519a25ceac421d86f7e99f57" gracePeriod=10 Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.810518 5033 generic.go:334] "Generic (PLEG): container finished" podID="17dbb140-32e8-4741-a910-462f0b725d67" containerID="f59d32c4cba61d1471666e03d7a62dcde01129f8519a25ceac421d86f7e99f57" exitCode=0 
Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.810718 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69599894df-tzflt" event={"ID":"17dbb140-32e8-4741-a910-462f0b725d67","Type":"ContainerDied","Data":"f59d32c4cba61d1471666e03d7a62dcde01129f8519a25ceac421d86f7e99f57"} Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.810843 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69599894df-tzflt" event={"ID":"17dbb140-32e8-4741-a910-462f0b725d67","Type":"ContainerDied","Data":"6f64a52b2fee01271990a89c173536cb7ec53caafe6f326bfc3ece0c9768df98"} Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.810864 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f64a52b2fee01271990a89c173536cb7ec53caafe6f326bfc3ece0c9768df98" Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.818043 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.931366 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-dns-svc\") pod \"17dbb140-32e8-4741-a910-462f0b725d67\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.931597 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-nb\") pod \"17dbb140-32e8-4741-a910-462f0b725d67\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.931621 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4hwd\" (UniqueName: \"kubernetes.io/projected/17dbb140-32e8-4741-a910-462f0b725d67-kube-api-access-g4hwd\") 
pod \"17dbb140-32e8-4741-a910-462f0b725d67\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.931652 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-config\") pod \"17dbb140-32e8-4741-a910-462f0b725d67\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.931741 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-sb\") pod \"17dbb140-32e8-4741-a910-462f0b725d67\" (UID: \"17dbb140-32e8-4741-a910-462f0b725d67\") " Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.938535 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17dbb140-32e8-4741-a910-462f0b725d67-kube-api-access-g4hwd" (OuterVolumeSpecName: "kube-api-access-g4hwd") pod "17dbb140-32e8-4741-a910-462f0b725d67" (UID: "17dbb140-32e8-4741-a910-462f0b725d67"). InnerVolumeSpecName "kube-api-access-g4hwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.974230 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "17dbb140-32e8-4741-a910-462f0b725d67" (UID: "17dbb140-32e8-4741-a910-462f0b725d67"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.975830 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "17dbb140-32e8-4741-a910-462f0b725d67" (UID: "17dbb140-32e8-4741-a910-462f0b725d67"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.989777 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-config" (OuterVolumeSpecName: "config") pod "17dbb140-32e8-4741-a910-462f0b725d67" (UID: "17dbb140-32e8-4741-a910-462f0b725d67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:15:32 crc kubenswrapper[5033]: I0226 17:15:32.991413 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17dbb140-32e8-4741-a910-462f0b725d67" (UID: "17dbb140-32e8-4741-a910-462f0b725d67"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:15:33 crc kubenswrapper[5033]: I0226 17:15:33.033415 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4hwd\" (UniqueName: \"kubernetes.io/projected/17dbb140-32e8-4741-a910-462f0b725d67-kube-api-access-g4hwd\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:33 crc kubenswrapper[5033]: I0226 17:15:33.033449 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:33 crc kubenswrapper[5033]: I0226 17:15:33.033460 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:33 crc kubenswrapper[5033]: I0226 17:15:33.033470 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:33 crc kubenswrapper[5033]: I0226 17:15:33.033480 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17dbb140-32e8-4741-a910-462f0b725d67-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:33 crc kubenswrapper[5033]: I0226 17:15:33.824227 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69599894df-tzflt" Feb 26 17:15:33 crc kubenswrapper[5033]: I0226 17:15:33.855400 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69599894df-tzflt"] Feb 26 17:15:33 crc kubenswrapper[5033]: I0226 17:15:33.862759 5033 scope.go:117] "RemoveContainer" containerID="735649ec8dc2e420d393f9746aaaca6c864f7bca67aebbbd2f5931365f50469d" Feb 26 17:15:33 crc kubenswrapper[5033]: I0226 17:15:33.870578 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69599894df-tzflt"] Feb 26 17:15:33 crc kubenswrapper[5033]: I0226 17:15:33.893588 5033 scope.go:117] "RemoveContainer" containerID="eeffbfcc3bfc323ab786e3eb99a82574482bc9f6f05b168523e4837b23bfb3a0" Feb 26 17:15:35 crc kubenswrapper[5033]: I0226 17:15:35.647224 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17dbb140-32e8-4741-a910-462f0b725d67" path="/var/lib/kubelet/pods/17dbb140-32e8-4741-a910-462f0b725d67/volumes" Feb 26 17:15:36 crc kubenswrapper[5033]: I0226 17:15:36.130614 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 17:15:36 crc kubenswrapper[5033]: I0226 17:15:36.130703 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 17:15:36 crc kubenswrapper[5033]: I0226 17:15:36.168040 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 17:15:36 crc kubenswrapper[5033]: I0226 17:15:36.172760 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 17:15:36 crc kubenswrapper[5033]: I0226 17:15:36.849954 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 17:15:36 crc kubenswrapper[5033]: I0226 17:15:36.850163 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 17:15:38 crc kubenswrapper[5033]: I0226 17:15:38.128623 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:38 crc kubenswrapper[5033]: I0226 17:15:38.128666 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:38 crc kubenswrapper[5033]: I0226 17:15:38.168682 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:38 crc kubenswrapper[5033]: I0226 17:15:38.169092 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:38 crc kubenswrapper[5033]: I0226 17:15:38.772100 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 17:15:38 crc kubenswrapper[5033]: I0226 17:15:38.839404 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 17:15:38 crc kubenswrapper[5033]: I0226 17:15:38.896463 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:38 crc kubenswrapper[5033]: I0226 17:15:38.896781 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:40 crc kubenswrapper[5033]: I0226 17:15:40.719202 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:40 crc kubenswrapper[5033]: I0226 17:15:40.720742 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.574623 
5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x7mjt"] Feb 26 17:15:42 crc kubenswrapper[5033]: E0226 17:15:42.575371 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17dbb140-32e8-4741-a910-462f0b725d67" containerName="init" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.575390 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="17dbb140-32e8-4741-a910-462f0b725d67" containerName="init" Feb 26 17:15:42 crc kubenswrapper[5033]: E0226 17:15:42.575416 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17dbb140-32e8-4741-a910-462f0b725d67" containerName="dnsmasq-dns" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.575424 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="17dbb140-32e8-4741-a910-462f0b725d67" containerName="dnsmasq-dns" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.575630 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="17dbb140-32e8-4741-a910-462f0b725d67" containerName="dnsmasq-dns" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.599853 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.608281 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-catalog-content\") pod \"redhat-marketplace-x7mjt\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.608400 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-utilities\") pod \"redhat-marketplace-x7mjt\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.608437 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcglp\" (UniqueName: \"kubernetes.io/projected/bf100af4-901f-4268-9109-a672177ea41c-kube-api-access-gcglp\") pod \"redhat-marketplace-x7mjt\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.609801 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7mjt"] Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.709756 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-catalog-content\") pod \"redhat-marketplace-x7mjt\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.710194 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-utilities\") pod \"redhat-marketplace-x7mjt\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.710248 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcglp\" (UniqueName: \"kubernetes.io/projected/bf100af4-901f-4268-9109-a672177ea41c-kube-api-access-gcglp\") pod \"redhat-marketplace-x7mjt\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.710869 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-catalog-content\") pod \"redhat-marketplace-x7mjt\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.710921 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-utilities\") pod \"redhat-marketplace-x7mjt\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.744977 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcglp\" (UniqueName: \"kubernetes.io/projected/bf100af4-901f-4268-9109-a672177ea41c-kube-api-access-gcglp\") pod \"redhat-marketplace-x7mjt\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:42 crc kubenswrapper[5033]: I0226 17:15:42.942400 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:43 crc kubenswrapper[5033]: I0226 17:15:43.451874 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7mjt"] Feb 26 17:15:43 crc kubenswrapper[5033]: W0226 17:15:43.456183 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf100af4_901f_4268_9109_a672177ea41c.slice/crio-1c201a14830c48ed55b281bd4717d2608e0bebfae6e4de89ebb20ac9fc1a36dd WatchSource:0}: Error finding container 1c201a14830c48ed55b281bd4717d2608e0bebfae6e4de89ebb20ac9fc1a36dd: Status 404 returned error can't find the container with id 1c201a14830c48ed55b281bd4717d2608e0bebfae6e4de89ebb20ac9fc1a36dd Feb 26 17:15:43 crc kubenswrapper[5033]: I0226 17:15:43.944818 5033 generic.go:334] "Generic (PLEG): container finished" podID="bf100af4-901f-4268-9109-a672177ea41c" containerID="c9c0b8e6d11d8bab8b90d37b6beb53a75101c20faecfb20bae1a3c2d9b74247d" exitCode=0 Feb 26 17:15:43 crc kubenswrapper[5033]: I0226 17:15:43.945072 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7mjt" event={"ID":"bf100af4-901f-4268-9109-a672177ea41c","Type":"ContainerDied","Data":"c9c0b8e6d11d8bab8b90d37b6beb53a75101c20faecfb20bae1a3c2d9b74247d"} Feb 26 17:15:43 crc kubenswrapper[5033]: I0226 17:15:43.945425 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7mjt" event={"ID":"bf100af4-901f-4268-9109-a672177ea41c","Type":"ContainerStarted","Data":"1c201a14830c48ed55b281bd4717d2608e0bebfae6e4de89ebb20ac9fc1a36dd"} Feb 26 17:15:45 crc kubenswrapper[5033]: I0226 17:15:45.969008 5033 generic.go:334] "Generic (PLEG): container finished" podID="bf100af4-901f-4268-9109-a672177ea41c" containerID="1757f315eaaabc7b6a6d154255743248d9ed67c69f469a2fe846a15edb75be94" exitCode=0 Feb 26 17:15:45 crc kubenswrapper[5033]: I0226 
17:15:45.969097 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7mjt" event={"ID":"bf100af4-901f-4268-9109-a672177ea41c","Type":"ContainerDied","Data":"1757f315eaaabc7b6a6d154255743248d9ed67c69f469a2fe846a15edb75be94"} Feb 26 17:15:46 crc kubenswrapper[5033]: I0226 17:15:46.978154 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7mjt" event={"ID":"bf100af4-901f-4268-9109-a672177ea41c","Type":"ContainerStarted","Data":"04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb"} Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.001701 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x7mjt" podStartSLOduration=2.587623128 podStartE2EDuration="5.001680174s" podCreationTimestamp="2026-02-26 17:15:42 +0000 UTC" firstStartedPulling="2026-02-26 17:15:43.947015986 +0000 UTC m=+5671.159614646" lastFinishedPulling="2026-02-26 17:15:46.361073022 +0000 UTC m=+5673.573671692" observedRunningTime="2026-02-26 17:15:46.993465363 +0000 UTC m=+5674.206064023" watchObservedRunningTime="2026-02-26 17:15:47.001680174 +0000 UTC m=+5674.214278824" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.064183 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7vnpp"] Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.065844 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7vnpp" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.100084 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-150c-account-create-update-zp46w"] Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.101834 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-150c-account-create-update-zp46w" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.104381 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.108854 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-150c-account-create-update-zp46w"] Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.115952 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7vnpp"] Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.197784 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a47002-d1e8-4687-adc2-9a34643775e3-operator-scripts\") pod \"placement-db-create-7vnpp\" (UID: \"f7a47002-d1e8-4687-adc2-9a34643775e3\") " pod="openstack/placement-db-create-7vnpp" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.198118 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrd88\" (UniqueName: \"kubernetes.io/projected/f7a47002-d1e8-4687-adc2-9a34643775e3-kube-api-access-xrd88\") pod \"placement-db-create-7vnpp\" (UID: \"f7a47002-d1e8-4687-adc2-9a34643775e3\") " pod="openstack/placement-db-create-7vnpp" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.300240 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrd88\" (UniqueName: \"kubernetes.io/projected/f7a47002-d1e8-4687-adc2-9a34643775e3-kube-api-access-xrd88\") pod \"placement-db-create-7vnpp\" (UID: \"f7a47002-d1e8-4687-adc2-9a34643775e3\") " pod="openstack/placement-db-create-7vnpp" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.300879 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/c3a52682-a5b9-41c1-a122-c251a1a69f5a-operator-scripts\") pod \"placement-150c-account-create-update-zp46w\" (UID: \"c3a52682-a5b9-41c1-a122-c251a1a69f5a\") " pod="openstack/placement-150c-account-create-update-zp46w" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.301004 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdtgb\" (UniqueName: \"kubernetes.io/projected/c3a52682-a5b9-41c1-a122-c251a1a69f5a-kube-api-access-sdtgb\") pod \"placement-150c-account-create-update-zp46w\" (UID: \"c3a52682-a5b9-41c1-a122-c251a1a69f5a\") " pod="openstack/placement-150c-account-create-update-zp46w" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.301123 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a47002-d1e8-4687-adc2-9a34643775e3-operator-scripts\") pod \"placement-db-create-7vnpp\" (UID: \"f7a47002-d1e8-4687-adc2-9a34643775e3\") " pod="openstack/placement-db-create-7vnpp" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.302019 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a47002-d1e8-4687-adc2-9a34643775e3-operator-scripts\") pod \"placement-db-create-7vnpp\" (UID: \"f7a47002-d1e8-4687-adc2-9a34643775e3\") " pod="openstack/placement-db-create-7vnpp" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.316990 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrd88\" (UniqueName: \"kubernetes.io/projected/f7a47002-d1e8-4687-adc2-9a34643775e3-kube-api-access-xrd88\") pod \"placement-db-create-7vnpp\" (UID: \"f7a47002-d1e8-4687-adc2-9a34643775e3\") " pod="openstack/placement-db-create-7vnpp" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.402234 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a52682-a5b9-41c1-a122-c251a1a69f5a-operator-scripts\") pod \"placement-150c-account-create-update-zp46w\" (UID: \"c3a52682-a5b9-41c1-a122-c251a1a69f5a\") " pod="openstack/placement-150c-account-create-update-zp46w" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.402312 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdtgb\" (UniqueName: \"kubernetes.io/projected/c3a52682-a5b9-41c1-a122-c251a1a69f5a-kube-api-access-sdtgb\") pod \"placement-150c-account-create-update-zp46w\" (UID: \"c3a52682-a5b9-41c1-a122-c251a1a69f5a\") " pod="openstack/placement-150c-account-create-update-zp46w" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.403139 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a52682-a5b9-41c1-a122-c251a1a69f5a-operator-scripts\") pod \"placement-150c-account-create-update-zp46w\" (UID: \"c3a52682-a5b9-41c1-a122-c251a1a69f5a\") " pod="openstack/placement-150c-account-create-update-zp46w" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.416226 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7vnpp" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.428590 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdtgb\" (UniqueName: \"kubernetes.io/projected/c3a52682-a5b9-41c1-a122-c251a1a69f5a-kube-api-access-sdtgb\") pod \"placement-150c-account-create-update-zp46w\" (UID: \"c3a52682-a5b9-41c1-a122-c251a1a69f5a\") " pod="openstack/placement-150c-account-create-update-zp46w" Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.728371 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-150c-account-create-update-zp46w" Feb 26 17:15:47 crc kubenswrapper[5033]: W0226 17:15:47.893599 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7a47002_d1e8_4687_adc2_9a34643775e3.slice/crio-89b53a2b8166acf464a0b74af6b41df6a53e9b7498b65601177032c115b39ac4 WatchSource:0}: Error finding container 89b53a2b8166acf464a0b74af6b41df6a53e9b7498b65601177032c115b39ac4: Status 404 returned error can't find the container with id 89b53a2b8166acf464a0b74af6b41df6a53e9b7498b65601177032c115b39ac4 Feb 26 17:15:47 crc kubenswrapper[5033]: I0226 17:15:47.894585 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7vnpp"] Feb 26 17:15:48 crc kubenswrapper[5033]: I0226 17:15:48.000552 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7vnpp" event={"ID":"f7a47002-d1e8-4687-adc2-9a34643775e3","Type":"ContainerStarted","Data":"89b53a2b8166acf464a0b74af6b41df6a53e9b7498b65601177032c115b39ac4"} Feb 26 17:15:48 crc kubenswrapper[5033]: I0226 17:15:48.228682 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-150c-account-create-update-zp46w"] Feb 26 17:15:49 crc kubenswrapper[5033]: I0226 17:15:49.010523 5033 generic.go:334] "Generic (PLEG): container finished" podID="c3a52682-a5b9-41c1-a122-c251a1a69f5a" containerID="d9b026180b470927a12c2159555297e04830e9eb4b396d091f3bfcd5f5ae1829" exitCode=0 Feb 26 17:15:49 crc kubenswrapper[5033]: I0226 17:15:49.010587 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-150c-account-create-update-zp46w" event={"ID":"c3a52682-a5b9-41c1-a122-c251a1a69f5a","Type":"ContainerDied","Data":"d9b026180b470927a12c2159555297e04830e9eb4b396d091f3bfcd5f5ae1829"} Feb 26 17:15:49 crc kubenswrapper[5033]: I0226 17:15:49.010614 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-150c-account-create-update-zp46w" event={"ID":"c3a52682-a5b9-41c1-a122-c251a1a69f5a","Type":"ContainerStarted","Data":"4b7927cbadd218e4b4f5a5aaaeee2b788d4daca811a202f3fbd2f7bca2f1253d"} Feb 26 17:15:49 crc kubenswrapper[5033]: I0226 17:15:49.012775 5033 generic.go:334] "Generic (PLEG): container finished" podID="f7a47002-d1e8-4687-adc2-9a34643775e3" containerID="d33f54c76f48b6a2978f86307569ec5f7151fc932b878b095567cdda3c8ca97c" exitCode=0 Feb 26 17:15:49 crc kubenswrapper[5033]: I0226 17:15:49.012804 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7vnpp" event={"ID":"f7a47002-d1e8-4687-adc2-9a34643775e3","Type":"ContainerDied","Data":"d33f54c76f48b6a2978f86307569ec5f7151fc932b878b095567cdda3c8ca97c"} Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.188940 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.189199 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.460522 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7vnpp" Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.467399 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-150c-account-create-update-zp46w" Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.566049 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrd88\" (UniqueName: \"kubernetes.io/projected/f7a47002-d1e8-4687-adc2-9a34643775e3-kube-api-access-xrd88\") pod \"f7a47002-d1e8-4687-adc2-9a34643775e3\" (UID: \"f7a47002-d1e8-4687-adc2-9a34643775e3\") " Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.566111 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a47002-d1e8-4687-adc2-9a34643775e3-operator-scripts\") pod \"f7a47002-d1e8-4687-adc2-9a34643775e3\" (UID: \"f7a47002-d1e8-4687-adc2-9a34643775e3\") " Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.566211 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdtgb\" (UniqueName: \"kubernetes.io/projected/c3a52682-a5b9-41c1-a122-c251a1a69f5a-kube-api-access-sdtgb\") pod \"c3a52682-a5b9-41c1-a122-c251a1a69f5a\" (UID: \"c3a52682-a5b9-41c1-a122-c251a1a69f5a\") " Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.566239 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a52682-a5b9-41c1-a122-c251a1a69f5a-operator-scripts\") pod \"c3a52682-a5b9-41c1-a122-c251a1a69f5a\" (UID: \"c3a52682-a5b9-41c1-a122-c251a1a69f5a\") " Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.566997 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3a52682-a5b9-41c1-a122-c251a1a69f5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3a52682-a5b9-41c1-a122-c251a1a69f5a" (UID: "c3a52682-a5b9-41c1-a122-c251a1a69f5a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.567112 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7a47002-d1e8-4687-adc2-9a34643775e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7a47002-d1e8-4687-adc2-9a34643775e3" (UID: "f7a47002-d1e8-4687-adc2-9a34643775e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.570872 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a47002-d1e8-4687-adc2-9a34643775e3-kube-api-access-xrd88" (OuterVolumeSpecName: "kube-api-access-xrd88") pod "f7a47002-d1e8-4687-adc2-9a34643775e3" (UID: "f7a47002-d1e8-4687-adc2-9a34643775e3"). InnerVolumeSpecName "kube-api-access-xrd88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.571405 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a52682-a5b9-41c1-a122-c251a1a69f5a-kube-api-access-sdtgb" (OuterVolumeSpecName: "kube-api-access-sdtgb") pod "c3a52682-a5b9-41c1-a122-c251a1a69f5a" (UID: "c3a52682-a5b9-41c1-a122-c251a1a69f5a"). InnerVolumeSpecName "kube-api-access-sdtgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.668488 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrd88\" (UniqueName: \"kubernetes.io/projected/f7a47002-d1e8-4687-adc2-9a34643775e3-kube-api-access-xrd88\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.668520 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a47002-d1e8-4687-adc2-9a34643775e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.668529 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdtgb\" (UniqueName: \"kubernetes.io/projected/c3a52682-a5b9-41c1-a122-c251a1a69f5a-kube-api-access-sdtgb\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:50 crc kubenswrapper[5033]: I0226 17:15:50.668537 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a52682-a5b9-41c1-a122-c251a1a69f5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:51 crc kubenswrapper[5033]: I0226 17:15:51.043324 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7vnpp" Feb 26 17:15:51 crc kubenswrapper[5033]: I0226 17:15:51.043342 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7vnpp" event={"ID":"f7a47002-d1e8-4687-adc2-9a34643775e3","Type":"ContainerDied","Data":"89b53a2b8166acf464a0b74af6b41df6a53e9b7498b65601177032c115b39ac4"} Feb 26 17:15:51 crc kubenswrapper[5033]: I0226 17:15:51.043391 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b53a2b8166acf464a0b74af6b41df6a53e9b7498b65601177032c115b39ac4" Feb 26 17:15:51 crc kubenswrapper[5033]: I0226 17:15:51.045799 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-150c-account-create-update-zp46w" event={"ID":"c3a52682-a5b9-41c1-a122-c251a1a69f5a","Type":"ContainerDied","Data":"4b7927cbadd218e4b4f5a5aaaeee2b788d4daca811a202f3fbd2f7bca2f1253d"} Feb 26 17:15:51 crc kubenswrapper[5033]: I0226 17:15:51.045988 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b7927cbadd218e4b4f5a5aaaeee2b788d4daca811a202f3fbd2f7bca2f1253d" Feb 26 17:15:51 crc kubenswrapper[5033]: I0226 17:15:51.045841 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-150c-account-create-update-zp46w" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.492226 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9f997b9c-rb87b"] Feb 26 17:15:52 crc kubenswrapper[5033]: E0226 17:15:52.492726 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a47002-d1e8-4687-adc2-9a34643775e3" containerName="mariadb-database-create" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.492744 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a47002-d1e8-4687-adc2-9a34643775e3" containerName="mariadb-database-create" Feb 26 17:15:52 crc kubenswrapper[5033]: E0226 17:15:52.492759 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a52682-a5b9-41c1-a122-c251a1a69f5a" containerName="mariadb-account-create-update" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.492768 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a52682-a5b9-41c1-a122-c251a1a69f5a" containerName="mariadb-account-create-update" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.493013 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a47002-d1e8-4687-adc2-9a34643775e3" containerName="mariadb-database-create" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.493024 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3a52682-a5b9-41c1-a122-c251a1a69f5a" containerName="mariadb-account-create-update" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.494148 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.514475 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9f997b9c-rb87b"] Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.533586 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-rm685"] Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.534675 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.539090 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.539342 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.539472 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-278nn" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.561219 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rm685"] Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.617995 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-config-data\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.618043 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ede1906-e96c-43de-9108-5ab8febc23d6-logs\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" 
Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.618068 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7dns\" (UniqueName: \"kubernetes.io/projected/858c2903-f2d4-487b-89c1-0d5f29774f01-kube-api-access-r7dns\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.618086 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-scripts\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.618105 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-dns-svc\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.618139 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-nb\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.618156 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-sb\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" 
Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.618176 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-combined-ca-bundle\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.618228 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d52tn\" (UniqueName: \"kubernetes.io/projected/2ede1906-e96c-43de-9108-5ab8febc23d6-kube-api-access-d52tn\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.618283 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-config\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.720269 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-config-data\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.720366 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ede1906-e96c-43de-9108-5ab8febc23d6-logs\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.720541 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7dns\" (UniqueName: \"kubernetes.io/projected/858c2903-f2d4-487b-89c1-0d5f29774f01-kube-api-access-r7dns\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.720585 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-scripts\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.720629 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-dns-svc\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.720727 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-nb\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.720776 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-sb\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.720824 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-combined-ca-bundle\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.720930 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d52tn\" (UniqueName: \"kubernetes.io/projected/2ede1906-e96c-43de-9108-5ab8febc23d6-kube-api-access-d52tn\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.720976 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-config\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.721761 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ede1906-e96c-43de-9108-5ab8febc23d6-logs\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.722413 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-nb\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.722427 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-sb\") pod 
\"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.722970 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-dns-svc\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.723331 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-config\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.733866 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-scripts\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.734087 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-combined-ca-bundle\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.734370 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-config-data\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 
17:15:52.741078 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d52tn\" (UniqueName: \"kubernetes.io/projected/2ede1906-e96c-43de-9108-5ab8febc23d6-kube-api-access-d52tn\") pod \"placement-db-sync-rm685\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.761020 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7dns\" (UniqueName: \"kubernetes.io/projected/858c2903-f2d4-487b-89c1-0d5f29774f01-kube-api-access-r7dns\") pod \"dnsmasq-dns-9f997b9c-rb87b\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.829500 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.856842 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rm685" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.943962 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:52 crc kubenswrapper[5033]: I0226 17:15:52.944007 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:53 crc kubenswrapper[5033]: I0226 17:15:53.037218 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:53 crc kubenswrapper[5033]: I0226 17:15:53.118657 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:53 crc kubenswrapper[5033]: I0226 17:15:53.275291 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7mjt"] Feb 26 17:15:53 crc kubenswrapper[5033]: I0226 17:15:53.319081 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9f997b9c-rb87b"] Feb 26 17:15:53 crc kubenswrapper[5033]: W0226 17:15:53.422503 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ede1906_e96c_43de_9108_5ab8febc23d6.slice/crio-e7dbf0ff90eb19d2d533f5d8ac5cbe17d1aa46c6c91127264bd395483741313b WatchSource:0}: Error finding container e7dbf0ff90eb19d2d533f5d8ac5cbe17d1aa46c6c91127264bd395483741313b: Status 404 returned error can't find the container with id e7dbf0ff90eb19d2d533f5d8ac5cbe17d1aa46c6c91127264bd395483741313b Feb 26 17:15:53 crc kubenswrapper[5033]: I0226 17:15:53.431603 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rm685"] Feb 26 17:15:54 crc kubenswrapper[5033]: I0226 17:15:54.081043 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="858c2903-f2d4-487b-89c1-0d5f29774f01" containerID="4364bfb19d0b1387b2a6164f83a73a66153cfcaeab44a1769fd0a2e04bd1e79f" exitCode=0 Feb 26 17:15:54 crc kubenswrapper[5033]: I0226 17:15:54.081122 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" event={"ID":"858c2903-f2d4-487b-89c1-0d5f29774f01","Type":"ContainerDied","Data":"4364bfb19d0b1387b2a6164f83a73a66153cfcaeab44a1769fd0a2e04bd1e79f"} Feb 26 17:15:54 crc kubenswrapper[5033]: I0226 17:15:54.081565 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" event={"ID":"858c2903-f2d4-487b-89c1-0d5f29774f01","Type":"ContainerStarted","Data":"2f41d2eb080fb44f0a4c61fb3ec7c9cf69be57567582ed1901221bbebc92b1cf"} Feb 26 17:15:54 crc kubenswrapper[5033]: I0226 17:15:54.083018 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rm685" event={"ID":"2ede1906-e96c-43de-9108-5ab8febc23d6","Type":"ContainerStarted","Data":"b4ba149fa1e83b3c267ef2584833fcea66da9b6be0a7fceffd5f84c151caf79c"} Feb 26 17:15:54 crc kubenswrapper[5033]: I0226 17:15:54.083069 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rm685" event={"ID":"2ede1906-e96c-43de-9108-5ab8febc23d6","Type":"ContainerStarted","Data":"e7dbf0ff90eb19d2d533f5d8ac5cbe17d1aa46c6c91127264bd395483741313b"} Feb 26 17:15:54 crc kubenswrapper[5033]: I0226 17:15:54.133820 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-rm685" podStartSLOduration=2.133788865 podStartE2EDuration="2.133788865s" podCreationTimestamp="2026-02-26 17:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:15:54.122013937 +0000 UTC m=+5681.334612587" watchObservedRunningTime="2026-02-26 17:15:54.133788865 +0000 UTC m=+5681.346387535" Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 
17:15:55.094308 5033 generic.go:334] "Generic (PLEG): container finished" podID="2ede1906-e96c-43de-9108-5ab8febc23d6" containerID="b4ba149fa1e83b3c267ef2584833fcea66da9b6be0a7fceffd5f84c151caf79c" exitCode=0 Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.094428 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rm685" event={"ID":"2ede1906-e96c-43de-9108-5ab8febc23d6","Type":"ContainerDied","Data":"b4ba149fa1e83b3c267ef2584833fcea66da9b6be0a7fceffd5f84c151caf79c"} Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.097100 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" event={"ID":"858c2903-f2d4-487b-89c1-0d5f29774f01","Type":"ContainerStarted","Data":"507f536d72f6cbddd2b0f79605b953393afc6bd3104284c25d7b6a13bb616ea4"} Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.097235 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x7mjt" podUID="bf100af4-901f-4268-9109-a672177ea41c" containerName="registry-server" containerID="cri-o://04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb" gracePeriod=2 Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.148108 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" podStartSLOduration=3.148085945 podStartE2EDuration="3.148085945s" podCreationTimestamp="2026-02-26 17:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:15:55.139344282 +0000 UTC m=+5682.351942942" watchObservedRunningTime="2026-02-26 17:15:55.148085945 +0000 UTC m=+5682.360684605" Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.650706 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.772467 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-catalog-content\") pod \"bf100af4-901f-4268-9109-a672177ea41c\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.772888 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcglp\" (UniqueName: \"kubernetes.io/projected/bf100af4-901f-4268-9109-a672177ea41c-kube-api-access-gcglp\") pod \"bf100af4-901f-4268-9109-a672177ea41c\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.773394 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-utilities\") pod \"bf100af4-901f-4268-9109-a672177ea41c\" (UID: \"bf100af4-901f-4268-9109-a672177ea41c\") " Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.775318 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-utilities" (OuterVolumeSpecName: "utilities") pod "bf100af4-901f-4268-9109-a672177ea41c" (UID: "bf100af4-901f-4268-9109-a672177ea41c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.779214 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf100af4-901f-4268-9109-a672177ea41c-kube-api-access-gcglp" (OuterVolumeSpecName: "kube-api-access-gcglp") pod "bf100af4-901f-4268-9109-a672177ea41c" (UID: "bf100af4-901f-4268-9109-a672177ea41c"). InnerVolumeSpecName "kube-api-access-gcglp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.780575 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcglp\" (UniqueName: \"kubernetes.io/projected/bf100af4-901f-4268-9109-a672177ea41c-kube-api-access-gcglp\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.780611 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.836802 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf100af4-901f-4268-9109-a672177ea41c" (UID: "bf100af4-901f-4268-9109-a672177ea41c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:15:55 crc kubenswrapper[5033]: I0226 17:15:55.882560 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf100af4-901f-4268-9109-a672177ea41c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.108641 5033 generic.go:334] "Generic (PLEG): container finished" podID="bf100af4-901f-4268-9109-a672177ea41c" containerID="04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb" exitCode=0 Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.108714 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7mjt" event={"ID":"bf100af4-901f-4268-9109-a672177ea41c","Type":"ContainerDied","Data":"04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb"} Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.108779 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-x7mjt" event={"ID":"bf100af4-901f-4268-9109-a672177ea41c","Type":"ContainerDied","Data":"1c201a14830c48ed55b281bd4717d2608e0bebfae6e4de89ebb20ac9fc1a36dd"} Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.108802 5033 scope.go:117] "RemoveContainer" containerID="04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.108990 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.111961 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7mjt" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.145192 5033 scope.go:117] "RemoveContainer" containerID="1757f315eaaabc7b6a6d154255743248d9ed67c69f469a2fe846a15edb75be94" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.152897 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7mjt"] Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.160966 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7mjt"] Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.189301 5033 scope.go:117] "RemoveContainer" containerID="c9c0b8e6d11d8bab8b90d37b6beb53a75101c20faecfb20bae1a3c2d9b74247d" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.261081 5033 scope.go:117] "RemoveContainer" containerID="04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb" Feb 26 17:15:56 crc kubenswrapper[5033]: E0226 17:15:56.261391 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb\": container with ID starting with 04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb not 
found: ID does not exist" containerID="04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.261427 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb"} err="failed to get container status \"04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb\": rpc error: code = NotFound desc = could not find container \"04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb\": container with ID starting with 04e863951dd0ab014cfaeeec36bba2f93b1011835b3a79ca6a188e73851d8abb not found: ID does not exist" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.261451 5033 scope.go:117] "RemoveContainer" containerID="1757f315eaaabc7b6a6d154255743248d9ed67c69f469a2fe846a15edb75be94" Feb 26 17:15:56 crc kubenswrapper[5033]: E0226 17:15:56.261867 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1757f315eaaabc7b6a6d154255743248d9ed67c69f469a2fe846a15edb75be94\": container with ID starting with 1757f315eaaabc7b6a6d154255743248d9ed67c69f469a2fe846a15edb75be94 not found: ID does not exist" containerID="1757f315eaaabc7b6a6d154255743248d9ed67c69f469a2fe846a15edb75be94" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.262095 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1757f315eaaabc7b6a6d154255743248d9ed67c69f469a2fe846a15edb75be94"} err="failed to get container status \"1757f315eaaabc7b6a6d154255743248d9ed67c69f469a2fe846a15edb75be94\": rpc error: code = NotFound desc = could not find container \"1757f315eaaabc7b6a6d154255743248d9ed67c69f469a2fe846a15edb75be94\": container with ID starting with 1757f315eaaabc7b6a6d154255743248d9ed67c69f469a2fe846a15edb75be94 not found: ID does not exist" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.262394 
5033 scope.go:117] "RemoveContainer" containerID="c9c0b8e6d11d8bab8b90d37b6beb53a75101c20faecfb20bae1a3c2d9b74247d" Feb 26 17:15:56 crc kubenswrapper[5033]: E0226 17:15:56.263014 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c0b8e6d11d8bab8b90d37b6beb53a75101c20faecfb20bae1a3c2d9b74247d\": container with ID starting with c9c0b8e6d11d8bab8b90d37b6beb53a75101c20faecfb20bae1a3c2d9b74247d not found: ID does not exist" containerID="c9c0b8e6d11d8bab8b90d37b6beb53a75101c20faecfb20bae1a3c2d9b74247d" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.263111 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c0b8e6d11d8bab8b90d37b6beb53a75101c20faecfb20bae1a3c2d9b74247d"} err="failed to get container status \"c9c0b8e6d11d8bab8b90d37b6beb53a75101c20faecfb20bae1a3c2d9b74247d\": rpc error: code = NotFound desc = could not find container \"c9c0b8e6d11d8bab8b90d37b6beb53a75101c20faecfb20bae1a3c2d9b74247d\": container with ID starting with c9c0b8e6d11d8bab8b90d37b6beb53a75101c20faecfb20bae1a3c2d9b74247d not found: ID does not exist" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.451209 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rm685" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.596533 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d52tn\" (UniqueName: \"kubernetes.io/projected/2ede1906-e96c-43de-9108-5ab8febc23d6-kube-api-access-d52tn\") pod \"2ede1906-e96c-43de-9108-5ab8febc23d6\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.596749 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-config-data\") pod \"2ede1906-e96c-43de-9108-5ab8febc23d6\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.596938 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ede1906-e96c-43de-9108-5ab8febc23d6-logs\") pod \"2ede1906-e96c-43de-9108-5ab8febc23d6\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.597040 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-scripts\") pod \"2ede1906-e96c-43de-9108-5ab8febc23d6\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.597102 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-combined-ca-bundle\") pod \"2ede1906-e96c-43de-9108-5ab8febc23d6\" (UID: \"2ede1906-e96c-43de-9108-5ab8febc23d6\") " Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.597451 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2ede1906-e96c-43de-9108-5ab8febc23d6-logs" (OuterVolumeSpecName: "logs") pod "2ede1906-e96c-43de-9108-5ab8febc23d6" (UID: "2ede1906-e96c-43de-9108-5ab8febc23d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.598086 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ede1906-e96c-43de-9108-5ab8febc23d6-logs\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.603769 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-scripts" (OuterVolumeSpecName: "scripts") pod "2ede1906-e96c-43de-9108-5ab8febc23d6" (UID: "2ede1906-e96c-43de-9108-5ab8febc23d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.603830 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ede1906-e96c-43de-9108-5ab8febc23d6-kube-api-access-d52tn" (OuterVolumeSpecName: "kube-api-access-d52tn") pod "2ede1906-e96c-43de-9108-5ab8febc23d6" (UID: "2ede1906-e96c-43de-9108-5ab8febc23d6"). InnerVolumeSpecName "kube-api-access-d52tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.629511 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-config-data" (OuterVolumeSpecName: "config-data") pod "2ede1906-e96c-43de-9108-5ab8febc23d6" (UID: "2ede1906-e96c-43de-9108-5ab8febc23d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.641370 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ede1906-e96c-43de-9108-5ab8febc23d6" (UID: "2ede1906-e96c-43de-9108-5ab8febc23d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.700516 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.700584 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.700615 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d52tn\" (UniqueName: \"kubernetes.io/projected/2ede1906-e96c-43de-9108-5ab8febc23d6-kube-api-access-d52tn\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.700641 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ede1906-e96c-43de-9108-5ab8febc23d6-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.784818 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-788bb6fb9b-t7f5z"] Feb 26 17:15:56 crc kubenswrapper[5033]: E0226 17:15:56.785164 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf100af4-901f-4268-9109-a672177ea41c" containerName="registry-server" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.785185 5033 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bf100af4-901f-4268-9109-a672177ea41c" containerName="registry-server" Feb 26 17:15:56 crc kubenswrapper[5033]: E0226 17:15:56.785195 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ede1906-e96c-43de-9108-5ab8febc23d6" containerName="placement-db-sync" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.785200 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ede1906-e96c-43de-9108-5ab8febc23d6" containerName="placement-db-sync" Feb 26 17:15:56 crc kubenswrapper[5033]: E0226 17:15:56.785218 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf100af4-901f-4268-9109-a672177ea41c" containerName="extract-content" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.785227 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf100af4-901f-4268-9109-a672177ea41c" containerName="extract-content" Feb 26 17:15:56 crc kubenswrapper[5033]: E0226 17:15:56.785252 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf100af4-901f-4268-9109-a672177ea41c" containerName="extract-utilities" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.785274 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf100af4-901f-4268-9109-a672177ea41c" containerName="extract-utilities" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.785437 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ede1906-e96c-43de-9108-5ab8febc23d6" containerName="placement-db-sync" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.785461 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf100af4-901f-4268-9109-a672177ea41c" containerName="registry-server" Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.786390 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.799792 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-788bb6fb9b-t7f5z"]
Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.904424 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td762\" (UniqueName: \"kubernetes.io/projected/962b99a2-813c-4ab4-95ed-1aae526764b1-kube-api-access-td762\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.904494 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962b99a2-813c-4ab4-95ed-1aae526764b1-combined-ca-bundle\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.904577 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962b99a2-813c-4ab4-95ed-1aae526764b1-scripts\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.904635 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962b99a2-813c-4ab4-95ed-1aae526764b1-logs\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:56 crc kubenswrapper[5033]: I0226 17:15:56.904693 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962b99a2-813c-4ab4-95ed-1aae526764b1-config-data\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.006401 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td762\" (UniqueName: \"kubernetes.io/projected/962b99a2-813c-4ab4-95ed-1aae526764b1-kube-api-access-td762\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.006483 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962b99a2-813c-4ab4-95ed-1aae526764b1-combined-ca-bundle\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.006529 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962b99a2-813c-4ab4-95ed-1aae526764b1-scripts\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.006619 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962b99a2-813c-4ab4-95ed-1aae526764b1-logs\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.007310 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962b99a2-813c-4ab4-95ed-1aae526764b1-logs\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.007482 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962b99a2-813c-4ab4-95ed-1aae526764b1-config-data\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.009977 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962b99a2-813c-4ab4-95ed-1aae526764b1-scripts\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.010317 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962b99a2-813c-4ab4-95ed-1aae526764b1-combined-ca-bundle\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.011013 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962b99a2-813c-4ab4-95ed-1aae526764b1-config-data\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.022711 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td762\" (UniqueName: \"kubernetes.io/projected/962b99a2-813c-4ab4-95ed-1aae526764b1-kube-api-access-td762\") pod \"placement-788bb6fb9b-t7f5z\" (UID: \"962b99a2-813c-4ab4-95ed-1aae526764b1\") " pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.109361 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.119427 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rm685"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.119450 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rm685" event={"ID":"2ede1906-e96c-43de-9108-5ab8febc23d6","Type":"ContainerDied","Data":"e7dbf0ff90eb19d2d533f5d8ac5cbe17d1aa46c6c91127264bd395483741313b"}
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.119492 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7dbf0ff90eb19d2d533f5d8ac5cbe17d1aa46c6c91127264bd395483741313b"
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.352532 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-788bb6fb9b-t7f5z"]
Feb 26 17:15:57 crc kubenswrapper[5033]: I0226 17:15:57.644377 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf100af4-901f-4268-9109-a672177ea41c" path="/var/lib/kubelet/pods/bf100af4-901f-4268-9109-a672177ea41c/volumes"
Feb 26 17:15:58 crc kubenswrapper[5033]: I0226 17:15:58.134419 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788bb6fb9b-t7f5z" event={"ID":"962b99a2-813c-4ab4-95ed-1aae526764b1","Type":"ContainerStarted","Data":"201375ffadbb209636084ae86069d63b10ce0fe3257f189b81e141bf4a4c06b8"}
Feb 26 17:15:58 crc kubenswrapper[5033]: I0226 17:15:58.134488 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788bb6fb9b-t7f5z" event={"ID":"962b99a2-813c-4ab4-95ed-1aae526764b1","Type":"ContainerStarted","Data":"f65b292a25b7d9d7295f0d78c5f26730630bad5938e19035f340fb18fd3040bd"}
Feb 26 17:15:58 crc kubenswrapper[5033]: I0226 17:15:58.134519 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-788bb6fb9b-t7f5z" event={"ID":"962b99a2-813c-4ab4-95ed-1aae526764b1","Type":"ContainerStarted","Data":"67f7354bc3230a47a8a1b49cfbbe450e0731ca9e77a465b5213791a4846defa8"}
Feb 26 17:15:58 crc kubenswrapper[5033]: I0226 17:15:58.136368 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:15:58 crc kubenswrapper[5033]: I0226 17:15:58.136414 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:16:00 crc kubenswrapper[5033]: I0226 17:16:00.136493 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-788bb6fb9b-t7f5z" podStartSLOduration=4.136435876 podStartE2EDuration="4.136435876s" podCreationTimestamp="2026-02-26 17:15:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:15:58.159527937 +0000 UTC m=+5685.372126637" watchObservedRunningTime="2026-02-26 17:16:00.136435876 +0000 UTC m=+5687.349034536"
Feb 26 17:16:00 crc kubenswrapper[5033]: I0226 17:16:00.142992 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535436-r62zf"]
Feb 26 17:16:00 crc kubenswrapper[5033]: I0226 17:16:00.144302 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535436-r62zf"
Feb 26 17:16:00 crc kubenswrapper[5033]: I0226 17:16:00.153665 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535436-r62zf"]
Feb 26 17:16:00 crc kubenswrapper[5033]: I0226 17:16:00.173240 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 17:16:00 crc kubenswrapper[5033]: I0226 17:16:00.173737 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 17:16:00 crc kubenswrapper[5033]: I0226 17:16:00.174092 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 17:16:00 crc kubenswrapper[5033]: I0226 17:16:00.275104 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfpwc\" (UniqueName: \"kubernetes.io/projected/15c4daae-b1b6-48ff-be61-eb705aeb62f1-kube-api-access-nfpwc\") pod \"auto-csr-approver-29535436-r62zf\" (UID: \"15c4daae-b1b6-48ff-be61-eb705aeb62f1\") " pod="openshift-infra/auto-csr-approver-29535436-r62zf"
Feb 26 17:16:00 crc kubenswrapper[5033]: I0226 17:16:00.377136 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfpwc\" (UniqueName: \"kubernetes.io/projected/15c4daae-b1b6-48ff-be61-eb705aeb62f1-kube-api-access-nfpwc\") pod \"auto-csr-approver-29535436-r62zf\" (UID: \"15c4daae-b1b6-48ff-be61-eb705aeb62f1\") " pod="openshift-infra/auto-csr-approver-29535436-r62zf"
Feb 26 17:16:00 crc kubenswrapper[5033]: I0226 17:16:00.396955 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfpwc\" (UniqueName: \"kubernetes.io/projected/15c4daae-b1b6-48ff-be61-eb705aeb62f1-kube-api-access-nfpwc\") pod \"auto-csr-approver-29535436-r62zf\" (UID: \"15c4daae-b1b6-48ff-be61-eb705aeb62f1\") " pod="openshift-infra/auto-csr-approver-29535436-r62zf"
Feb 26 17:16:00 crc kubenswrapper[5033]: I0226 17:16:00.512554 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535436-r62zf"
Feb 26 17:16:01 crc kubenswrapper[5033]: I0226 17:16:01.007839 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535436-r62zf"]
Feb 26 17:16:01 crc kubenswrapper[5033]: I0226 17:16:01.018390 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 17:16:01 crc kubenswrapper[5033]: I0226 17:16:01.187978 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535436-r62zf" event={"ID":"15c4daae-b1b6-48ff-be61-eb705aeb62f1","Type":"ContainerStarted","Data":"679a1e9b8cabb98250b804bdf59761cec6506c34cfafe609879256c0f574303d"}
Feb 26 17:16:02 crc kubenswrapper[5033]: I0226 17:16:02.831332 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9f997b9c-rb87b"
Feb 26 17:16:02 crc kubenswrapper[5033]: I0226 17:16:02.892672 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75f5f55c85-4gs2z"]
Feb 26 17:16:02 crc kubenswrapper[5033]: I0226 17:16:02.893003 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z" podUID="5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" containerName="dnsmasq-dns" containerID="cri-o://fb7041a3ebeba7ccfda8b2483193ceb9cc30b7ec1e410e551260a8bd38c3f2cb" gracePeriod=10
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.213356 5033 generic.go:334] "Generic (PLEG): container finished" podID="15c4daae-b1b6-48ff-be61-eb705aeb62f1" containerID="9eb978d195a8efe9c8bc667da43718d291d885177c9aab013f49e4fbc1a4c8e5" exitCode=0
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.213423 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535436-r62zf" event={"ID":"15c4daae-b1b6-48ff-be61-eb705aeb62f1","Type":"ContainerDied","Data":"9eb978d195a8efe9c8bc667da43718d291d885177c9aab013f49e4fbc1a4c8e5"}
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.219796 5033 generic.go:334] "Generic (PLEG): container finished" podID="5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" containerID="fb7041a3ebeba7ccfda8b2483193ceb9cc30b7ec1e410e551260a8bd38c3f2cb" exitCode=0
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.219844 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z" event={"ID":"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403","Type":"ContainerDied","Data":"fb7041a3ebeba7ccfda8b2483193ceb9cc30b7ec1e410e551260a8bd38c3f2cb"}
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.409358 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.537687 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-config\") pod \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") "
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.537827 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-sb\") pod \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") "
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.537879 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n64v\" (UniqueName: \"kubernetes.io/projected/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-kube-api-access-9n64v\") pod \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") "
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.537918 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-nb\") pod \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") "
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.537974 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-dns-svc\") pod \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\" (UID: \"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403\") "
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.544379 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-kube-api-access-9n64v" (OuterVolumeSpecName: "kube-api-access-9n64v") pod "5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" (UID: "5bed4dc5-a9fe-47c4-9aa6-89c4d5840403"). InnerVolumeSpecName "kube-api-access-9n64v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.589219 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" (UID: "5bed4dc5-a9fe-47c4-9aa6-89c4d5840403"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.595796 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" (UID: "5bed4dc5-a9fe-47c4-9aa6-89c4d5840403"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.606678 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-config" (OuterVolumeSpecName: "config") pod "5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" (UID: "5bed4dc5-a9fe-47c4-9aa6-89c4d5840403"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.610236 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" (UID: "5bed4dc5-a9fe-47c4-9aa6-89c4d5840403"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.639918 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.639945 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-config\") on node \"crc\" DevicePath \"\""
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.639954 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.639965 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n64v\" (UniqueName: \"kubernetes.io/projected/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-kube-api-access-9n64v\") on node \"crc\" DevicePath \"\""
Feb 26 17:16:03 crc kubenswrapper[5033]: I0226 17:16:03.639974 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 26 17:16:04 crc kubenswrapper[5033]: I0226 17:16:04.244893 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z" event={"ID":"5bed4dc5-a9fe-47c4-9aa6-89c4d5840403","Type":"ContainerDied","Data":"ad3be7cabc74bfe98fa2c9650fed7d763ba4909c94ce09a7b177b114ef2a8b1a"}
Feb 26 17:16:04 crc kubenswrapper[5033]: I0226 17:16:04.245173 5033 scope.go:117] "RemoveContainer" containerID="fb7041a3ebeba7ccfda8b2483193ceb9cc30b7ec1e410e551260a8bd38c3f2cb"
Feb 26 17:16:04 crc kubenswrapper[5033]: I0226 17:16:04.244940 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f5f55c85-4gs2z"
Feb 26 17:16:04 crc kubenswrapper[5033]: I0226 17:16:04.275174 5033 scope.go:117] "RemoveContainer" containerID="fa8abaf51a660c845e618f702f1ca52fb31e097a6b3b75a0f143e05b36e051ec"
Feb 26 17:16:04 crc kubenswrapper[5033]: I0226 17:16:04.281582 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75f5f55c85-4gs2z"]
Feb 26 17:16:04 crc kubenswrapper[5033]: I0226 17:16:04.289752 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75f5f55c85-4gs2z"]
Feb 26 17:16:04 crc kubenswrapper[5033]: I0226 17:16:04.567588 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535436-r62zf"
Feb 26 17:16:04 crc kubenswrapper[5033]: I0226 17:16:04.656823 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfpwc\" (UniqueName: \"kubernetes.io/projected/15c4daae-b1b6-48ff-be61-eb705aeb62f1-kube-api-access-nfpwc\") pod \"15c4daae-b1b6-48ff-be61-eb705aeb62f1\" (UID: \"15c4daae-b1b6-48ff-be61-eb705aeb62f1\") "
Feb 26 17:16:04 crc kubenswrapper[5033]: I0226 17:16:04.661205 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15c4daae-b1b6-48ff-be61-eb705aeb62f1-kube-api-access-nfpwc" (OuterVolumeSpecName: "kube-api-access-nfpwc") pod "15c4daae-b1b6-48ff-be61-eb705aeb62f1" (UID: "15c4daae-b1b6-48ff-be61-eb705aeb62f1"). InnerVolumeSpecName "kube-api-access-nfpwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:16:04 crc kubenswrapper[5033]: I0226 17:16:04.759966 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfpwc\" (UniqueName: \"kubernetes.io/projected/15c4daae-b1b6-48ff-be61-eb705aeb62f1-kube-api-access-nfpwc\") on node \"crc\" DevicePath \"\""
Feb 26 17:16:05 crc kubenswrapper[5033]: I0226 17:16:05.256356 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535436-r62zf"
Feb 26 17:16:05 crc kubenswrapper[5033]: I0226 17:16:05.256376 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535436-r62zf" event={"ID":"15c4daae-b1b6-48ff-be61-eb705aeb62f1","Type":"ContainerDied","Data":"679a1e9b8cabb98250b804bdf59761cec6506c34cfafe609879256c0f574303d"}
Feb 26 17:16:05 crc kubenswrapper[5033]: I0226 17:16:05.256988 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="679a1e9b8cabb98250b804bdf59761cec6506c34cfafe609879256c0f574303d"
Feb 26 17:16:05 crc kubenswrapper[5033]: I0226 17:16:05.647663 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" path="/var/lib/kubelet/pods/5bed4dc5-a9fe-47c4-9aa6-89c4d5840403/volumes"
Feb 26 17:16:05 crc kubenswrapper[5033]: I0226 17:16:05.652579 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535430-bpg4t"]
Feb 26 17:16:05 crc kubenswrapper[5033]: I0226 17:16:05.661876 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535430-bpg4t"]
Feb 26 17:16:07 crc kubenswrapper[5033]: I0226 17:16:07.642793 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651e60e1-bc9f-4872-99e2-26dc39415a1d" path="/var/lib/kubelet/pods/651e60e1-bc9f-4872-99e2-26dc39415a1d/volumes"
Feb 26 17:16:20 crc kubenswrapper[5033]: I0226 17:16:20.189204 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 17:16:20 crc kubenswrapper[5033]: I0226 17:16:20.189947 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 17:16:28 crc kubenswrapper[5033]: I0226 17:16:28.134818 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:16:28 crc kubenswrapper[5033]: I0226 17:16:28.142643 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-788bb6fb9b-t7f5z"
Feb 26 17:16:34 crc kubenswrapper[5033]: I0226 17:16:34.076035 5033 scope.go:117] "RemoveContainer" containerID="6fa9bf29231196d557d678cec29c4291befd76f4de98236c5d6a8103238724a6"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.402163 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-fqlpb"]
Feb 26 17:16:49 crc kubenswrapper[5033]: E0226 17:16:49.403125 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" containerName="init"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.403143 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" containerName="init"
Feb 26 17:16:49 crc kubenswrapper[5033]: E0226 17:16:49.403162 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15c4daae-b1b6-48ff-be61-eb705aeb62f1" containerName="oc"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.403169 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="15c4daae-b1b6-48ff-be61-eb705aeb62f1" containerName="oc"
Feb 26 17:16:49 crc kubenswrapper[5033]: E0226 17:16:49.403190 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" containerName="dnsmasq-dns"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.403200 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" containerName="dnsmasq-dns"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.403410 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="15c4daae-b1b6-48ff-be61-eb705aeb62f1" containerName="oc"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.403431 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bed4dc5-a9fe-47c4-9aa6-89c4d5840403" containerName="dnsmasq-dns"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.404171 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fqlpb"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.411636 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fqlpb"]
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.499008 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-7qh6c"]
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.500009 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7qh6c"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.509939 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7qh6c"]
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.556758 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-operator-scripts\") pod \"nova-api-db-create-fqlpb\" (UID: \"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c\") " pod="openstack/nova-api-db-create-fqlpb"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.556801 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n92km\" (UniqueName: \"kubernetes.io/projected/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-kube-api-access-n92km\") pod \"nova-api-db-create-fqlpb\" (UID: \"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c\") " pod="openstack/nova-api-db-create-fqlpb"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.604038 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-09eb-account-create-update-hhcv8"]
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.605316 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-09eb-account-create-update-hhcv8"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.607321 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.658067 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-09eb-account-create-update-hhcv8"]
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.658685 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-operator-scripts\") pod \"nova-api-db-create-fqlpb\" (UID: \"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c\") " pod="openstack/nova-api-db-create-fqlpb"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.658748 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n92km\" (UniqueName: \"kubernetes.io/projected/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-kube-api-access-n92km\") pod \"nova-api-db-create-fqlpb\" (UID: \"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c\") " pod="openstack/nova-api-db-create-fqlpb"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.658790 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/019ce90c-a2fb-4c62-90e1-13a61c3190f8-operator-scripts\") pod \"nova-cell0-db-create-7qh6c\" (UID: \"019ce90c-a2fb-4c62-90e1-13a61c3190f8\") " pod="openstack/nova-cell0-db-create-7qh6c"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.658902 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwjn6\" (UniqueName: \"kubernetes.io/projected/019ce90c-a2fb-4c62-90e1-13a61c3190f8-kube-api-access-gwjn6\") pod \"nova-cell0-db-create-7qh6c\" (UID: \"019ce90c-a2fb-4c62-90e1-13a61c3190f8\") " pod="openstack/nova-cell0-db-create-7qh6c"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.659664 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-operator-scripts\") pod \"nova-api-db-create-fqlpb\" (UID: \"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c\") " pod="openstack/nova-api-db-create-fqlpb"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.707321 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-gc7xz"]
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.709009 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gc7xz"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.710316 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n92km\" (UniqueName: \"kubernetes.io/projected/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-kube-api-access-n92km\") pod \"nova-api-db-create-fqlpb\" (UID: \"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c\") " pod="openstack/nova-api-db-create-fqlpb"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.716735 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gc7xz"]
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.725599 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fqlpb"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.760794 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/019ce90c-a2fb-4c62-90e1-13a61c3190f8-operator-scripts\") pod \"nova-cell0-db-create-7qh6c\" (UID: \"019ce90c-a2fb-4c62-90e1-13a61c3190f8\") " pod="openstack/nova-cell0-db-create-7qh6c"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.760885 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7157e0-f691-44e5-b73a-8cb559993892-operator-scripts\") pod \"nova-api-09eb-account-create-update-hhcv8\" (UID: \"7c7157e0-f691-44e5-b73a-8cb559993892\") " pod="openstack/nova-api-09eb-account-create-update-hhcv8"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.760950 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdbq5\" (UniqueName: \"kubernetes.io/projected/7c7157e0-f691-44e5-b73a-8cb559993892-kube-api-access-cdbq5\") pod \"nova-api-09eb-account-create-update-hhcv8\" (UID: \"7c7157e0-f691-44e5-b73a-8cb559993892\") " pod="openstack/nova-api-09eb-account-create-update-hhcv8"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.761028 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwjn6\" (UniqueName: \"kubernetes.io/projected/019ce90c-a2fb-4c62-90e1-13a61c3190f8-kube-api-access-gwjn6\") pod \"nova-cell0-db-create-7qh6c\" (UID: \"019ce90c-a2fb-4c62-90e1-13a61c3190f8\") " pod="openstack/nova-cell0-db-create-7qh6c"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.763139 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/019ce90c-a2fb-4c62-90e1-13a61c3190f8-operator-scripts\") pod \"nova-cell0-db-create-7qh6c\" (UID: \"019ce90c-a2fb-4c62-90e1-13a61c3190f8\") " pod="openstack/nova-cell0-db-create-7qh6c"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.780011 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwjn6\" (UniqueName: \"kubernetes.io/projected/019ce90c-a2fb-4c62-90e1-13a61c3190f8-kube-api-access-gwjn6\") pod \"nova-cell0-db-create-7qh6c\" (UID: \"019ce90c-a2fb-4c62-90e1-13a61c3190f8\") " pod="openstack/nova-cell0-db-create-7qh6c"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.818325 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7qh6c"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.826717 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a96f-account-create-update-hxbdp"]
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.829426 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a96f-account-create-update-hxbdp"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.833890 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.835399 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a96f-account-create-update-hxbdp"]
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.862822 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7157e0-f691-44e5-b73a-8cb559993892-operator-scripts\") pod \"nova-api-09eb-account-create-update-hhcv8\" (UID: \"7c7157e0-f691-44e5-b73a-8cb559993892\") " pod="openstack/nova-api-09eb-account-create-update-hhcv8"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.862888 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdbq5\" (UniqueName: \"kubernetes.io/projected/7c7157e0-f691-44e5-b73a-8cb559993892-kube-api-access-cdbq5\") pod \"nova-api-09eb-account-create-update-hhcv8\" (UID: \"7c7157e0-f691-44e5-b73a-8cb559993892\") " pod="openstack/nova-api-09eb-account-create-update-hhcv8"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.862931 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk8mq\" (UniqueName: \"kubernetes.io/projected/b00dae15-cf05-496e-8291-97c26f4a5cf4-kube-api-access-lk8mq\") pod \"nova-cell1-db-create-gc7xz\" (UID: \"b00dae15-cf05-496e-8291-97c26f4a5cf4\") " pod="openstack/nova-cell1-db-create-gc7xz"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.862955 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00dae15-cf05-496e-8291-97c26f4a5cf4-operator-scripts\") pod \"nova-cell1-db-create-gc7xz\" (UID: \"b00dae15-cf05-496e-8291-97c26f4a5cf4\") " pod="openstack/nova-cell1-db-create-gc7xz"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.863805 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7157e0-f691-44e5-b73a-8cb559993892-operator-scripts\") pod \"nova-api-09eb-account-create-update-hhcv8\" (UID: \"7c7157e0-f691-44e5-b73a-8cb559993892\") " pod="openstack/nova-api-09eb-account-create-update-hhcv8"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.883958 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdbq5\" (UniqueName: \"kubernetes.io/projected/7c7157e0-f691-44e5-b73a-8cb559993892-kube-api-access-cdbq5\") pod \"nova-api-09eb-account-create-update-hhcv8\" (UID: \"7c7157e0-f691-44e5-b73a-8cb559993892\") " pod="openstack/nova-api-09eb-account-create-update-hhcv8"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.949313 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-09eb-account-create-update-hhcv8"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.964653 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc56c\" (UniqueName: \"kubernetes.io/projected/e3a20a7d-c65b-4934-b634-bcda152fd8a8-kube-api-access-jc56c\") pod \"nova-cell0-a96f-account-create-update-hxbdp\" (UID: \"e3a20a7d-c65b-4934-b634-bcda152fd8a8\") " pod="openstack/nova-cell0-a96f-account-create-update-hxbdp"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.964709 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk8mq\" (UniqueName: \"kubernetes.io/projected/b00dae15-cf05-496e-8291-97c26f4a5cf4-kube-api-access-lk8mq\") pod \"nova-cell1-db-create-gc7xz\" (UID: \"b00dae15-cf05-496e-8291-97c26f4a5cf4\") " pod="openstack/nova-cell1-db-create-gc7xz"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.964742 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00dae15-cf05-496e-8291-97c26f4a5cf4-operator-scripts\") pod \"nova-cell1-db-create-gc7xz\" (UID: \"b00dae15-cf05-496e-8291-97c26f4a5cf4\") " pod="openstack/nova-cell1-db-create-gc7xz"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.964871 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a20a7d-c65b-4934-b634-bcda152fd8a8-operator-scripts\") pod \"nova-cell0-a96f-account-create-update-hxbdp\" (UID: \"e3a20a7d-c65b-4934-b634-bcda152fd8a8\") " pod="openstack/nova-cell0-a96f-account-create-update-hxbdp"
Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.965565 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\"
(UniqueName: \"kubernetes.io/configmap/b00dae15-cf05-496e-8291-97c26f4a5cf4-operator-scripts\") pod \"nova-cell1-db-create-gc7xz\" (UID: \"b00dae15-cf05-496e-8291-97c26f4a5cf4\") " pod="openstack/nova-cell1-db-create-gc7xz" Feb 26 17:16:49 crc kubenswrapper[5033]: I0226 17:16:49.995467 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk8mq\" (UniqueName: \"kubernetes.io/projected/b00dae15-cf05-496e-8291-97c26f4a5cf4-kube-api-access-lk8mq\") pod \"nova-cell1-db-create-gc7xz\" (UID: \"b00dae15-cf05-496e-8291-97c26f4a5cf4\") " pod="openstack/nova-cell1-db-create-gc7xz" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.030004 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1590-account-create-update-nq5gt"] Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.031548 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1590-account-create-update-nq5gt" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.035665 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.043817 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1590-account-create-update-nq5gt"] Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.065971 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a20a7d-c65b-4934-b634-bcda152fd8a8-operator-scripts\") pod \"nova-cell0-a96f-account-create-update-hxbdp\" (UID: \"e3a20a7d-c65b-4934-b634-bcda152fd8a8\") " pod="openstack/nova-cell0-a96f-account-create-update-hxbdp" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.066066 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc56c\" (UniqueName: 
\"kubernetes.io/projected/e3a20a7d-c65b-4934-b634-bcda152fd8a8-kube-api-access-jc56c\") pod \"nova-cell0-a96f-account-create-update-hxbdp\" (UID: \"e3a20a7d-c65b-4934-b634-bcda152fd8a8\") " pod="openstack/nova-cell0-a96f-account-create-update-hxbdp" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.066661 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a20a7d-c65b-4934-b634-bcda152fd8a8-operator-scripts\") pod \"nova-cell0-a96f-account-create-update-hxbdp\" (UID: \"e3a20a7d-c65b-4934-b634-bcda152fd8a8\") " pod="openstack/nova-cell0-a96f-account-create-update-hxbdp" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.084415 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc56c\" (UniqueName: \"kubernetes.io/projected/e3a20a7d-c65b-4934-b634-bcda152fd8a8-kube-api-access-jc56c\") pod \"nova-cell0-a96f-account-create-update-hxbdp\" (UID: \"e3a20a7d-c65b-4934-b634-bcda152fd8a8\") " pod="openstack/nova-cell0-a96f-account-create-update-hxbdp" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.123146 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gc7xz" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.147438 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a96f-account-create-update-hxbdp" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.167055 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ltmw\" (UniqueName: \"kubernetes.io/projected/7bc806cf-c9e4-43dc-895a-5b5186d1416c-kube-api-access-9ltmw\") pod \"nova-cell1-1590-account-create-update-nq5gt\" (UID: \"7bc806cf-c9e4-43dc-895a-5b5186d1416c\") " pod="openstack/nova-cell1-1590-account-create-update-nq5gt" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.167187 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc806cf-c9e4-43dc-895a-5b5186d1416c-operator-scripts\") pod \"nova-cell1-1590-account-create-update-nq5gt\" (UID: \"7bc806cf-c9e4-43dc-895a-5b5186d1416c\") " pod="openstack/nova-cell1-1590-account-create-update-nq5gt" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.179396 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fqlpb"] Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.188642 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.188691 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.188729 5033 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.189346 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.189394 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" gracePeriod=600 Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.269297 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc806cf-c9e4-43dc-895a-5b5186d1416c-operator-scripts\") pod \"nova-cell1-1590-account-create-update-nq5gt\" (UID: \"7bc806cf-c9e4-43dc-895a-5b5186d1416c\") " pod="openstack/nova-cell1-1590-account-create-update-nq5gt" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.269483 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ltmw\" (UniqueName: \"kubernetes.io/projected/7bc806cf-c9e4-43dc-895a-5b5186d1416c-kube-api-access-9ltmw\") pod \"nova-cell1-1590-account-create-update-nq5gt\" (UID: \"7bc806cf-c9e4-43dc-895a-5b5186d1416c\") " pod="openstack/nova-cell1-1590-account-create-update-nq5gt" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.270932 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7bc806cf-c9e4-43dc-895a-5b5186d1416c-operator-scripts\") pod \"nova-cell1-1590-account-create-update-nq5gt\" (UID: \"7bc806cf-c9e4-43dc-895a-5b5186d1416c\") " pod="openstack/nova-cell1-1590-account-create-update-nq5gt" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.289223 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ltmw\" (UniqueName: \"kubernetes.io/projected/7bc806cf-c9e4-43dc-895a-5b5186d1416c-kube-api-access-9ltmw\") pod \"nova-cell1-1590-account-create-update-nq5gt\" (UID: \"7bc806cf-c9e4-43dc-895a-5b5186d1416c\") " pod="openstack/nova-cell1-1590-account-create-update-nq5gt" Feb 26 17:16:50 crc kubenswrapper[5033]: E0226 17:16:50.320730 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.366139 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1590-account-create-update-nq5gt" Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.385688 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7qh6c"] Feb 26 17:16:50 crc kubenswrapper[5033]: W0226 17:16:50.392947 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod019ce90c_a2fb_4c62_90e1_13a61c3190f8.slice/crio-1199df72908b33cf49af94b2ba3c9c88ecf42c6e7eb4f44bbc5e93b61c92177a WatchSource:0}: Error finding container 1199df72908b33cf49af94b2ba3c9c88ecf42c6e7eb4f44bbc5e93b61c92177a: Status 404 returned error can't find the container with id 1199df72908b33cf49af94b2ba3c9c88ecf42c6e7eb4f44bbc5e93b61c92177a Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.530240 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-09eb-account-create-update-hhcv8"] Feb 26 17:16:50 crc kubenswrapper[5033]: W0226 17:16:50.539339 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c7157e0_f691_44e5_b73a_8cb559993892.slice/crio-769046c810a9411a54d8b200f2482d2123a6c4dc7027413a94cdb0bb0ee94658 WatchSource:0}: Error finding container 769046c810a9411a54d8b200f2482d2123a6c4dc7027413a94cdb0bb0ee94658: Status 404 returned error can't find the container with id 769046c810a9411a54d8b200f2482d2123a6c4dc7027413a94cdb0bb0ee94658 Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.632237 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gc7xz"] Feb 26 17:16:50 crc kubenswrapper[5033]: W0226 17:16:50.651464 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb00dae15_cf05_496e_8291_97c26f4a5cf4.slice/crio-9720c7e741bf0c6515ffb269b8fa08369e37bdff1a6f183367bd63e4cb0b7b72 WatchSource:0}: 
Error finding container 9720c7e741bf0c6515ffb269b8fa08369e37bdff1a6f183367bd63e4cb0b7b72: Status 404 returned error can't find the container with id 9720c7e741bf0c6515ffb269b8fa08369e37bdff1a6f183367bd63e4cb0b7b72 Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.732998 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a96f-account-create-update-hxbdp"] Feb 26 17:16:50 crc kubenswrapper[5033]: W0226 17:16:50.734563 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3a20a7d_c65b_4934_b634_bcda152fd8a8.slice/crio-86c28870a667f9982b6c7ed55af5a5f7795791a06ff74f3f579836c3ba8ae2c8 WatchSource:0}: Error finding container 86c28870a667f9982b6c7ed55af5a5f7795791a06ff74f3f579836c3ba8ae2c8: Status 404 returned error can't find the container with id 86c28870a667f9982b6c7ed55af5a5f7795791a06ff74f3f579836c3ba8ae2c8 Feb 26 17:16:50 crc kubenswrapper[5033]: I0226 17:16:50.845723 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1590-account-create-update-nq5gt"] Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.091784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a96f-account-create-update-hxbdp" event={"ID":"e3a20a7d-c65b-4934-b634-bcda152fd8a8","Type":"ContainerStarted","Data":"86c28870a667f9982b6c7ed55af5a5f7795791a06ff74f3f579836c3ba8ae2c8"} Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.093185 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1590-account-create-update-nq5gt" event={"ID":"7bc806cf-c9e4-43dc-895a-5b5186d1416c","Type":"ContainerStarted","Data":"d52ee73f28dec4802a73fd5bb22fd5ef66e02a8d94ec77abd8720b4b33e0eb8a"} Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.095437 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" 
containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" exitCode=0 Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.095504 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"} Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.095547 5033 scope.go:117] "RemoveContainer" containerID="e12b311251615971bfc352b62c31819aa1843ba8a719abffbebfc499b24d9582" Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.096138 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:16:51 crc kubenswrapper[5033]: E0226 17:16:51.096614 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.098566 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gc7xz" event={"ID":"b00dae15-cf05-496e-8291-97c26f4a5cf4","Type":"ContainerStarted","Data":"9720c7e741bf0c6515ffb269b8fa08369e37bdff1a6f183367bd63e4cb0b7b72"} Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.101473 5033 generic.go:334] "Generic (PLEG): container finished" podID="019ce90c-a2fb-4c62-90e1-13a61c3190f8" containerID="339f92ee7719eabf170ca21f54f7f870fc5691757cc318ef5ab96b55b00830bd" exitCode=0 Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.101542 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-7qh6c" event={"ID":"019ce90c-a2fb-4c62-90e1-13a61c3190f8","Type":"ContainerDied","Data":"339f92ee7719eabf170ca21f54f7f870fc5691757cc318ef5ab96b55b00830bd"} Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.101564 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7qh6c" event={"ID":"019ce90c-a2fb-4c62-90e1-13a61c3190f8","Type":"ContainerStarted","Data":"1199df72908b33cf49af94b2ba3c9c88ecf42c6e7eb4f44bbc5e93b61c92177a"} Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.104065 5033 generic.go:334] "Generic (PLEG): container finished" podID="0e31358f-1ef9-46c9-9eaf-209f8a78cd9c" containerID="65cce5d38065cf5dc18de96156a1429f20284fa58c9b0f4e94f491081c4a13cb" exitCode=0 Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.104134 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fqlpb" event={"ID":"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c","Type":"ContainerDied","Data":"65cce5d38065cf5dc18de96156a1429f20284fa58c9b0f4e94f491081c4a13cb"} Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.104176 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fqlpb" event={"ID":"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c","Type":"ContainerStarted","Data":"cf145df3fe1ffe4664f6cb03c50c9aee1c190bbe4afab880e9df48ebe4e0fcbc"} Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.106484 5033 generic.go:334] "Generic (PLEG): container finished" podID="7c7157e0-f691-44e5-b73a-8cb559993892" containerID="bd4b0025e0b84c4590ee98fa8ebfc1c039c604256b6e8bd6218c205201d378a4" exitCode=0 Feb 26 17:16:51 crc kubenswrapper[5033]: I0226 17:16:51.106526 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-09eb-account-create-update-hhcv8" event={"ID":"7c7157e0-f691-44e5-b73a-8cb559993892","Type":"ContainerDied","Data":"bd4b0025e0b84c4590ee98fa8ebfc1c039c604256b6e8bd6218c205201d378a4"} Feb 26 17:16:51 crc 
kubenswrapper[5033]: I0226 17:16:51.106544 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-09eb-account-create-update-hhcv8" event={"ID":"7c7157e0-f691-44e5-b73a-8cb559993892","Type":"ContainerStarted","Data":"769046c810a9411a54d8b200f2482d2123a6c4dc7027413a94cdb0bb0ee94658"} Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.115483 5033 generic.go:334] "Generic (PLEG): container finished" podID="7bc806cf-c9e4-43dc-895a-5b5186d1416c" containerID="929383a61ee3b828490a46339202a4f4c46ead1b76b939f224fa9394b4fb1136" exitCode=0 Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.115572 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1590-account-create-update-nq5gt" event={"ID":"7bc806cf-c9e4-43dc-895a-5b5186d1416c","Type":"ContainerDied","Data":"929383a61ee3b828490a46339202a4f4c46ead1b76b939f224fa9394b4fb1136"} Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.122799 5033 generic.go:334] "Generic (PLEG): container finished" podID="b00dae15-cf05-496e-8291-97c26f4a5cf4" containerID="6e454efcce2322cb2b27e2958b65d5c5cec1a31d09460fc3eb401c7669e77566" exitCode=0 Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.122875 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gc7xz" event={"ID":"b00dae15-cf05-496e-8291-97c26f4a5cf4","Type":"ContainerDied","Data":"6e454efcce2322cb2b27e2958b65d5c5cec1a31d09460fc3eb401c7669e77566"} Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.127242 5033 generic.go:334] "Generic (PLEG): container finished" podID="e3a20a7d-c65b-4934-b634-bcda152fd8a8" containerID="89155be67a1e0cd99fb739f0eafea1631bef2d1d40b50d138f19cfb4a8081dd8" exitCode=0 Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.127430 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a96f-account-create-update-hxbdp" 
event={"ID":"e3a20a7d-c65b-4934-b634-bcda152fd8a8","Type":"ContainerDied","Data":"89155be67a1e0cd99fb739f0eafea1631bef2d1d40b50d138f19cfb4a8081dd8"} Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.543639 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7qh6c" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.611106 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/019ce90c-a2fb-4c62-90e1-13a61c3190f8-operator-scripts\") pod \"019ce90c-a2fb-4c62-90e1-13a61c3190f8\" (UID: \"019ce90c-a2fb-4c62-90e1-13a61c3190f8\") " Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.611459 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwjn6\" (UniqueName: \"kubernetes.io/projected/019ce90c-a2fb-4c62-90e1-13a61c3190f8-kube-api-access-gwjn6\") pod \"019ce90c-a2fb-4c62-90e1-13a61c3190f8\" (UID: \"019ce90c-a2fb-4c62-90e1-13a61c3190f8\") " Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.611710 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/019ce90c-a2fb-4c62-90e1-13a61c3190f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "019ce90c-a2fb-4c62-90e1-13a61c3190f8" (UID: "019ce90c-a2fb-4c62-90e1-13a61c3190f8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.612030 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/019ce90c-a2fb-4c62-90e1-13a61c3190f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.618688 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/019ce90c-a2fb-4c62-90e1-13a61c3190f8-kube-api-access-gwjn6" (OuterVolumeSpecName: "kube-api-access-gwjn6") pod "019ce90c-a2fb-4c62-90e1-13a61c3190f8" (UID: "019ce90c-a2fb-4c62-90e1-13a61c3190f8"). InnerVolumeSpecName "kube-api-access-gwjn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.700067 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-09eb-account-create-update-hhcv8" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.707411 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fqlpb" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.715282 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwjn6\" (UniqueName: \"kubernetes.io/projected/019ce90c-a2fb-4c62-90e1-13a61c3190f8-kube-api-access-gwjn6\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.817278 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n92km\" (UniqueName: \"kubernetes.io/projected/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-kube-api-access-n92km\") pod \"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c\" (UID: \"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c\") " Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.817371 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-operator-scripts\") pod \"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c\" (UID: \"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c\") " Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.817535 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7157e0-f691-44e5-b73a-8cb559993892-operator-scripts\") pod \"7c7157e0-f691-44e5-b73a-8cb559993892\" (UID: \"7c7157e0-f691-44e5-b73a-8cb559993892\") " Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.817591 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdbq5\" (UniqueName: \"kubernetes.io/projected/7c7157e0-f691-44e5-b73a-8cb559993892-kube-api-access-cdbq5\") pod \"7c7157e0-f691-44e5-b73a-8cb559993892\" (UID: \"7c7157e0-f691-44e5-b73a-8cb559993892\") " Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.818440 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e31358f-1ef9-46c9-9eaf-209f8a78cd9c" (UID: "0e31358f-1ef9-46c9-9eaf-209f8a78cd9c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.818628 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c7157e0-f691-44e5-b73a-8cb559993892-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c7157e0-f691-44e5-b73a-8cb559993892" (UID: "7c7157e0-f691-44e5-b73a-8cb559993892"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.820968 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-kube-api-access-n92km" (OuterVolumeSpecName: "kube-api-access-n92km") pod "0e31358f-1ef9-46c9-9eaf-209f8a78cd9c" (UID: "0e31358f-1ef9-46c9-9eaf-209f8a78cd9c"). InnerVolumeSpecName "kube-api-access-n92km". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.821450 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c7157e0-f691-44e5-b73a-8cb559993892-kube-api-access-cdbq5" (OuterVolumeSpecName: "kube-api-access-cdbq5") pod "7c7157e0-f691-44e5-b73a-8cb559993892" (UID: "7c7157e0-f691-44e5-b73a-8cb559993892"). InnerVolumeSpecName "kube-api-access-cdbq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.919832 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c7157e0-f691-44e5-b73a-8cb559993892-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.919872 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdbq5\" (UniqueName: \"kubernetes.io/projected/7c7157e0-f691-44e5-b73a-8cb559993892-kube-api-access-cdbq5\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.919884 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n92km\" (UniqueName: \"kubernetes.io/projected/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-kube-api-access-n92km\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:52 crc kubenswrapper[5033]: I0226 17:16:52.919893 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.139165 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-09eb-account-create-update-hhcv8" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.139146 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-09eb-account-create-update-hhcv8" event={"ID":"7c7157e0-f691-44e5-b73a-8cb559993892","Type":"ContainerDied","Data":"769046c810a9411a54d8b200f2482d2123a6c4dc7027413a94cdb0bb0ee94658"} Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.139624 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="769046c810a9411a54d8b200f2482d2123a6c4dc7027413a94cdb0bb0ee94658" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.143391 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fqlpb" event={"ID":"0e31358f-1ef9-46c9-9eaf-209f8a78cd9c","Type":"ContainerDied","Data":"cf145df3fe1ffe4664f6cb03c50c9aee1c190bbe4afab880e9df48ebe4e0fcbc"} Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.143441 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf145df3fe1ffe4664f6cb03c50c9aee1c190bbe4afab880e9df48ebe4e0fcbc" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.143498 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fqlpb" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.145302 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7qh6c" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.145379 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7qh6c" event={"ID":"019ce90c-a2fb-4c62-90e1-13a61c3190f8","Type":"ContainerDied","Data":"1199df72908b33cf49af94b2ba3c9c88ecf42c6e7eb4f44bbc5e93b61c92177a"} Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.145409 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1199df72908b33cf49af94b2ba3c9c88ecf42c6e7eb4f44bbc5e93b61c92177a" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.652163 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gc7xz" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.656593 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1590-account-create-update-nq5gt" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.662395 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a96f-account-create-update-hxbdp" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.737986 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk8mq\" (UniqueName: \"kubernetes.io/projected/b00dae15-cf05-496e-8291-97c26f4a5cf4-kube-api-access-lk8mq\") pod \"b00dae15-cf05-496e-8291-97c26f4a5cf4\" (UID: \"b00dae15-cf05-496e-8291-97c26f4a5cf4\") " Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.738073 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00dae15-cf05-496e-8291-97c26f4a5cf4-operator-scripts\") pod \"b00dae15-cf05-496e-8291-97c26f4a5cf4\" (UID: \"b00dae15-cf05-496e-8291-97c26f4a5cf4\") " Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.738097 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a20a7d-c65b-4934-b634-bcda152fd8a8-operator-scripts\") pod \"e3a20a7d-c65b-4934-b634-bcda152fd8a8\" (UID: \"e3a20a7d-c65b-4934-b634-bcda152fd8a8\") " Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.738123 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc806cf-c9e4-43dc-895a-5b5186d1416c-operator-scripts\") pod \"7bc806cf-c9e4-43dc-895a-5b5186d1416c\" (UID: \"7bc806cf-c9e4-43dc-895a-5b5186d1416c\") " Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.738201 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ltmw\" (UniqueName: \"kubernetes.io/projected/7bc806cf-c9e4-43dc-895a-5b5186d1416c-kube-api-access-9ltmw\") pod \"7bc806cf-c9e4-43dc-895a-5b5186d1416c\" (UID: \"7bc806cf-c9e4-43dc-895a-5b5186d1416c\") " Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.738284 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jc56c\" (UniqueName: \"kubernetes.io/projected/e3a20a7d-c65b-4934-b634-bcda152fd8a8-kube-api-access-jc56c\") pod \"e3a20a7d-c65b-4934-b634-bcda152fd8a8\" (UID: \"e3a20a7d-c65b-4934-b634-bcda152fd8a8\") " Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.738661 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc806cf-c9e4-43dc-895a-5b5186d1416c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bc806cf-c9e4-43dc-895a-5b5186d1416c" (UID: "7bc806cf-c9e4-43dc-895a-5b5186d1416c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.738754 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b00dae15-cf05-496e-8291-97c26f4a5cf4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b00dae15-cf05-496e-8291-97c26f4a5cf4" (UID: "b00dae15-cf05-496e-8291-97c26f4a5cf4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.738776 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a20a7d-c65b-4934-b634-bcda152fd8a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3a20a7d-c65b-4934-b634-bcda152fd8a8" (UID: "e3a20a7d-c65b-4934-b634-bcda152fd8a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.742958 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc806cf-c9e4-43dc-895a-5b5186d1416c-kube-api-access-9ltmw" (OuterVolumeSpecName: "kube-api-access-9ltmw") pod "7bc806cf-c9e4-43dc-895a-5b5186d1416c" (UID: "7bc806cf-c9e4-43dc-895a-5b5186d1416c"). 
InnerVolumeSpecName "kube-api-access-9ltmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.743044 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a20a7d-c65b-4934-b634-bcda152fd8a8-kube-api-access-jc56c" (OuterVolumeSpecName: "kube-api-access-jc56c") pod "e3a20a7d-c65b-4934-b634-bcda152fd8a8" (UID: "e3a20a7d-c65b-4934-b634-bcda152fd8a8"). InnerVolumeSpecName "kube-api-access-jc56c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.743060 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00dae15-cf05-496e-8291-97c26f4a5cf4-kube-api-access-lk8mq" (OuterVolumeSpecName: "kube-api-access-lk8mq") pod "b00dae15-cf05-496e-8291-97c26f4a5cf4" (UID: "b00dae15-cf05-496e-8291-97c26f4a5cf4"). InnerVolumeSpecName "kube-api-access-lk8mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.840047 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk8mq\" (UniqueName: \"kubernetes.io/projected/b00dae15-cf05-496e-8291-97c26f4a5cf4-kube-api-access-lk8mq\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.840081 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b00dae15-cf05-496e-8291-97c26f4a5cf4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.840094 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a20a7d-c65b-4934-b634-bcda152fd8a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.840106 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/7bc806cf-c9e4-43dc-895a-5b5186d1416c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.840117 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ltmw\" (UniqueName: \"kubernetes.io/projected/7bc806cf-c9e4-43dc-895a-5b5186d1416c-kube-api-access-9ltmw\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:53 crc kubenswrapper[5033]: I0226 17:16:53.840128 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc56c\" (UniqueName: \"kubernetes.io/projected/e3a20a7d-c65b-4934-b634-bcda152fd8a8-kube-api-access-jc56c\") on node \"crc\" DevicePath \"\"" Feb 26 17:16:54 crc kubenswrapper[5033]: I0226 17:16:54.159715 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a96f-account-create-update-hxbdp" event={"ID":"e3a20a7d-c65b-4934-b634-bcda152fd8a8","Type":"ContainerDied","Data":"86c28870a667f9982b6c7ed55af5a5f7795791a06ff74f3f579836c3ba8ae2c8"} Feb 26 17:16:54 crc kubenswrapper[5033]: I0226 17:16:54.159790 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86c28870a667f9982b6c7ed55af5a5f7795791a06ff74f3f579836c3ba8ae2c8" Feb 26 17:16:54 crc kubenswrapper[5033]: I0226 17:16:54.159866 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a96f-account-create-update-hxbdp" Feb 26 17:16:54 crc kubenswrapper[5033]: I0226 17:16:54.163186 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1590-account-create-update-nq5gt" Feb 26 17:16:54 crc kubenswrapper[5033]: I0226 17:16:54.163206 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1590-account-create-update-nq5gt" event={"ID":"7bc806cf-c9e4-43dc-895a-5b5186d1416c","Type":"ContainerDied","Data":"d52ee73f28dec4802a73fd5bb22fd5ef66e02a8d94ec77abd8720b4b33e0eb8a"} Feb 26 17:16:54 crc kubenswrapper[5033]: I0226 17:16:54.163318 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d52ee73f28dec4802a73fd5bb22fd5ef66e02a8d94ec77abd8720b4b33e0eb8a" Feb 26 17:16:54 crc kubenswrapper[5033]: I0226 17:16:54.165807 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gc7xz" event={"ID":"b00dae15-cf05-496e-8291-97c26f4a5cf4","Type":"ContainerDied","Data":"9720c7e741bf0c6515ffb269b8fa08369e37bdff1a6f183367bd63e4cb0b7b72"} Feb 26 17:16:54 crc kubenswrapper[5033]: I0226 17:16:54.165838 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9720c7e741bf0c6515ffb269b8fa08369e37bdff1a6f183367bd63e4cb0b7b72" Feb 26 17:16:54 crc kubenswrapper[5033]: I0226 17:16:54.165905 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-gc7xz" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.119388 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bwn8f"] Feb 26 17:16:55 crc kubenswrapper[5033]: E0226 17:16:55.119804 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc806cf-c9e4-43dc-895a-5b5186d1416c" containerName="mariadb-account-create-update" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.119828 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc806cf-c9e4-43dc-895a-5b5186d1416c" containerName="mariadb-account-create-update" Feb 26 17:16:55 crc kubenswrapper[5033]: E0226 17:16:55.119845 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019ce90c-a2fb-4c62-90e1-13a61c3190f8" containerName="mariadb-database-create" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.119853 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="019ce90c-a2fb-4c62-90e1-13a61c3190f8" containerName="mariadb-database-create" Feb 26 17:16:55 crc kubenswrapper[5033]: E0226 17:16:55.119880 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00dae15-cf05-496e-8291-97c26f4a5cf4" containerName="mariadb-database-create" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.119888 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00dae15-cf05-496e-8291-97c26f4a5cf4" containerName="mariadb-database-create" Feb 26 17:16:55 crc kubenswrapper[5033]: E0226 17:16:55.119904 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e31358f-1ef9-46c9-9eaf-209f8a78cd9c" containerName="mariadb-database-create" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.119911 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e31358f-1ef9-46c9-9eaf-209f8a78cd9c" containerName="mariadb-database-create" Feb 26 17:16:55 crc kubenswrapper[5033]: E0226 17:16:55.119930 5033 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e3a20a7d-c65b-4934-b634-bcda152fd8a8" containerName="mariadb-account-create-update" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.119939 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a20a7d-c65b-4934-b634-bcda152fd8a8" containerName="mariadb-account-create-update" Feb 26 17:16:55 crc kubenswrapper[5033]: E0226 17:16:55.119951 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7157e0-f691-44e5-b73a-8cb559993892" containerName="mariadb-account-create-update" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.119960 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7157e0-f691-44e5-b73a-8cb559993892" containerName="mariadb-account-create-update" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.120167 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc806cf-c9e4-43dc-895a-5b5186d1416c" containerName="mariadb-account-create-update" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.120187 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="019ce90c-a2fb-4c62-90e1-13a61c3190f8" containerName="mariadb-database-create" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.120212 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7157e0-f691-44e5-b73a-8cb559993892" containerName="mariadb-account-create-update" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.120223 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e31358f-1ef9-46c9-9eaf-209f8a78cd9c" containerName="mariadb-database-create" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.120238 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00dae15-cf05-496e-8291-97c26f4a5cf4" containerName="mariadb-database-create" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.120250 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a20a7d-c65b-4934-b634-bcda152fd8a8" 
containerName="mariadb-account-create-update" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.120987 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.124049 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.124390 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-l4n5w" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.124793 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.129889 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bwn8f"] Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.166737 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.166828 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-scripts\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.166862 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jhxr\" (UniqueName: 
\"kubernetes.io/projected/12a99f35-d435-4af6-9c92-35202db780d1-kube-api-access-6jhxr\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.166893 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-config-data\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.269039 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-scripts\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.269093 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jhxr\" (UniqueName: \"kubernetes.io/projected/12a99f35-d435-4af6-9c92-35202db780d1-kube-api-access-6jhxr\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.269130 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-config-data\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.269210 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.272712 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-scripts\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.273540 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.285330 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jhxr\" (UniqueName: \"kubernetes.io/projected/12a99f35-d435-4af6-9c92-35202db780d1-kube-api-access-6jhxr\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.288123 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-config-data\") pod \"nova-cell0-conductor-db-sync-bwn8f\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.483503 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:16:55 crc kubenswrapper[5033]: I0226 17:16:55.931015 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bwn8f"] Feb 26 17:16:55 crc kubenswrapper[5033]: W0226 17:16:55.935277 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12a99f35_d435_4af6_9c92_35202db780d1.slice/crio-0203e6429e12a8ecebd40890b2646bb2d0c4bc9fc2abafe54b986310a57040e4 WatchSource:0}: Error finding container 0203e6429e12a8ecebd40890b2646bb2d0c4bc9fc2abafe54b986310a57040e4: Status 404 returned error can't find the container with id 0203e6429e12a8ecebd40890b2646bb2d0c4bc9fc2abafe54b986310a57040e4 Feb 26 17:16:56 crc kubenswrapper[5033]: I0226 17:16:56.182278 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bwn8f" event={"ID":"12a99f35-d435-4af6-9c92-35202db780d1","Type":"ContainerStarted","Data":"6bc2910fe6c4c284470058468f3a142b87240ba0523a2be94487b4d1a1ec0373"} Feb 26 17:16:56 crc kubenswrapper[5033]: I0226 17:16:56.182327 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bwn8f" event={"ID":"12a99f35-d435-4af6-9c92-35202db780d1","Type":"ContainerStarted","Data":"0203e6429e12a8ecebd40890b2646bb2d0c4bc9fc2abafe54b986310a57040e4"} Feb 26 17:16:56 crc kubenswrapper[5033]: I0226 17:16:56.202016 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-bwn8f" podStartSLOduration=1.201996734 podStartE2EDuration="1.201996734s" podCreationTimestamp="2026-02-26 17:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:16:56.195683459 +0000 UTC m=+5743.408282109" watchObservedRunningTime="2026-02-26 17:16:56.201996734 +0000 UTC 
m=+5743.414595384" Feb 26 17:17:02 crc kubenswrapper[5033]: I0226 17:17:02.245493 5033 generic.go:334] "Generic (PLEG): container finished" podID="12a99f35-d435-4af6-9c92-35202db780d1" containerID="6bc2910fe6c4c284470058468f3a142b87240ba0523a2be94487b4d1a1ec0373" exitCode=0 Feb 26 17:17:02 crc kubenswrapper[5033]: I0226 17:17:02.245602 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bwn8f" event={"ID":"12a99f35-d435-4af6-9c92-35202db780d1","Type":"ContainerDied","Data":"6bc2910fe6c4c284470058468f3a142b87240ba0523a2be94487b4d1a1ec0373"} Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.604813 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.726230 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-combined-ca-bundle\") pod \"12a99f35-d435-4af6-9c92-35202db780d1\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.726621 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-config-data\") pod \"12a99f35-d435-4af6-9c92-35202db780d1\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.726708 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jhxr\" (UniqueName: \"kubernetes.io/projected/12a99f35-d435-4af6-9c92-35202db780d1-kube-api-access-6jhxr\") pod \"12a99f35-d435-4af6-9c92-35202db780d1\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.726745 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-scripts\") pod \"12a99f35-d435-4af6-9c92-35202db780d1\" (UID: \"12a99f35-d435-4af6-9c92-35202db780d1\") " Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.733166 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12a99f35-d435-4af6-9c92-35202db780d1-kube-api-access-6jhxr" (OuterVolumeSpecName: "kube-api-access-6jhxr") pod "12a99f35-d435-4af6-9c92-35202db780d1" (UID: "12a99f35-d435-4af6-9c92-35202db780d1"). InnerVolumeSpecName "kube-api-access-6jhxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.734501 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-scripts" (OuterVolumeSpecName: "scripts") pod "12a99f35-d435-4af6-9c92-35202db780d1" (UID: "12a99f35-d435-4af6-9c92-35202db780d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.750705 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12a99f35-d435-4af6-9c92-35202db780d1" (UID: "12a99f35-d435-4af6-9c92-35202db780d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.766456 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-config-data" (OuterVolumeSpecName: "config-data") pod "12a99f35-d435-4af6-9c92-35202db780d1" (UID: "12a99f35-d435-4af6-9c92-35202db780d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.828931 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.829097 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.829164 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jhxr\" (UniqueName: \"kubernetes.io/projected/12a99f35-d435-4af6-9c92-35202db780d1-kube-api-access-6jhxr\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:03 crc kubenswrapper[5033]: I0226 17:17:03.829231 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12a99f35-d435-4af6-9c92-35202db780d1-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.264416 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bwn8f" event={"ID":"12a99f35-d435-4af6-9c92-35202db780d1","Type":"ContainerDied","Data":"0203e6429e12a8ecebd40890b2646bb2d0c4bc9fc2abafe54b986310a57040e4"} Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.264533 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0203e6429e12a8ecebd40890b2646bb2d0c4bc9fc2abafe54b986310a57040e4" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.264494 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bwn8f" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.336343 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 17:17:04 crc kubenswrapper[5033]: E0226 17:17:04.336792 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a99f35-d435-4af6-9c92-35202db780d1" containerName="nova-cell0-conductor-db-sync" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.336820 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a99f35-d435-4af6-9c92-35202db780d1" containerName="nova-cell0-conductor-db-sync" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.337095 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a99f35-d435-4af6-9c92-35202db780d1" containerName="nova-cell0-conductor-db-sync" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.338007 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.339788 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-l4n5w" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.339867 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.352145 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.437167 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 
17:17:04.437458 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.437545 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pcxd\" (UniqueName: \"kubernetes.io/projected/59148331-e81c-4ca3-aeb1-702fe06efa3a-kube-api-access-9pcxd\") pod \"nova-cell0-conductor-0\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.539541 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.539629 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pcxd\" (UniqueName: \"kubernetes.io/projected/59148331-e81c-4ca3-aeb1-702fe06efa3a-kube-api-access-9pcxd\") pod \"nova-cell0-conductor-0\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.539940 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.543993 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.544084 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.557074 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pcxd\" (UniqueName: \"kubernetes.io/projected/59148331-e81c-4ca3-aeb1-702fe06efa3a-kube-api-access-9pcxd\") pod \"nova-cell0-conductor-0\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:04 crc kubenswrapper[5033]: I0226 17:17:04.658600 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:05 crc kubenswrapper[5033]: I0226 17:17:05.098562 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 17:17:05 crc kubenswrapper[5033]: I0226 17:17:05.273695 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"59148331-e81c-4ca3-aeb1-702fe06efa3a","Type":"ContainerStarted","Data":"53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3"} Feb 26 17:17:05 crc kubenswrapper[5033]: I0226 17:17:05.273745 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"59148331-e81c-4ca3-aeb1-702fe06efa3a","Type":"ContainerStarted","Data":"070a151dec086da4715253a937784889cb1ad35a6578e556e56bb1596b99955b"} Feb 26 17:17:05 crc kubenswrapper[5033]: I0226 17:17:05.274824 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:05 crc kubenswrapper[5033]: I0226 17:17:05.291361 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.291343473 podStartE2EDuration="1.291343473s" podCreationTimestamp="2026-02-26 17:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:05.287556851 +0000 UTC m=+5752.500155501" watchObservedRunningTime="2026-02-26 17:17:05.291343473 +0000 UTC m=+5752.503942123" Feb 26 17:17:05 crc kubenswrapper[5033]: I0226 17:17:05.629245 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:17:05 crc kubenswrapper[5033]: E0226 17:17:05.629566 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:17:14 crc kubenswrapper[5033]: I0226 17:17:14.701782 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.137376 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9rsvh"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.138660 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.144700 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.148325 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.171380 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9rsvh"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.242809 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-scripts\") pod \"nova-cell0-cell-mapping-9rsvh\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.242904 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-config-data\") pod \"nova-cell0-cell-mapping-9rsvh\" (UID: 
\"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.242986 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9rsvh\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.243021 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4872\" (UniqueName: \"kubernetes.io/projected/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-kube-api-access-f4872\") pod \"nova-cell0-cell-mapping-9rsvh\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.344427 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-config-data\") pod \"nova-cell0-cell-mapping-9rsvh\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.344531 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9rsvh\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.344566 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4872\" (UniqueName: \"kubernetes.io/projected/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-kube-api-access-f4872\") pod 
\"nova-cell0-cell-mapping-9rsvh\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.344592 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-scripts\") pod \"nova-cell0-cell-mapping-9rsvh\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.352205 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-config-data\") pod \"nova-cell0-cell-mapping-9rsvh\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.363000 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9rsvh\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.372217 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-scripts\") pod \"nova-cell0-cell-mapping-9rsvh\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.393929 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4872\" (UniqueName: \"kubernetes.io/projected/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-kube-api-access-f4872\") pod \"nova-cell0-cell-mapping-9rsvh\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " 
pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.395887 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.457014 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.467379 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.469762 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.514073 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.549493 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.551880 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.560106 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.560278 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d457d-03fa-407f-a152-8ec8c898095f-logs\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.560328 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.560348 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dt5b\" (UniqueName: \"kubernetes.io/projected/789d457d-03fa-407f-a152-8ec8c898095f-kube-api-access-6dt5b\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.560519 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-config-data\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.614363 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.663783 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d457d-03fa-407f-a152-8ec8c898095f-logs\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.665492 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.664495 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d457d-03fa-407f-a152-8ec8c898095f-logs\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.669298 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.670771 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.670792 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dt5b\" (UniqueName: \"kubernetes.io/projected/789d457d-03fa-407f-a152-8ec8c898095f-kube-api-access-6dt5b\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.671873 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-config-data\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.673620 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.681625 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-config-data\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.682103 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.694032 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dt5b\" (UniqueName: \"kubernetes.io/projected/789d457d-03fa-407f-a152-8ec8c898095f-kube-api-access-6dt5b\") pod \"nova-api-0\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " pod="openstack/nova-api-0" Feb 26 
17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.712395 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.721729 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c87bcb4cc-724jv"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.724135 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.756519 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c87bcb4cc-724jv"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.769894 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.774430 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.774493 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-config-data\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.774518 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbswh\" (UniqueName: \"kubernetes.io/projected/d627150f-9aac-45c2-8dec-39a344fef2d2-kube-api-access-rbswh\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc 
kubenswrapper[5033]: I0226 17:17:15.775869 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d627150f-9aac-45c2-8dec-39a344fef2d2-logs\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.775919 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.778863 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.782078 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.828835 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.879665 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45b9t\" (UniqueName: \"kubernetes.io/projected/37c09ad3-7436-4c25-bc1f-61b08f872396-kube-api-access-45b9t\") pod \"nova-scheduler-0\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.879736 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.879789 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.879809 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-config-data\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.879830 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-sb\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.879852 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbswh\" (UniqueName: \"kubernetes.io/projected/d627150f-9aac-45c2-8dec-39a344fef2d2-kube-api-access-rbswh\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.879882 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-config\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.879911 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.879937 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9tgc\" (UniqueName: \"kubernetes.io/projected/40bae61a-5728-4f79-88c4-0a66ab71dff4-kube-api-access-s9tgc\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.879967 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.880017 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-config-data\") pod \"nova-scheduler-0\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.880043 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-nb\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.880076 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfs44\" (UniqueName: 
\"kubernetes.io/projected/ec4cbe35-7184-45d4-b140-df2a2bac4020-kube-api-access-wfs44\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.880133 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d627150f-9aac-45c2-8dec-39a344fef2d2-logs\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.880339 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-dns-svc\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.880619 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d627150f-9aac-45c2-8dec-39a344fef2d2-logs\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.885175 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.886675 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-config-data\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 
17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.897775 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbswh\" (UniqueName: \"kubernetes.io/projected/d627150f-9aac-45c2-8dec-39a344fef2d2-kube-api-access-rbswh\") pod \"nova-metadata-0\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " pod="openstack/nova-metadata-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.980997 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-config-data\") pod \"nova-scheduler-0\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.981430 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-nb\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.981485 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfs44\" (UniqueName: \"kubernetes.io/projected/ec4cbe35-7184-45d4-b140-df2a2bac4020-kube-api-access-wfs44\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.981532 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-dns-svc\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.981585 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-45b9t\" (UniqueName: \"kubernetes.io/projected/37c09ad3-7436-4c25-bc1f-61b08f872396-kube-api-access-45b9t\") pod \"nova-scheduler-0\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.981638 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-sb\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.981687 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.981802 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-config\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.981833 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.981866 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9tgc\" (UniqueName: 
\"kubernetes.io/projected/40bae61a-5728-4f79-88c4-0a66ab71dff4-kube-api-access-s9tgc\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.981903 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.982780 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-sb\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.982977 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-nb\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.983000 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-dns-svc\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.984329 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-config\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: 
\"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.986407 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-config-data\") pod \"nova-scheduler-0\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.996498 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.998362 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:15 crc kubenswrapper[5033]: I0226 17:17:15.998753 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.006008 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9tgc\" (UniqueName: \"kubernetes.io/projected/40bae61a-5728-4f79-88c4-0a66ab71dff4-kube-api-access-s9tgc\") pod \"dnsmasq-dns-7c87bcb4cc-724jv\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") " pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.013450 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45b9t\" (UniqueName: \"kubernetes.io/projected/37c09ad3-7436-4c25-bc1f-61b08f872396-kube-api-access-45b9t\") pod \"nova-scheduler-0\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.014060 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfs44\" (UniqueName: \"kubernetes.io/projected/ec4cbe35-7184-45d4-b140-df2a2bac4020-kube-api-access-wfs44\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.035012 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.058151 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.070812 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.101213 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.115160 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9rsvh"] Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.316806 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.379943 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wcths"] Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.381374 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.385237 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.385479 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.389819 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wcths"] Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.396600 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzd69\" (UniqueName: \"kubernetes.io/projected/c674b55b-5acf-421a-8772-fea6c2e3ad3d-kube-api-access-lzd69\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.396704 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-config-data\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.396753 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.396845 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-scripts\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.417277 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"789d457d-03fa-407f-a152-8ec8c898095f","Type":"ContainerStarted","Data":"2ee1c732eb7476c0a2dbb0054942d6fcfb4dba0cfc9abe0f71e7e8eb57094599"} Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.423904 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9rsvh" event={"ID":"407bdbec-fe4f-4726-baa6-061f9c9ebc6d","Type":"ContainerStarted","Data":"be5b8344503a21ecc6491df0c4f148730106dcca84cd1fd516299bfb8c9b9398"} Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.442395 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9rsvh" podStartSLOduration=1.44237772 podStartE2EDuration="1.44237772s" podCreationTimestamp="2026-02-26 17:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:16.441721043 +0000 UTC m=+5763.654319713" watchObservedRunningTime="2026-02-26 17:17:16.44237772 +0000 UTC m=+5763.654976370" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.498370 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-scripts\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.498499 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-lzd69\" (UniqueName: \"kubernetes.io/projected/c674b55b-5acf-421a-8772-fea6c2e3ad3d-kube-api-access-lzd69\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.498572 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-config-data\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.498612 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.514086 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-scripts\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.514367 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-config-data\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.517374 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lzd69\" (UniqueName: \"kubernetes.io/projected/c674b55b-5acf-421a-8772-fea6c2e3ad3d-kube-api-access-lzd69\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.528424 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wcths\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.590240 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.629168 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:17:16 crc kubenswrapper[5033]: E0226 17:17:16.629568 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.690207 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:17:16 crc kubenswrapper[5033]: W0226 17:17:16.694586 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37c09ad3_7436_4c25_bc1f_61b08f872396.slice/crio-976919a54d912a75c72345da6464419f064258755b01d20354faea0f9901c490 
WatchSource:0}: Error finding container 976919a54d912a75c72345da6464419f064258755b01d20354faea0f9901c490: Status 404 returned error can't find the container with id 976919a54d912a75c72345da6464419f064258755b01d20354faea0f9901c490 Feb 26 17:17:16 crc kubenswrapper[5033]: W0226 17:17:16.699109 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40bae61a_5728_4f79_88c4_0a66ab71dff4.slice/crio-99759a25d9e7b4a778ac66bf621475a0e9305c090e1e2f8cce79da5e17ab1c15 WatchSource:0}: Error finding container 99759a25d9e7b4a778ac66bf621475a0e9305c090e1e2f8cce79da5e17ab1c15: Status 404 returned error can't find the container with id 99759a25d9e7b4a778ac66bf621475a0e9305c090e1e2f8cce79da5e17ab1c15 Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.709816 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c87bcb4cc-724jv"] Feb 26 17:17:16 crc kubenswrapper[5033]: W0226 17:17:16.723807 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd627150f_9aac_45c2_8dec_39a344fef2d2.slice/crio-ec34a501aaff6359b03c765a37d9c53581f07ac5c20e71de163f97000513e605 WatchSource:0}: Error finding container ec34a501aaff6359b03c765a37d9c53581f07ac5c20e71de163f97000513e605: Status 404 returned error can't find the container with id ec34a501aaff6359b03c765a37d9c53581f07ac5c20e71de163f97000513e605 Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.737992 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 17:17:16 crc kubenswrapper[5033]: I0226 17:17:16.869773 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 17:17:16 crc kubenswrapper[5033]: W0226 17:17:16.882673 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec4cbe35_7184_45d4_b140_df2a2bac4020.slice/crio-cc8a5e0e1e2d59c6d5449520bc2d0844ecdc78ea2061fca4472583634543fe28 WatchSource:0}: Error finding container cc8a5e0e1e2d59c6d5449520bc2d0844ecdc78ea2061fca4472583634543fe28: Status 404 returned error can't find the container with id cc8a5e0e1e2d59c6d5449520bc2d0844ecdc78ea2061fca4472583634543fe28 Feb 26 17:17:17 crc kubenswrapper[5033]: W0226 17:17:17.153745 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc674b55b_5acf_421a_8772_fea6c2e3ad3d.slice/crio-94acaacc22bb148e3cd203f17ff3ef7fd9edea78f13e88f70f059f98474c0be3 WatchSource:0}: Error finding container 94acaacc22bb148e3cd203f17ff3ef7fd9edea78f13e88f70f059f98474c0be3: Status 404 returned error can't find the container with id 94acaacc22bb148e3cd203f17ff3ef7fd9edea78f13e88f70f059f98474c0be3 Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.159999 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wcths"] Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.436172 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d627150f-9aac-45c2-8dec-39a344fef2d2","Type":"ContainerStarted","Data":"d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.436524 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d627150f-9aac-45c2-8dec-39a344fef2d2","Type":"ContainerStarted","Data":"a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.436536 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d627150f-9aac-45c2-8dec-39a344fef2d2","Type":"ContainerStarted","Data":"ec34a501aaff6359b03c765a37d9c53581f07ac5c20e71de163f97000513e605"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.439665 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"789d457d-03fa-407f-a152-8ec8c898095f","Type":"ContainerStarted","Data":"6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.439722 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"789d457d-03fa-407f-a152-8ec8c898095f","Type":"ContainerStarted","Data":"1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.441450 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wcths" event={"ID":"c674b55b-5acf-421a-8772-fea6c2e3ad3d","Type":"ContainerStarted","Data":"0561d285f8af60c4647c398e67aa76efbdae27793c11ce3e75367699fb17beb5"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.441482 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wcths" event={"ID":"c674b55b-5acf-421a-8772-fea6c2e3ad3d","Type":"ContainerStarted","Data":"94acaacc22bb148e3cd203f17ff3ef7fd9edea78f13e88f70f059f98474c0be3"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.445485 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9rsvh" event={"ID":"407bdbec-fe4f-4726-baa6-061f9c9ebc6d","Type":"ContainerStarted","Data":"c1a63c5dcac537112cf58361d0485436b317be39972e171e823526ff84a0c62e"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.452908 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"ec4cbe35-7184-45d4-b140-df2a2bac4020","Type":"ContainerStarted","Data":"90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.452964 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ec4cbe35-7184-45d4-b140-df2a2bac4020","Type":"ContainerStarted","Data":"cc8a5e0e1e2d59c6d5449520bc2d0844ecdc78ea2061fca4472583634543fe28"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.459107 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.459093719 podStartE2EDuration="2.459093719s" podCreationTimestamp="2026-02-26 17:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:17.454958318 +0000 UTC m=+5764.667556968" watchObservedRunningTime="2026-02-26 17:17:17.459093719 +0000 UTC m=+5764.671692369" Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.465322 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"37c09ad3-7436-4c25-bc1f-61b08f872396","Type":"ContainerStarted","Data":"18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.465370 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"37c09ad3-7436-4c25-bc1f-61b08f872396","Type":"ContainerStarted","Data":"976919a54d912a75c72345da6464419f064258755b01d20354faea0f9901c490"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.477783 5033 generic.go:334] "Generic (PLEG): container finished" podID="40bae61a-5728-4f79-88c4-0a66ab71dff4" containerID="89d63937d60761beba1c708088cff2a6b3934417617f6e0e403704085e9f1d0b" exitCode=0 Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.477841 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" event={"ID":"40bae61a-5728-4f79-88c4-0a66ab71dff4","Type":"ContainerDied","Data":"89d63937d60761beba1c708088cff2a6b3934417617f6e0e403704085e9f1d0b"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.477870 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" event={"ID":"40bae61a-5728-4f79-88c4-0a66ab71dff4","Type":"ContainerStarted","Data":"99759a25d9e7b4a778ac66bf621475a0e9305c090e1e2f8cce79da5e17ab1c15"} Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.477910 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.477893597 podStartE2EDuration="2.477893597s" podCreationTimestamp="2026-02-26 17:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:17.476556504 +0000 UTC m=+5764.689155154" watchObservedRunningTime="2026-02-26 17:17:17.477893597 +0000 UTC m=+5764.690492247" Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.508699 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wcths" podStartSLOduration=1.508678498 podStartE2EDuration="1.508678498s" podCreationTimestamp="2026-02-26 17:17:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:17.492950745 +0000 UTC m=+5764.705549395" watchObservedRunningTime="2026-02-26 17:17:17.508678498 +0000 UTC m=+5764.721277138" Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.555838 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.555815039 podStartE2EDuration="2.555815039s" podCreationTimestamp="2026-02-26 17:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:17.513584728 +0000 UTC m=+5764.726183378" watchObservedRunningTime="2026-02-26 17:17:17.555815039 +0000 UTC m=+5764.768413679" Feb 26 17:17:17 crc kubenswrapper[5033]: I0226 17:17:17.605574 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.605553232 podStartE2EDuration="2.605553232s" podCreationTimestamp="2026-02-26 17:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:17.569251596 +0000 UTC m=+5764.781850246" watchObservedRunningTime="2026-02-26 17:17:17.605553232 +0000 UTC m=+5764.818151892" Feb 26 17:17:18 crc kubenswrapper[5033]: I0226 17:17:18.493067 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" event={"ID":"40bae61a-5728-4f79-88c4-0a66ab71dff4","Type":"ContainerStarted","Data":"039048c04e1cba311341614048edacb7f1c46b02800a4fee3518e701d866519e"} Feb 26 17:17:18 crc kubenswrapper[5033]: I0226 17:17:18.493789 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:18 crc kubenswrapper[5033]: I0226 17:17:18.523320 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" podStartSLOduration=3.523302376 podStartE2EDuration="3.523302376s" podCreationTimestamp="2026-02-26 17:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:18.517521116 +0000 UTC m=+5765.730119806" watchObservedRunningTime="2026-02-26 17:17:18.523302376 +0000 UTC m=+5765.735901026" Feb 26 17:17:20 crc kubenswrapper[5033]: I0226 17:17:20.544233 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="c674b55b-5acf-421a-8772-fea6c2e3ad3d" containerID="0561d285f8af60c4647c398e67aa76efbdae27793c11ce3e75367699fb17beb5" exitCode=0 Feb 26 17:17:20 crc kubenswrapper[5033]: I0226 17:17:20.544850 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wcths" event={"ID":"c674b55b-5acf-421a-8772-fea6c2e3ad3d","Type":"ContainerDied","Data":"0561d285f8af60c4647c398e67aa76efbdae27793c11ce3e75367699fb17beb5"} Feb 26 17:17:21 crc kubenswrapper[5033]: I0226 17:17:21.035355 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 17:17:21 crc kubenswrapper[5033]: I0226 17:17:21.035443 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 17:17:21 crc kubenswrapper[5033]: I0226 17:17:21.058411 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 17:17:21 crc kubenswrapper[5033]: I0226 17:17:21.102517 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:21 crc kubenswrapper[5033]: I0226 17:17:21.558630 5033 generic.go:334] "Generic (PLEG): container finished" podID="407bdbec-fe4f-4726-baa6-061f9c9ebc6d" containerID="c1a63c5dcac537112cf58361d0485436b317be39972e171e823526ff84a0c62e" exitCode=0 Feb 26 17:17:21 crc kubenswrapper[5033]: I0226 17:17:21.558728 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9rsvh" event={"ID":"407bdbec-fe4f-4726-baa6-061f9c9ebc6d","Type":"ContainerDied","Data":"c1a63c5dcac537112cf58361d0485436b317be39972e171e823526ff84a0c62e"} Feb 26 17:17:21 crc kubenswrapper[5033]: I0226 17:17:21.978549 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.130154 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-config-data\") pod \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.130299 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-scripts\") pod \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.130375 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-combined-ca-bundle\") pod \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.130425 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzd69\" (UniqueName: \"kubernetes.io/projected/c674b55b-5acf-421a-8772-fea6c2e3ad3d-kube-api-access-lzd69\") pod \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\" (UID: \"c674b55b-5acf-421a-8772-fea6c2e3ad3d\") " Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.136460 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c674b55b-5acf-421a-8772-fea6c2e3ad3d-kube-api-access-lzd69" (OuterVolumeSpecName: "kube-api-access-lzd69") pod "c674b55b-5acf-421a-8772-fea6c2e3ad3d" (UID: "c674b55b-5acf-421a-8772-fea6c2e3ad3d"). InnerVolumeSpecName "kube-api-access-lzd69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.136557 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-scripts" (OuterVolumeSpecName: "scripts") pod "c674b55b-5acf-421a-8772-fea6c2e3ad3d" (UID: "c674b55b-5acf-421a-8772-fea6c2e3ad3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.164063 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-config-data" (OuterVolumeSpecName: "config-data") pod "c674b55b-5acf-421a-8772-fea6c2e3ad3d" (UID: "c674b55b-5acf-421a-8772-fea6c2e3ad3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.182615 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c674b55b-5acf-421a-8772-fea6c2e3ad3d" (UID: "c674b55b-5acf-421a-8772-fea6c2e3ad3d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.232719 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.232917 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.233006 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzd69\" (UniqueName: \"kubernetes.io/projected/c674b55b-5acf-421a-8772-fea6c2e3ad3d-kube-api-access-lzd69\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.233070 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c674b55b-5acf-421a-8772-fea6c2e3ad3d-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.569368 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wcths" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.569368 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wcths" event={"ID":"c674b55b-5acf-421a-8772-fea6c2e3ad3d","Type":"ContainerDied","Data":"94acaacc22bb148e3cd203f17ff3ef7fd9edea78f13e88f70f059f98474c0be3"} Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.569639 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94acaacc22bb148e3cd203f17ff3ef7fd9edea78f13e88f70f059f98474c0be3" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.645817 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 17:17:22 crc kubenswrapper[5033]: E0226 17:17:22.646272 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c674b55b-5acf-421a-8772-fea6c2e3ad3d" containerName="nova-cell1-conductor-db-sync" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.646295 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c674b55b-5acf-421a-8772-fea6c2e3ad3d" containerName="nova-cell1-conductor-db-sync" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.646509 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c674b55b-5acf-421a-8772-fea6c2e3ad3d" containerName="nova-cell1-conductor-db-sync" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.647252 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.655555 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.676775 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.740579 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.740717 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cshb8\" (UniqueName: \"kubernetes.io/projected/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-kube-api-access-cshb8\") pod \"nova-cell1-conductor-0\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.740804 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.841964 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:22 crc 
kubenswrapper[5033]: I0226 17:17:22.842062 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cshb8\" (UniqueName: \"kubernetes.io/projected/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-kube-api-access-cshb8\") pod \"nova-cell1-conductor-0\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.842109 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.848880 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.849162 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.855072 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.859061 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cshb8\" (UniqueName: \"kubernetes.io/projected/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-kube-api-access-cshb8\") pod \"nova-cell1-conductor-0\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.943099 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-config-data\") pod \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.943451 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-scripts\") pod \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.943775 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4872\" (UniqueName: \"kubernetes.io/projected/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-kube-api-access-f4872\") pod \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.943912 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-combined-ca-bundle\") pod \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\" (UID: \"407bdbec-fe4f-4726-baa6-061f9c9ebc6d\") " Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.946749 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-scripts" (OuterVolumeSpecName: "scripts") pod "407bdbec-fe4f-4726-baa6-061f9c9ebc6d" (UID: "407bdbec-fe4f-4726-baa6-061f9c9ebc6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.946765 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-kube-api-access-f4872" (OuterVolumeSpecName: "kube-api-access-f4872") pod "407bdbec-fe4f-4726-baa6-061f9c9ebc6d" (UID: "407bdbec-fe4f-4726-baa6-061f9c9ebc6d"). InnerVolumeSpecName "kube-api-access-f4872". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.964480 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-config-data" (OuterVolumeSpecName: "config-data") pod "407bdbec-fe4f-4726-baa6-061f9c9ebc6d" (UID: "407bdbec-fe4f-4726-baa6-061f9c9ebc6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.965809 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "407bdbec-fe4f-4726-baa6-061f9c9ebc6d" (UID: "407bdbec-fe4f-4726-baa6-061f9c9ebc6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:22 crc kubenswrapper[5033]: I0226 17:17:22.969058 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.046447 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.046477 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4872\" (UniqueName: \"kubernetes.io/projected/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-kube-api-access-f4872\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.046489 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.046497 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407bdbec-fe4f-4726-baa6-061f9c9ebc6d-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.403011 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.581643 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7","Type":"ContainerStarted","Data":"89bb0fc3dad0d006a3668222d0d76f2f1cf4074040bd312975731d521cf9ba73"} Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.583813 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9rsvh" event={"ID":"407bdbec-fe4f-4726-baa6-061f9c9ebc6d","Type":"ContainerDied","Data":"be5b8344503a21ecc6491df0c4f148730106dcca84cd1fd516299bfb8c9b9398"} Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.583942 5033 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9rsvh" Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.584069 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be5b8344503a21ecc6491df0c4f148730106dcca84cd1fd516299bfb8c9b9398" Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.766461 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.767796 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="789d457d-03fa-407f-a152-8ec8c898095f" containerName="nova-api-api" containerID="cri-o://6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f" gracePeriod=30 Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.768367 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="789d457d-03fa-407f-a152-8ec8c898095f" containerName="nova-api-log" containerID="cri-o://1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158" gracePeriod=30 Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.778803 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.778999 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="37c09ad3-7436-4c25-bc1f-61b08f872396" containerName="nova-scheduler-scheduler" containerID="cri-o://18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6" gracePeriod=30 Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.810192 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.810468 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="d627150f-9aac-45c2-8dec-39a344fef2d2" containerName="nova-metadata-log" containerID="cri-o://a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac" gracePeriod=30 Feb 26 17:17:23 crc kubenswrapper[5033]: I0226 17:17:23.810606 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d627150f-9aac-45c2-8dec-39a344fef2d2" containerName="nova-metadata-metadata" containerID="cri-o://d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b" gracePeriod=30 Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.266244 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.333933 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.368926 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d457d-03fa-407f-a152-8ec8c898095f-logs\") pod \"789d457d-03fa-407f-a152-8ec8c898095f\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.369025 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-config-data\") pod \"789d457d-03fa-407f-a152-8ec8c898095f\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.369091 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-combined-ca-bundle\") pod \"789d457d-03fa-407f-a152-8ec8c898095f\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.369115 5033 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dt5b\" (UniqueName: \"kubernetes.io/projected/789d457d-03fa-407f-a152-8ec8c898095f-kube-api-access-6dt5b\") pod \"789d457d-03fa-407f-a152-8ec8c898095f\" (UID: \"789d457d-03fa-407f-a152-8ec8c898095f\") " Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.369470 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/789d457d-03fa-407f-a152-8ec8c898095f-logs" (OuterVolumeSpecName: "logs") pod "789d457d-03fa-407f-a152-8ec8c898095f" (UID: "789d457d-03fa-407f-a152-8ec8c898095f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.370483 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d457d-03fa-407f-a152-8ec8c898095f-logs\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.377761 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/789d457d-03fa-407f-a152-8ec8c898095f-kube-api-access-6dt5b" (OuterVolumeSpecName: "kube-api-access-6dt5b") pod "789d457d-03fa-407f-a152-8ec8c898095f" (UID: "789d457d-03fa-407f-a152-8ec8c898095f"). InnerVolumeSpecName "kube-api-access-6dt5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.396163 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "789d457d-03fa-407f-a152-8ec8c898095f" (UID: "789d457d-03fa-407f-a152-8ec8c898095f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.412771 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-config-data" (OuterVolumeSpecName: "config-data") pod "789d457d-03fa-407f-a152-8ec8c898095f" (UID: "789d457d-03fa-407f-a152-8ec8c898095f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.471715 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-config-data\") pod \"d627150f-9aac-45c2-8dec-39a344fef2d2\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.471779 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbswh\" (UniqueName: \"kubernetes.io/projected/d627150f-9aac-45c2-8dec-39a344fef2d2-kube-api-access-rbswh\") pod \"d627150f-9aac-45c2-8dec-39a344fef2d2\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.471832 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-combined-ca-bundle\") pod \"d627150f-9aac-45c2-8dec-39a344fef2d2\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.472038 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d627150f-9aac-45c2-8dec-39a344fef2d2-logs\") pod \"d627150f-9aac-45c2-8dec-39a344fef2d2\" (UID: \"d627150f-9aac-45c2-8dec-39a344fef2d2\") " Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.472666 5033 reconciler_common.go:293] "Volume detached 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.472688 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d457d-03fa-407f-a152-8ec8c898095f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.473133 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dt5b\" (UniqueName: \"kubernetes.io/projected/789d457d-03fa-407f-a152-8ec8c898095f-kube-api-access-6dt5b\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.473040 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d627150f-9aac-45c2-8dec-39a344fef2d2-logs" (OuterVolumeSpecName: "logs") pod "d627150f-9aac-45c2-8dec-39a344fef2d2" (UID: "d627150f-9aac-45c2-8dec-39a344fef2d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.476807 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d627150f-9aac-45c2-8dec-39a344fef2d2-kube-api-access-rbswh" (OuterVolumeSpecName: "kube-api-access-rbswh") pod "d627150f-9aac-45c2-8dec-39a344fef2d2" (UID: "d627150f-9aac-45c2-8dec-39a344fef2d2"). InnerVolumeSpecName "kube-api-access-rbswh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.495298 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-config-data" (OuterVolumeSpecName: "config-data") pod "d627150f-9aac-45c2-8dec-39a344fef2d2" (UID: "d627150f-9aac-45c2-8dec-39a344fef2d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.496649 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d627150f-9aac-45c2-8dec-39a344fef2d2" (UID: "d627150f-9aac-45c2-8dec-39a344fef2d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.574697 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.574731 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbswh\" (UniqueName: \"kubernetes.io/projected/d627150f-9aac-45c2-8dec-39a344fef2d2-kube-api-access-rbswh\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.574740 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d627150f-9aac-45c2-8dec-39a344fef2d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.574748 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d627150f-9aac-45c2-8dec-39a344fef2d2-logs\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.598278 5033 generic.go:334] "Generic (PLEG): container finished" podID="d627150f-9aac-45c2-8dec-39a344fef2d2" containerID="d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b" exitCode=0 Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.598323 5033 generic.go:334] "Generic (PLEG): container finished" podID="d627150f-9aac-45c2-8dec-39a344fef2d2" 
containerID="a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac" exitCode=143 Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.598335 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.598424 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d627150f-9aac-45c2-8dec-39a344fef2d2","Type":"ContainerDied","Data":"d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b"} Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.598457 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d627150f-9aac-45c2-8dec-39a344fef2d2","Type":"ContainerDied","Data":"a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac"} Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.598473 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d627150f-9aac-45c2-8dec-39a344fef2d2","Type":"ContainerDied","Data":"ec34a501aaff6359b03c765a37d9c53581f07ac5c20e71de163f97000513e605"} Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.598493 5033 scope.go:117] "RemoveContainer" containerID="d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.601542 5033 generic.go:334] "Generic (PLEG): container finished" podID="789d457d-03fa-407f-a152-8ec8c898095f" containerID="6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f" exitCode=0 Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.601569 5033 generic.go:334] "Generic (PLEG): container finished" podID="789d457d-03fa-407f-a152-8ec8c898095f" containerID="1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158" exitCode=143 Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.601611 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"789d457d-03fa-407f-a152-8ec8c898095f","Type":"ContainerDied","Data":"6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f"} Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.601632 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"789d457d-03fa-407f-a152-8ec8c898095f","Type":"ContainerDied","Data":"1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158"} Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.601645 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"789d457d-03fa-407f-a152-8ec8c898095f","Type":"ContainerDied","Data":"2ee1c732eb7476c0a2dbb0054942d6fcfb4dba0cfc9abe0f71e7e8eb57094599"} Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.601711 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.604103 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7","Type":"ContainerStarted","Data":"9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3"} Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.605046 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.624805 5033 scope.go:117] "RemoveContainer" containerID="a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.627074 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.627056084 podStartE2EDuration="2.627056084s" podCreationTimestamp="2026-02-26 17:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-26 17:17:24.62356149 +0000 UTC m=+5771.836160160" watchObservedRunningTime="2026-02-26 17:17:24.627056084 +0000 UTC m=+5771.839654734" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.662850 5033 scope.go:117] "RemoveContainer" containerID="d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.663012 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 17:17:24 crc kubenswrapper[5033]: E0226 17:17:24.664722 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b\": container with ID starting with d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b not found: ID does not exist" containerID="d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.664762 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b"} err="failed to get container status \"d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b\": rpc error: code = NotFound desc = could not find container \"d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b\": container with ID starting with d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b not found: ID does not exist" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.664789 5033 scope.go:117] "RemoveContainer" containerID="a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac" Feb 26 17:17:24 crc kubenswrapper[5033]: E0226 17:17:24.665652 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac\": container 
with ID starting with a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac not found: ID does not exist" containerID="a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.665697 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac"} err="failed to get container status \"a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac\": rpc error: code = NotFound desc = could not find container \"a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac\": container with ID starting with a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac not found: ID does not exist" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.665721 5033 scope.go:117] "RemoveContainer" containerID="d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.669384 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b"} err="failed to get container status \"d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b\": rpc error: code = NotFound desc = could not find container \"d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b\": container with ID starting with d060b7d2d3faad20f747f77768aa5d1a910afff2df60af50d519cddb7208207b not found: ID does not exist" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.669418 5033 scope.go:117] "RemoveContainer" containerID="a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.671228 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac"} err="failed to get container 
status \"a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac\": rpc error: code = NotFound desc = could not find container \"a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac\": container with ID starting with a408be21b96c86f1004b36d88fe23b5ff609b8655218560528e4a56c1038b7ac not found: ID does not exist" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.671311 5033 scope.go:117] "RemoveContainer" containerID="6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.682512 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.693965 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.704764 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 17:17:24 crc kubenswrapper[5033]: E0226 17:17:24.705228 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="789d457d-03fa-407f-a152-8ec8c898095f" containerName="nova-api-log" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.705248 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="789d457d-03fa-407f-a152-8ec8c898095f" containerName="nova-api-log" Feb 26 17:17:24 crc kubenswrapper[5033]: E0226 17:17:24.705306 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="789d457d-03fa-407f-a152-8ec8c898095f" containerName="nova-api-api" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.705317 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="789d457d-03fa-407f-a152-8ec8c898095f" containerName="nova-api-api" Feb 26 17:17:24 crc kubenswrapper[5033]: E0226 17:17:24.705345 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d627150f-9aac-45c2-8dec-39a344fef2d2" containerName="nova-metadata-metadata" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 
17:17:24.705356 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d627150f-9aac-45c2-8dec-39a344fef2d2" containerName="nova-metadata-metadata" Feb 26 17:17:24 crc kubenswrapper[5033]: E0226 17:17:24.705382 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407bdbec-fe4f-4726-baa6-061f9c9ebc6d" containerName="nova-manage" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.705395 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="407bdbec-fe4f-4726-baa6-061f9c9ebc6d" containerName="nova-manage" Feb 26 17:17:24 crc kubenswrapper[5033]: E0226 17:17:24.705412 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d627150f-9aac-45c2-8dec-39a344fef2d2" containerName="nova-metadata-log" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.705422 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d627150f-9aac-45c2-8dec-39a344fef2d2" containerName="nova-metadata-log" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.705639 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d627150f-9aac-45c2-8dec-39a344fef2d2" containerName="nova-metadata-log" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.705672 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="789d457d-03fa-407f-a152-8ec8c898095f" containerName="nova-api-log" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.705688 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d627150f-9aac-45c2-8dec-39a344fef2d2" containerName="nova-metadata-metadata" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.705700 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="407bdbec-fe4f-4726-baa6-061f9c9ebc6d" containerName="nova-manage" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.705714 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="789d457d-03fa-407f-a152-8ec8c898095f" containerName="nova-api-api" Feb 26 17:17:24 crc kubenswrapper[5033]: 
I0226 17:17:24.706930 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.718516 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.724676 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.729770 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.740371 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.742506 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.747116 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.751680 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.756793 5033 scope.go:117] "RemoveContainer" containerID="1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.783976 5033 scope.go:117] "RemoveContainer" containerID="6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f" Feb 26 17:17:24 crc kubenswrapper[5033]: E0226 17:17:24.784509 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f\": container with ID starting with 6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f not found: ID does not exist" 
containerID="6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.784596 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f"} err="failed to get container status \"6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f\": rpc error: code = NotFound desc = could not find container \"6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f\": container with ID starting with 6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f not found: ID does not exist" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.784683 5033 scope.go:117] "RemoveContainer" containerID="1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158" Feb 26 17:17:24 crc kubenswrapper[5033]: E0226 17:17:24.785023 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158\": container with ID starting with 1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158 not found: ID does not exist" containerID="1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.785075 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158"} err="failed to get container status \"1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158\": rpc error: code = NotFound desc = could not find container \"1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158\": container with ID starting with 1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158 not found: ID does not exist" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.785104 5033 scope.go:117] 
"RemoveContainer" containerID="6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.785446 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f"} err="failed to get container status \"6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f\": rpc error: code = NotFound desc = could not find container \"6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f\": container with ID starting with 6ae15d4caeafda4a6f605ae531bf5759004ee8f79303574088cf3c0f059b0a1f not found: ID does not exist" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.785538 5033 scope.go:117] "RemoveContainer" containerID="1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.785977 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158"} err="failed to get container status \"1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158\": rpc error: code = NotFound desc = could not find container \"1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158\": container with ID starting with 1503508f0f1daa8d705e07654a7db53a44eb84e693bd5a05c463de5c199ce158 not found: ID does not exist" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.884133 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-config-data\") pod \"nova-metadata-0\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.884330 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-config-data\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.884380 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.884430 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.884573 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6w9g\" (UniqueName: \"kubernetes.io/projected/6100fc2b-831e-4526-8ded-29fd902b3aa7-kube-api-access-v6w9g\") pod \"nova-metadata-0\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.884647 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ghvb\" (UniqueName: \"kubernetes.io/projected/dcd10b14-5a18-46c3-b51a-1623eab7a022-kube-api-access-5ghvb\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.884718 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dcd10b14-5a18-46c3-b51a-1623eab7a022-logs\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.885208 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6100fc2b-831e-4526-8ded-29fd902b3aa7-logs\") pod \"nova-metadata-0\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.986955 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcd10b14-5a18-46c3-b51a-1623eab7a022-logs\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.987061 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6100fc2b-831e-4526-8ded-29fd902b3aa7-logs\") pod \"nova-metadata-0\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.987137 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-config-data\") pod \"nova-metadata-0\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.987196 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-config-data\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.987220 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.987247 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.987471 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcd10b14-5a18-46c3-b51a-1623eab7a022-logs\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.988161 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6w9g\" (UniqueName: \"kubernetes.io/projected/6100fc2b-831e-4526-8ded-29fd902b3aa7-kube-api-access-v6w9g\") pod \"nova-metadata-0\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.988502 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ghvb\" (UniqueName: \"kubernetes.io/projected/dcd10b14-5a18-46c3-b51a-1623eab7a022-kube-api-access-5ghvb\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.989219 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6100fc2b-831e-4526-8ded-29fd902b3aa7-logs\") pod \"nova-metadata-0\" (UID: 
\"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.991124 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-config-data\") pod \"nova-metadata-0\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.992472 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.993280 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:24 crc kubenswrapper[5033]: I0226 17:17:24.993392 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-config-data\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.004881 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6w9g\" (UniqueName: \"kubernetes.io/projected/6100fc2b-831e-4526-8ded-29fd902b3aa7-kube-api-access-v6w9g\") pod \"nova-metadata-0\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") " pod="openstack/nova-metadata-0" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.005938 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ghvb\" 
(UniqueName: \"kubernetes.io/projected/dcd10b14-5a18-46c3-b51a-1623eab7a022-kube-api-access-5ghvb\") pod \"nova-api-0\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") " pod="openstack/nova-api-0" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.060745 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.078249 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.547369 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.649643 5033 generic.go:334] "Generic (PLEG): container finished" podID="37c09ad3-7436-4c25-bc1f-61b08f872396" containerID="18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6" exitCode=0 Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.649790 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.656648 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="789d457d-03fa-407f-a152-8ec8c898095f" path="/var/lib/kubelet/pods/789d457d-03fa-407f-a152-8ec8c898095f/volumes" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.657555 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d627150f-9aac-45c2-8dec-39a344fef2d2" path="/var/lib/kubelet/pods/d627150f-9aac-45c2-8dec-39a344fef2d2/volumes" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.658333 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.658370 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6100fc2b-831e-4526-8ded-29fd902b3aa7","Type":"ContainerStarted","Data":"0aafb0740270f2e9101d12e385fad32bc05d933e03ee6f86239b457a44b20f4a"} Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.658397 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"37c09ad3-7436-4c25-bc1f-61b08f872396","Type":"ContainerDied","Data":"18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6"} Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.658431 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"37c09ad3-7436-4c25-bc1f-61b08f872396","Type":"ContainerDied","Data":"976919a54d912a75c72345da6464419f064258755b01d20354faea0f9901c490"} Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.658455 5033 scope.go:117] "RemoveContainer" containerID="18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.700610 5033 scope.go:117] "RemoveContainer" containerID="18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6" Feb 26 17:17:25 crc 
kubenswrapper[5033]: E0226 17:17:25.700977 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6\": container with ID starting with 18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6 not found: ID does not exist" containerID="18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.701020 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6"} err="failed to get container status \"18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6\": rpc error: code = NotFound desc = could not find container \"18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6\": container with ID starting with 18a1d1c95caa52f3a55de44e0939bdb4a8b53f3dcf53eb9da637a8dfcedc95f6 not found: ID does not exist" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.798489 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-combined-ca-bundle\") pod \"37c09ad3-7436-4c25-bc1f-61b08f872396\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.798899 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-config-data\") pod \"37c09ad3-7436-4c25-bc1f-61b08f872396\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.798960 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45b9t\" (UniqueName: 
\"kubernetes.io/projected/37c09ad3-7436-4c25-bc1f-61b08f872396-kube-api-access-45b9t\") pod \"37c09ad3-7436-4c25-bc1f-61b08f872396\" (UID: \"37c09ad3-7436-4c25-bc1f-61b08f872396\") " Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.806023 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37c09ad3-7436-4c25-bc1f-61b08f872396-kube-api-access-45b9t" (OuterVolumeSpecName: "kube-api-access-45b9t") pod "37c09ad3-7436-4c25-bc1f-61b08f872396" (UID: "37c09ad3-7436-4c25-bc1f-61b08f872396"). InnerVolumeSpecName "kube-api-access-45b9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.834493 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37c09ad3-7436-4c25-bc1f-61b08f872396" (UID: "37c09ad3-7436-4c25-bc1f-61b08f872396"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.840119 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-config-data" (OuterVolumeSpecName: "config-data") pod "37c09ad3-7436-4c25-bc1f-61b08f872396" (UID: "37c09ad3-7436-4c25-bc1f-61b08f872396"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.906577 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.906623 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c09ad3-7436-4c25-bc1f-61b08f872396-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:25 crc kubenswrapper[5033]: I0226 17:17:25.906634 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45b9t\" (UniqueName: \"kubernetes.io/projected/37c09ad3-7436-4c25-bc1f-61b08f872396-kube-api-access-45b9t\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.073454 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.102589 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.113673 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.146209 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9f997b9c-rb87b"] Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.146519 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" podUID="858c2903-f2d4-487b-89c1-0d5f29774f01" containerName="dnsmasq-dns" containerID="cri-o://507f536d72f6cbddd2b0f79605b953393afc6bd3104284c25d7b6a13bb616ea4" gracePeriod=10 Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.660965 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6100fc2b-831e-4526-8ded-29fd902b3aa7","Type":"ContainerStarted","Data":"7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42"} Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.661279 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6100fc2b-831e-4526-8ded-29fd902b3aa7","Type":"ContainerStarted","Data":"92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0"} Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.663072 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcd10b14-5a18-46c3-b51a-1623eab7a022","Type":"ContainerStarted","Data":"a017ab1ec3f640b9209ba666f4769084e138e86bd8978eb6725f77cf1ec7fdb3"} Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.663098 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcd10b14-5a18-46c3-b51a-1623eab7a022","Type":"ContainerStarted","Data":"570e9791f714be8a3c0ff940bf7c7e1098e9c5a09a6597fdd8fd14ddad1cada4"} Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.663107 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcd10b14-5a18-46c3-b51a-1623eab7a022","Type":"ContainerStarted","Data":"332c7c5b5745281e323321fa775cc68d4a1974c31e828170608a99471aaa8e2d"} Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.664484 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.671627 5033 generic.go:334] "Generic (PLEG): container finished" podID="858c2903-f2d4-487b-89c1-0d5f29774f01" containerID="507f536d72f6cbddd2b0f79605b953393afc6bd3104284c25d7b6a13bb616ea4" exitCode=0 Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.671744 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" event={"ID":"858c2903-f2d4-487b-89c1-0d5f29774f01","Type":"ContainerDied","Data":"507f536d72f6cbddd2b0f79605b953393afc6bd3104284c25d7b6a13bb616ea4"} Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.671812 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" event={"ID":"858c2903-f2d4-487b-89c1-0d5f29774f01","Type":"ContainerDied","Data":"2f41d2eb080fb44f0a4c61fb3ec7c9cf69be57567582ed1901221bbebc92b1cf"} Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.671827 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f41d2eb080fb44f0a4c61fb3ec7c9cf69be57567582ed1901221bbebc92b1cf" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.679715 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.686274 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.700430 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.700413197 podStartE2EDuration="2.700413197s" podCreationTimestamp="2026-02-26 17:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:26.685601815 +0000 UTC m=+5773.898200466" watchObservedRunningTime="2026-02-26 17:17:26.700413197 +0000 UTC m=+5773.913011837" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.732949 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.758110 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.783063 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:17:26 crc kubenswrapper[5033]: E0226 17:17:26.783577 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858c2903-f2d4-487b-89c1-0d5f29774f01" containerName="dnsmasq-dns" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.783594 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="858c2903-f2d4-487b-89c1-0d5f29774f01" containerName="dnsmasq-dns" Feb 26 17:17:26 crc kubenswrapper[5033]: E0226 17:17:26.783614 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c09ad3-7436-4c25-bc1f-61b08f872396" containerName="nova-scheduler-scheduler" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.783622 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c09ad3-7436-4c25-bc1f-61b08f872396" containerName="nova-scheduler-scheduler" Feb 26 17:17:26 crc kubenswrapper[5033]: E0226 17:17:26.783651 5033 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="858c2903-f2d4-487b-89c1-0d5f29774f01" containerName="init" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.783660 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="858c2903-f2d4-487b-89c1-0d5f29774f01" containerName="init" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.783900 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c09ad3-7436-4c25-bc1f-61b08f872396" containerName="nova-scheduler-scheduler" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.783922 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="858c2903-f2d4-487b-89c1-0d5f29774f01" containerName="dnsmasq-dns" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.789798 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.788800163 podStartE2EDuration="2.788800163s" podCreationTimestamp="2026-02-26 17:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:26.762639896 +0000 UTC m=+5773.975238546" watchObservedRunningTime="2026-02-26 17:17:26.788800163 +0000 UTC m=+5774.001398813" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.791592 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.796639 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.811320 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.826312 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-nb\") pod \"858c2903-f2d4-487b-89c1-0d5f29774f01\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.826373 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-dns-svc\") pod \"858c2903-f2d4-487b-89c1-0d5f29774f01\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.826463 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7dns\" (UniqueName: \"kubernetes.io/projected/858c2903-f2d4-487b-89c1-0d5f29774f01-kube-api-access-r7dns\") pod \"858c2903-f2d4-487b-89c1-0d5f29774f01\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.826560 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-config\") pod \"858c2903-f2d4-487b-89c1-0d5f29774f01\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.826648 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-sb\") pod \"858c2903-f2d4-487b-89c1-0d5f29774f01\" (UID: \"858c2903-f2d4-487b-89c1-0d5f29774f01\") " Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.833044 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858c2903-f2d4-487b-89c1-0d5f29774f01-kube-api-access-r7dns" (OuterVolumeSpecName: "kube-api-access-r7dns") pod "858c2903-f2d4-487b-89c1-0d5f29774f01" (UID: "858c2903-f2d4-487b-89c1-0d5f29774f01"). InnerVolumeSpecName "kube-api-access-r7dns". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.883915 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "858c2903-f2d4-487b-89c1-0d5f29774f01" (UID: "858c2903-f2d4-487b-89c1-0d5f29774f01"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.889489 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-config" (OuterVolumeSpecName: "config") pod "858c2903-f2d4-487b-89c1-0d5f29774f01" (UID: "858c2903-f2d4-487b-89c1-0d5f29774f01"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.922215 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "858c2903-f2d4-487b-89c1-0d5f29774f01" (UID: "858c2903-f2d4-487b-89c1-0d5f29774f01"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.932601 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcfc4\" (UniqueName: \"kubernetes.io/projected/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-kube-api-access-jcfc4\") pod \"nova-scheduler-0\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.932846 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.932945 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-config-data\") pod \"nova-scheduler-0\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.933053 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7dns\" (UniqueName: \"kubernetes.io/projected/858c2903-f2d4-487b-89c1-0d5f29774f01-kube-api-access-r7dns\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.933101 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.933116 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-sb\") on node 
\"crc\" DevicePath \"\"" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.933130 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:26 crc kubenswrapper[5033]: I0226 17:17:26.938710 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "858c2903-f2d4-487b-89c1-0d5f29774f01" (UID: "858c2903-f2d4-487b-89c1-0d5f29774f01"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.035193 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.035288 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-config-data\") pod \"nova-scheduler-0\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.035360 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcfc4\" (UniqueName: \"kubernetes.io/projected/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-kube-api-access-jcfc4\") pod \"nova-scheduler-0\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.035521 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/858c2903-f2d4-487b-89c1-0d5f29774f01-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.038842 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-config-data\") pod \"nova-scheduler-0\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.038953 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.052251 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcfc4\" (UniqueName: \"kubernetes.io/projected/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-kube-api-access-jcfc4\") pod \"nova-scheduler-0\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") " pod="openstack/nova-scheduler-0" Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.117484 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.530952 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:17:27 crc kubenswrapper[5033]: W0226 17:17:27.531596 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1293ed8a_2c6b_423b_9f4b_36bdc849e4de.slice/crio-0a9f49bc52a81e4fa671e18ceddc66d5a06d27949f0e6e7b1ee5a610b186d232 WatchSource:0}: Error finding container 0a9f49bc52a81e4fa671e18ceddc66d5a06d27949f0e6e7b1ee5a610b186d232: Status 404 returned error can't find the container with id 0a9f49bc52a81e4fa671e18ceddc66d5a06d27949f0e6e7b1ee5a610b186d232 Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.640140 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37c09ad3-7436-4c25-bc1f-61b08f872396" path="/var/lib/kubelet/pods/37c09ad3-7436-4c25-bc1f-61b08f872396/volumes" Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.684574 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1293ed8a-2c6b-423b-9f4b-36bdc849e4de","Type":"ContainerStarted","Data":"0a9f49bc52a81e4fa671e18ceddc66d5a06d27949f0e6e7b1ee5a610b186d232"} Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.684721 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9f997b9c-rb87b" Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.712994 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9f997b9c-rb87b"] Feb 26 17:17:27 crc kubenswrapper[5033]: I0226 17:17:27.722137 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9f997b9c-rb87b"] Feb 26 17:17:28 crc kubenswrapper[5033]: I0226 17:17:28.697886 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1293ed8a-2c6b-423b-9f4b-36bdc849e4de","Type":"ContainerStarted","Data":"b41c6ee7a2b099cf5672186d614f6eacaa40cf2d57f0bb5b6e0307048a94c358"} Feb 26 17:17:28 crc kubenswrapper[5033]: I0226 17:17:28.724863 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.724842415 podStartE2EDuration="2.724842415s" podCreationTimestamp="2026-02-26 17:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:28.7188886 +0000 UTC m=+5775.931487270" watchObservedRunningTime="2026-02-26 17:17:28.724842415 +0000 UTC m=+5775.937441085" Feb 26 17:17:28 crc kubenswrapper[5033]: I0226 17:17:28.943353 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9l9q7"] Feb 26 17:17:28 crc kubenswrapper[5033]: I0226 17:17:28.945805 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:28 crc kubenswrapper[5033]: I0226 17:17:28.954726 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9l9q7"] Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.072454 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnr4k\" (UniqueName: \"kubernetes.io/projected/ca56b97b-d443-4ada-8e36-052eb3503e70-kube-api-access-wnr4k\") pod \"certified-operators-9l9q7\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") " pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.072700 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-utilities\") pod \"certified-operators-9l9q7\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") " pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.072831 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-catalog-content\") pod \"certified-operators-9l9q7\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") " pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.175163 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnr4k\" (UniqueName: \"kubernetes.io/projected/ca56b97b-d443-4ada-8e36-052eb3503e70-kube-api-access-wnr4k\") pod \"certified-operators-9l9q7\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") " pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.175592 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-utilities\") pod \"certified-operators-9l9q7\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") " pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.176070 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-utilities\") pod \"certified-operators-9l9q7\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") " pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.176446 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-catalog-content\") pod \"certified-operators-9l9q7\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") " pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.176480 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-catalog-content\") pod \"certified-operators-9l9q7\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") " pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.194586 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnr4k\" (UniqueName: \"kubernetes.io/projected/ca56b97b-d443-4ada-8e36-052eb3503e70-kube-api-access-wnr4k\") pod \"certified-operators-9l9q7\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") " pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.269940 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.641518 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858c2903-f2d4-487b-89c1-0d5f29774f01" path="/var/lib/kubelet/pods/858c2903-f2d4-487b-89c1-0d5f29774f01/volumes" Feb 26 17:17:29 crc kubenswrapper[5033]: I0226 17:17:29.783028 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9l9q7"] Feb 26 17:17:29 crc kubenswrapper[5033]: W0226 17:17:29.783286 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca56b97b_d443_4ada_8e36_052eb3503e70.slice/crio-157a91a42bf32f0468545f0d28dbb9f1cc07be8dcd1ebd85e2a4dd7cd0241c91 WatchSource:0}: Error finding container 157a91a42bf32f0468545f0d28dbb9f1cc07be8dcd1ebd85e2a4dd7cd0241c91: Status 404 returned error can't find the container with id 157a91a42bf32f0468545f0d28dbb9f1cc07be8dcd1ebd85e2a4dd7cd0241c91 Feb 26 17:17:30 crc kubenswrapper[5033]: I0226 17:17:30.060899 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 17:17:30 crc kubenswrapper[5033]: I0226 17:17:30.060951 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 17:17:30 crc kubenswrapper[5033]: I0226 17:17:30.715006 5033 generic.go:334] "Generic (PLEG): container finished" podID="ca56b97b-d443-4ada-8e36-052eb3503e70" containerID="8d5a6193ecea4cb333eca4e27d81f91828b8d16d737332322fcb510855082883" exitCode=0 Feb 26 17:17:30 crc kubenswrapper[5033]: I0226 17:17:30.715072 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9q7" event={"ID":"ca56b97b-d443-4ada-8e36-052eb3503e70","Type":"ContainerDied","Data":"8d5a6193ecea4cb333eca4e27d81f91828b8d16d737332322fcb510855082883"} Feb 26 17:17:30 crc kubenswrapper[5033]: I0226 
17:17:30.715111 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9q7" event={"ID":"ca56b97b-d443-4ada-8e36-052eb3503e70","Type":"ContainerStarted","Data":"157a91a42bf32f0468545f0d28dbb9f1cc07be8dcd1ebd85e2a4dd7cd0241c91"} Feb 26 17:17:31 crc kubenswrapper[5033]: I0226 17:17:31.629062 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:17:31 crc kubenswrapper[5033]: E0226 17:17:31.630207 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:17:32 crc kubenswrapper[5033]: I0226 17:17:32.119302 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 17:17:32 crc kubenswrapper[5033]: I0226 17:17:32.738801 5033 generic.go:334] "Generic (PLEG): container finished" podID="ca56b97b-d443-4ada-8e36-052eb3503e70" containerID="93a88f096e47f5d4cb7b5231e6d12eb577b9217e63c11701c9125ff92602fc66" exitCode=0 Feb 26 17:17:32 crc kubenswrapper[5033]: I0226 17:17:32.738862 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9q7" event={"ID":"ca56b97b-d443-4ada-8e36-052eb3503e70","Type":"ContainerDied","Data":"93a88f096e47f5d4cb7b5231e6d12eb577b9217e63c11701c9125ff92602fc66"} Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:32.997185 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.504176 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-g5czw"] Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.506451 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.510467 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.513804 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-g5czw"] Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.534811 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.577079 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.577169 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-config-data\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.577191 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjn6r\" (UniqueName: \"kubernetes.io/projected/0c3fafef-bdef-4e63-afa3-aa2460c1b926-kube-api-access-gjn6r\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 
crc kubenswrapper[5033]: I0226 17:17:33.577238 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-scripts\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.678692 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-config-data\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.678735 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjn6r\" (UniqueName: \"kubernetes.io/projected/0c3fafef-bdef-4e63-afa3-aa2460c1b926-kube-api-access-gjn6r\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.678811 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-scripts\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.678876 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.685953 
5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.686763 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-config-data\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.690766 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-scripts\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.705014 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjn6r\" (UniqueName: \"kubernetes.io/projected/0c3fafef-bdef-4e63-afa3-aa2460c1b926-kube-api-access-gjn6r\") pod \"nova-cell1-cell-mapping-g5czw\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.752134 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9q7" event={"ID":"ca56b97b-d443-4ada-8e36-052eb3503e70","Type":"ContainerStarted","Data":"236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736"} Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.772837 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9l9q7" 
podStartSLOduration=3.39089761 podStartE2EDuration="5.772816211s" podCreationTimestamp="2026-02-26 17:17:28 +0000 UTC" firstStartedPulling="2026-02-26 17:17:30.7171464 +0000 UTC m=+5777.929745050" lastFinishedPulling="2026-02-26 17:17:33.099065001 +0000 UTC m=+5780.311663651" observedRunningTime="2026-02-26 17:17:33.768277211 +0000 UTC m=+5780.980875881" watchObservedRunningTime="2026-02-26 17:17:33.772816211 +0000 UTC m=+5780.985414861" Feb 26 17:17:33 crc kubenswrapper[5033]: I0226 17:17:33.836088 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:34 crc kubenswrapper[5033]: I0226 17:17:34.194873 5033 scope.go:117] "RemoveContainer" containerID="31b83a0f2fc391d0bcb0acf8cf9cba8009549a3c5146061368132d0b65153876" Feb 26 17:17:34 crc kubenswrapper[5033]: I0226 17:17:34.390083 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-g5czw"] Feb 26 17:17:34 crc kubenswrapper[5033]: W0226 17:17:34.394485 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c3fafef_bdef_4e63_afa3_aa2460c1b926.slice/crio-5c74d08646055b2ea44b6f5748ed8c70b36149d20c78d7d2cdf7d44b9228872d WatchSource:0}: Error finding container 5c74d08646055b2ea44b6f5748ed8c70b36149d20c78d7d2cdf7d44b9228872d: Status 404 returned error can't find the container with id 5c74d08646055b2ea44b6f5748ed8c70b36149d20c78d7d2cdf7d44b9228872d Feb 26 17:17:34 crc kubenswrapper[5033]: I0226 17:17:34.765230 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g5czw" event={"ID":"0c3fafef-bdef-4e63-afa3-aa2460c1b926","Type":"ContainerStarted","Data":"19f81430c6a3452b15b6f486192cc79ec3e9914727a760f24da53ca3e93240eb"} Feb 26 17:17:34 crc kubenswrapper[5033]: I0226 17:17:34.765622 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g5czw" 
event={"ID":"0c3fafef-bdef-4e63-afa3-aa2460c1b926","Type":"ContainerStarted","Data":"5c74d08646055b2ea44b6f5748ed8c70b36149d20c78d7d2cdf7d44b9228872d"} Feb 26 17:17:34 crc kubenswrapper[5033]: I0226 17:17:34.790219 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-g5czw" podStartSLOduration=1.790193147 podStartE2EDuration="1.790193147s" podCreationTimestamp="2026-02-26 17:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:34.781402883 +0000 UTC m=+5781.994001533" watchObservedRunningTime="2026-02-26 17:17:34.790193147 +0000 UTC m=+5782.002791797" Feb 26 17:17:35 crc kubenswrapper[5033]: I0226 17:17:35.061623 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 17:17:35 crc kubenswrapper[5033]: I0226 17:17:35.061734 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 17:17:35 crc kubenswrapper[5033]: I0226 17:17:35.078835 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 17:17:35 crc kubenswrapper[5033]: I0226 17:17:35.078905 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 17:17:36 crc kubenswrapper[5033]: I0226 17:17:36.225442 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.102:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 17:17:36 crc kubenswrapper[5033]: I0226 17:17:36.225560 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerName="nova-metadata-log" 
probeResult="failure" output="Get \"http://10.217.1.101:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 17:17:36 crc kubenswrapper[5033]: I0226 17:17:36.225882 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.101:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 17:17:36 crc kubenswrapper[5033]: I0226 17:17:36.226052 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.102:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 17:17:37 crc kubenswrapper[5033]: I0226 17:17:37.119035 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 17:17:37 crc kubenswrapper[5033]: I0226 17:17:37.157834 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 17:17:37 crc kubenswrapper[5033]: I0226 17:17:37.814873 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 17:17:39 crc kubenswrapper[5033]: I0226 17:17:39.270131 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:39 crc kubenswrapper[5033]: I0226 17:17:39.270395 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:39 crc kubenswrapper[5033]: I0226 17:17:39.337604 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:39 crc 
kubenswrapper[5033]: I0226 17:17:39.811181 5033 generic.go:334] "Generic (PLEG): container finished" podID="0c3fafef-bdef-4e63-afa3-aa2460c1b926" containerID="19f81430c6a3452b15b6f486192cc79ec3e9914727a760f24da53ca3e93240eb" exitCode=0 Feb 26 17:17:39 crc kubenswrapper[5033]: I0226 17:17:39.812380 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g5czw" event={"ID":"0c3fafef-bdef-4e63-afa3-aa2460c1b926","Type":"ContainerDied","Data":"19f81430c6a3452b15b6f486192cc79ec3e9914727a760f24da53ca3e93240eb"} Feb 26 17:17:39 crc kubenswrapper[5033]: I0226 17:17:39.868196 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9l9q7" Feb 26 17:17:39 crc kubenswrapper[5033]: I0226 17:17:39.918917 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9l9q7"] Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.240717 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.352053 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-combined-ca-bundle\") pod \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.352103 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-scripts\") pod \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.352178 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-config-data\") pod \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.352348 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjn6r\" (UniqueName: \"kubernetes.io/projected/0c3fafef-bdef-4e63-afa3-aa2460c1b926-kube-api-access-gjn6r\") pod \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\" (UID: \"0c3fafef-bdef-4e63-afa3-aa2460c1b926\") " Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.357227 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-scripts" (OuterVolumeSpecName: "scripts") pod "0c3fafef-bdef-4e63-afa3-aa2460c1b926" (UID: "0c3fafef-bdef-4e63-afa3-aa2460c1b926"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.357426 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c3fafef-bdef-4e63-afa3-aa2460c1b926-kube-api-access-gjn6r" (OuterVolumeSpecName: "kube-api-access-gjn6r") pod "0c3fafef-bdef-4e63-afa3-aa2460c1b926" (UID: "0c3fafef-bdef-4e63-afa3-aa2460c1b926"). InnerVolumeSpecName "kube-api-access-gjn6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.375838 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c3fafef-bdef-4e63-afa3-aa2460c1b926" (UID: "0c3fafef-bdef-4e63-afa3-aa2460c1b926"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.383495 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-config-data" (OuterVolumeSpecName: "config-data") pod "0c3fafef-bdef-4e63-afa3-aa2460c1b926" (UID: "0c3fafef-bdef-4e63-afa3-aa2460c1b926"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.454773 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.455074 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.455183 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjn6r\" (UniqueName: \"kubernetes.io/projected/0c3fafef-bdef-4e63-afa3-aa2460c1b926-kube-api-access-gjn6r\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.455318 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3fafef-bdef-4e63-afa3-aa2460c1b926-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.830114 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-g5czw" event={"ID":"0c3fafef-bdef-4e63-afa3-aa2460c1b926","Type":"ContainerDied","Data":"5c74d08646055b2ea44b6f5748ed8c70b36149d20c78d7d2cdf7d44b9228872d"} Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.831464 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c74d08646055b2ea44b6f5748ed8c70b36149d20c78d7d2cdf7d44b9228872d" Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.830247 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9l9q7" podUID="ca56b97b-d443-4ada-8e36-052eb3503e70" containerName="registry-server" 
containerID="cri-o://236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736" gracePeriod=2 Feb 26 17:17:41 crc kubenswrapper[5033]: I0226 17:17:41.830138 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-g5czw" Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.011494 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.011757 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerName="nova-api-log" containerID="cri-o://570e9791f714be8a3c0ff940bf7c7e1098e9c5a09a6597fdd8fd14ddad1cada4" gracePeriod=30 Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.011854 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerName="nova-api-api" containerID="cri-o://a017ab1ec3f640b9209ba666f4769084e138e86bd8978eb6725f77cf1ec7fdb3" gracePeriod=30 Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.024888 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.025107 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1293ed8a-2c6b-423b-9f4b-36bdc849e4de" containerName="nova-scheduler-scheduler" containerID="cri-o://b41c6ee7a2b099cf5672186d614f6eacaa40cf2d57f0bb5b6e0307048a94c358" gracePeriod=30 Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.045110 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.045327 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerName="nova-metadata-log" containerID="cri-o://92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0" gracePeriod=30
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.045643 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerName="nova-metadata-metadata" containerID="cri-o://7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42" gracePeriod=30
Feb 26 17:17:42 crc kubenswrapper[5033]: E0226 17:17:42.121399 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b41c6ee7a2b099cf5672186d614f6eacaa40cf2d57f0bb5b6e0307048a94c358" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 17:17:42 crc kubenswrapper[5033]: E0226 17:17:42.123492 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b41c6ee7a2b099cf5672186d614f6eacaa40cf2d57f0bb5b6e0307048a94c358" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 17:17:42 crc kubenswrapper[5033]: E0226 17:17:42.124839 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b41c6ee7a2b099cf5672186d614f6eacaa40cf2d57f0bb5b6e0307048a94c358" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 17:17:42 crc kubenswrapper[5033]: E0226 17:17:42.124998 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1293ed8a-2c6b-423b-9f4b-36bdc849e4de" containerName="nova-scheduler-scheduler"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.392320 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9l9q7"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.581874 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-catalog-content\") pod \"ca56b97b-d443-4ada-8e36-052eb3503e70\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") "
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.581925 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-utilities\") pod \"ca56b97b-d443-4ada-8e36-052eb3503e70\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") "
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.581993 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnr4k\" (UniqueName: \"kubernetes.io/projected/ca56b97b-d443-4ada-8e36-052eb3503e70-kube-api-access-wnr4k\") pod \"ca56b97b-d443-4ada-8e36-052eb3503e70\" (UID: \"ca56b97b-d443-4ada-8e36-052eb3503e70\") "
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.583510 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-utilities" (OuterVolumeSpecName: "utilities") pod "ca56b97b-d443-4ada-8e36-052eb3503e70" (UID: "ca56b97b-d443-4ada-8e36-052eb3503e70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.587470 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca56b97b-d443-4ada-8e36-052eb3503e70-kube-api-access-wnr4k" (OuterVolumeSpecName: "kube-api-access-wnr4k") pod "ca56b97b-d443-4ada-8e36-052eb3503e70" (UID: "ca56b97b-d443-4ada-8e36-052eb3503e70"). InnerVolumeSpecName "kube-api-access-wnr4k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.684120 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.684151 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnr4k\" (UniqueName: \"kubernetes.io/projected/ca56b97b-d443-4ada-8e36-052eb3503e70-kube-api-access-wnr4k\") on node \"crc\" DevicePath \"\""
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.842189 5033 generic.go:334] "Generic (PLEG): container finished" podID="ca56b97b-d443-4ada-8e36-052eb3503e70" containerID="236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736" exitCode=0
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.842269 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9l9q7"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.842275 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9q7" event={"ID":"ca56b97b-d443-4ada-8e36-052eb3503e70","Type":"ContainerDied","Data":"236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736"}
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.842323 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9q7" event={"ID":"ca56b97b-d443-4ada-8e36-052eb3503e70","Type":"ContainerDied","Data":"157a91a42bf32f0468545f0d28dbb9f1cc07be8dcd1ebd85e2a4dd7cd0241c91"}
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.842367 5033 scope.go:117] "RemoveContainer" containerID="236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.845531 5033 generic.go:334] "Generic (PLEG): container finished" podID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerID="92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0" exitCode=143
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.845594 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6100fc2b-831e-4526-8ded-29fd902b3aa7","Type":"ContainerDied","Data":"92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0"}
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.848668 5033 generic.go:334] "Generic (PLEG): container finished" podID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerID="570e9791f714be8a3c0ff940bf7c7e1098e9c5a09a6597fdd8fd14ddad1cada4" exitCode=143
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.848692 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcd10b14-5a18-46c3-b51a-1623eab7a022","Type":"ContainerDied","Data":"570e9791f714be8a3c0ff940bf7c7e1098e9c5a09a6597fdd8fd14ddad1cada4"}
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.863969 5033 scope.go:117] "RemoveContainer" containerID="93a88f096e47f5d4cb7b5231e6d12eb577b9217e63c11701c9125ff92602fc66"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.888213 5033 scope.go:117] "RemoveContainer" containerID="8d5a6193ecea4cb333eca4e27d81f91828b8d16d737332322fcb510855082883"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.928226 5033 scope.go:117] "RemoveContainer" containerID="236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736"
Feb 26 17:17:42 crc kubenswrapper[5033]: E0226 17:17:42.928769 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736\": container with ID starting with 236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736 not found: ID does not exist" containerID="236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.928797 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736"} err="failed to get container status \"236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736\": rpc error: code = NotFound desc = could not find container \"236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736\": container with ID starting with 236920e9ede4fdd3fe81a3818eaa3030412870cc8b92339aea60d5ee29b92736 not found: ID does not exist"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.928816 5033 scope.go:117] "RemoveContainer" containerID="93a88f096e47f5d4cb7b5231e6d12eb577b9217e63c11701c9125ff92602fc66"
Feb 26 17:17:42 crc kubenswrapper[5033]: E0226 17:17:42.929047 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93a88f096e47f5d4cb7b5231e6d12eb577b9217e63c11701c9125ff92602fc66\": container with ID starting with 93a88f096e47f5d4cb7b5231e6d12eb577b9217e63c11701c9125ff92602fc66 not found: ID does not exist" containerID="93a88f096e47f5d4cb7b5231e6d12eb577b9217e63c11701c9125ff92602fc66"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.929064 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a88f096e47f5d4cb7b5231e6d12eb577b9217e63c11701c9125ff92602fc66"} err="failed to get container status \"93a88f096e47f5d4cb7b5231e6d12eb577b9217e63c11701c9125ff92602fc66\": rpc error: code = NotFound desc = could not find container \"93a88f096e47f5d4cb7b5231e6d12eb577b9217e63c11701c9125ff92602fc66\": container with ID starting with 93a88f096e47f5d4cb7b5231e6d12eb577b9217e63c11701c9125ff92602fc66 not found: ID does not exist"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.929076 5033 scope.go:117] "RemoveContainer" containerID="8d5a6193ecea4cb333eca4e27d81f91828b8d16d737332322fcb510855082883"
Feb 26 17:17:42 crc kubenswrapper[5033]: E0226 17:17:42.929323 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d5a6193ecea4cb333eca4e27d81f91828b8d16d737332322fcb510855082883\": container with ID starting with 8d5a6193ecea4cb333eca4e27d81f91828b8d16d737332322fcb510855082883 not found: ID does not exist" containerID="8d5a6193ecea4cb333eca4e27d81f91828b8d16d737332322fcb510855082883"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.929347 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d5a6193ecea4cb333eca4e27d81f91828b8d16d737332322fcb510855082883"} err="failed to get container status \"8d5a6193ecea4cb333eca4e27d81f91828b8d16d737332322fcb510855082883\": rpc error: code = NotFound desc = could not find container \"8d5a6193ecea4cb333eca4e27d81f91828b8d16d737332322fcb510855082883\": container with ID starting with 8d5a6193ecea4cb333eca4e27d81f91828b8d16d737332322fcb510855082883 not found: ID does not exist"
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.963824 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca56b97b-d443-4ada-8e36-052eb3503e70" (UID: "ca56b97b-d443-4ada-8e36-052eb3503e70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:17:42 crc kubenswrapper[5033]: I0226 17:17:42.991204 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca56b97b-d443-4ada-8e36-052eb3503e70-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 17:17:43 crc kubenswrapper[5033]: I0226 17:17:43.205593 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9l9q7"]
Feb 26 17:17:43 crc kubenswrapper[5033]: I0226 17:17:43.213713 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9l9q7"]
Feb 26 17:17:43 crc kubenswrapper[5033]: E0226 17:17:43.336478 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca56b97b_d443_4ada_8e36_052eb3503e70.slice\": RecentStats: unable to find data in memory cache]"
Feb 26 17:17:43 crc kubenswrapper[5033]: I0226 17:17:43.647284 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca56b97b-d443-4ada-8e36-052eb3503e70" path="/var/lib/kubelet/pods/ca56b97b-d443-4ada-8e36-052eb3503e70/volumes"
Feb 26 17:17:43 crc kubenswrapper[5033]: I0226 17:17:43.860487 5033 generic.go:334] "Generic (PLEG): container finished" podID="1293ed8a-2c6b-423b-9f4b-36bdc849e4de" containerID="b41c6ee7a2b099cf5672186d614f6eacaa40cf2d57f0bb5b6e0307048a94c358" exitCode=0
Feb 26 17:17:43 crc kubenswrapper[5033]: I0226 17:17:43.860535 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1293ed8a-2c6b-423b-9f4b-36bdc849e4de","Type":"ContainerDied","Data":"b41c6ee7a2b099cf5672186d614f6eacaa40cf2d57f0bb5b6e0307048a94c358"}
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.226299 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.313680 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-combined-ca-bundle\") pod \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") "
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.313836 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-config-data\") pod \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") "
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.314004 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcfc4\" (UniqueName: \"kubernetes.io/projected/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-kube-api-access-jcfc4\") pod \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\" (UID: \"1293ed8a-2c6b-423b-9f4b-36bdc849e4de\") "
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.319245 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-kube-api-access-jcfc4" (OuterVolumeSpecName: "kube-api-access-jcfc4") pod "1293ed8a-2c6b-423b-9f4b-36bdc849e4de" (UID: "1293ed8a-2c6b-423b-9f4b-36bdc849e4de"). InnerVolumeSpecName "kube-api-access-jcfc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.348003 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1293ed8a-2c6b-423b-9f4b-36bdc849e4de" (UID: "1293ed8a-2c6b-423b-9f4b-36bdc849e4de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.365635 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-config-data" (OuterVolumeSpecName: "config-data") pod "1293ed8a-2c6b-423b-9f4b-36bdc849e4de" (UID: "1293ed8a-2c6b-423b-9f4b-36bdc849e4de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.416043 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.416090 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcfc4\" (UniqueName: \"kubernetes.io/projected/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-kube-api-access-jcfc4\") on node \"crc\" DevicePath \"\""
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.416100 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1293ed8a-2c6b-423b-9f4b-36bdc849e4de-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.627898 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"
Feb 26 17:17:44 crc kubenswrapper[5033]: E0226 17:17:44.628186 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.869157 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1293ed8a-2c6b-423b-9f4b-36bdc849e4de","Type":"ContainerDied","Data":"0a9f49bc52a81e4fa671e18ceddc66d5a06d27949f0e6e7b1ee5a610b186d232"}
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.869220 5033 scope.go:117] "RemoveContainer" containerID="b41c6ee7a2b099cf5672186d614f6eacaa40cf2d57f0bb5b6e0307048a94c358"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.869249 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.920549 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.928248 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.943239 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 17:17:44 crc kubenswrapper[5033]: E0226 17:17:44.943584 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3fafef-bdef-4e63-afa3-aa2460c1b926" containerName="nova-manage"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.943600 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3fafef-bdef-4e63-afa3-aa2460c1b926" containerName="nova-manage"
Feb 26 17:17:44 crc kubenswrapper[5033]: E0226 17:17:44.943614 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca56b97b-d443-4ada-8e36-052eb3503e70" containerName="registry-server"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.943620 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca56b97b-d443-4ada-8e36-052eb3503e70" containerName="registry-server"
Feb 26 17:17:44 crc kubenswrapper[5033]: E0226 17:17:44.943633 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca56b97b-d443-4ada-8e36-052eb3503e70" containerName="extract-utilities"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.943639 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca56b97b-d443-4ada-8e36-052eb3503e70" containerName="extract-utilities"
Feb 26 17:17:44 crc kubenswrapper[5033]: E0226 17:17:44.943651 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca56b97b-d443-4ada-8e36-052eb3503e70" containerName="extract-content"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.943659 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca56b97b-d443-4ada-8e36-052eb3503e70" containerName="extract-content"
Feb 26 17:17:44 crc kubenswrapper[5033]: E0226 17:17:44.943676 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1293ed8a-2c6b-423b-9f4b-36bdc849e4de" containerName="nova-scheduler-scheduler"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.943682 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1293ed8a-2c6b-423b-9f4b-36bdc849e4de" containerName="nova-scheduler-scheduler"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.943834 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca56b97b-d443-4ada-8e36-052eb3503e70" containerName="registry-server"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.943849 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3fafef-bdef-4e63-afa3-aa2460c1b926" containerName="nova-manage"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.943862 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1293ed8a-2c6b-423b-9f4b-36bdc849e4de" containerName="nova-scheduler-scheduler"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.944454 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.947740 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 26 17:17:44 crc kubenswrapper[5033]: I0226 17:17:44.970063 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.128590 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65bxr\" (UniqueName: \"kubernetes.io/projected/96031ac0-f3a0-4e69-9cc3-12f595f19526-kube-api-access-65bxr\") pod \"nova-scheduler-0\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") " pod="openstack/nova-scheduler-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.128718 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-config-data\") pod \"nova-scheduler-0\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") " pod="openstack/nova-scheduler-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.128761 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") " pod="openstack/nova-scheduler-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.231061 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65bxr\" (UniqueName: \"kubernetes.io/projected/96031ac0-f3a0-4e69-9cc3-12f595f19526-kube-api-access-65bxr\") pod \"nova-scheduler-0\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") " pod="openstack/nova-scheduler-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.231169 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-config-data\") pod \"nova-scheduler-0\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") " pod="openstack/nova-scheduler-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.231217 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") " pod="openstack/nova-scheduler-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.238203 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-config-data\") pod \"nova-scheduler-0\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") " pod="openstack/nova-scheduler-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.238391 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") " pod="openstack/nova-scheduler-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.250234 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65bxr\" (UniqueName: \"kubernetes.io/projected/96031ac0-f3a0-4e69-9cc3-12f595f19526-kube-api-access-65bxr\") pod \"nova-scheduler-0\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") " pod="openstack/nova-scheduler-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.272628 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.637696 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1293ed8a-2c6b-423b-9f4b-36bdc849e4de" path="/var/lib/kubelet/pods/1293ed8a-2c6b-423b-9f4b-36bdc849e4de/volumes"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.685346 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.846195 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-combined-ca-bundle\") pod \"6100fc2b-831e-4526-8ded-29fd902b3aa7\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") "
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.846608 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6w9g\" (UniqueName: \"kubernetes.io/projected/6100fc2b-831e-4526-8ded-29fd902b3aa7-kube-api-access-v6w9g\") pod \"6100fc2b-831e-4526-8ded-29fd902b3aa7\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") "
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.846704 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-config-data\") pod \"6100fc2b-831e-4526-8ded-29fd902b3aa7\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") "
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.846723 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6100fc2b-831e-4526-8ded-29fd902b3aa7-logs\") pod \"6100fc2b-831e-4526-8ded-29fd902b3aa7\" (UID: \"6100fc2b-831e-4526-8ded-29fd902b3aa7\") "
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.847406 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6100fc2b-831e-4526-8ded-29fd902b3aa7-logs" (OuterVolumeSpecName: "logs") pod "6100fc2b-831e-4526-8ded-29fd902b3aa7" (UID: "6100fc2b-831e-4526-8ded-29fd902b3aa7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.852108 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6100fc2b-831e-4526-8ded-29fd902b3aa7-kube-api-access-v6w9g" (OuterVolumeSpecName: "kube-api-access-v6w9g") pod "6100fc2b-831e-4526-8ded-29fd902b3aa7" (UID: "6100fc2b-831e-4526-8ded-29fd902b3aa7"). InnerVolumeSpecName "kube-api-access-v6w9g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:17:45 crc kubenswrapper[5033]: W0226 17:17:45.872591 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96031ac0_f3a0_4e69_9cc3_12f595f19526.slice/crio-09fe42a7bcae5b3437b4200972fa81a105854340ff5b0e866b65511362bc2fd5 WatchSource:0}: Error finding container 09fe42a7bcae5b3437b4200972fa81a105854340ff5b0e866b65511362bc2fd5: Status 404 returned error can't find the container with id 09fe42a7bcae5b3437b4200972fa81a105854340ff5b0e866b65511362bc2fd5
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.873005 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6100fc2b-831e-4526-8ded-29fd902b3aa7" (UID: "6100fc2b-831e-4526-8ded-29fd902b3aa7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.873199 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-config-data" (OuterVolumeSpecName: "config-data") pod "6100fc2b-831e-4526-8ded-29fd902b3aa7" (UID: "6100fc2b-831e-4526-8ded-29fd902b3aa7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.880089 5033 generic.go:334] "Generic (PLEG): container finished" podID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerID="7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42" exitCode=0
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.880144 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.880145 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6100fc2b-831e-4526-8ded-29fd902b3aa7","Type":"ContainerDied","Data":"7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42"}
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.880265 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6100fc2b-831e-4526-8ded-29fd902b3aa7","Type":"ContainerDied","Data":"0aafb0740270f2e9101d12e385fad32bc05d933e03ee6f86239b457a44b20f4a"}
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.880285 5033 scope.go:117] "RemoveContainer" containerID="7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.882804 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.888430 5033 generic.go:334] "Generic (PLEG): container finished" podID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerID="a017ab1ec3f640b9209ba666f4769084e138e86bd8978eb6725f77cf1ec7fdb3" exitCode=0
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.888526 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcd10b14-5a18-46c3-b51a-1623eab7a022","Type":"ContainerDied","Data":"a017ab1ec3f640b9209ba666f4769084e138e86bd8978eb6725f77cf1ec7fdb3"}
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.892613 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.900867 5033 scope.go:117] "RemoveContainer" containerID="92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.939070 5033 scope.go:117] "RemoveContainer" containerID="7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42"
Feb 26 17:17:45 crc kubenswrapper[5033]: E0226 17:17:45.939698 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42\": container with ID starting with 7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42 not found: ID does not exist" containerID="7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.939754 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42"} err="failed to get container status \"7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42\": rpc error: code = NotFound desc = could not find container \"7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42\": container with ID starting with 7cc5a8a2d73e6cf0a77fcdbe6ad9dbef0387cd3c53232546e9b705f266f81b42 not found: ID does not exist"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.939822 5033 scope.go:117] "RemoveContainer" containerID="92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0"
Feb 26 17:17:45 crc kubenswrapper[5033]: E0226 17:17:45.942391 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0\": container with ID starting with 92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0 not found: ID does not exist" containerID="92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.942425 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0"} err="failed to get container status \"92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0\": rpc error: code = NotFound desc = could not find container \"92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0\": container with ID starting with 92a068eede5ceb98363a25126975355c3da01c2db081a51d04a048ec0255eac0 not found: ID does not exist"
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.948516 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.948551 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6w9g\" (UniqueName: \"kubernetes.io/projected/6100fc2b-831e-4526-8ded-29fd902b3aa7-kube-api-access-v6w9g\") on node \"crc\" DevicePath \"\""
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.948562 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6100fc2b-831e-4526-8ded-29fd902b3aa7-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.948571 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6100fc2b-831e-4526-8ded-29fd902b3aa7-logs\") on node \"crc\" DevicePath \"\""
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.950848 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.982759 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 17:17:45 crc kubenswrapper[5033]: I0226 17:17:45.999471 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 17:17:46 crc kubenswrapper[5033]: E0226 17:17:46.000071 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerName="nova-metadata-log"
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.000143 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerName="nova-metadata-log"
Feb 26 17:17:46 crc kubenswrapper[5033]: E0226 17:17:46.000159 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerName="nova-api-log"
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.000165 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerName="nova-api-log"
Feb 26 17:17:46 crc kubenswrapper[5033]: E0226 17:17:46.000201 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerName="nova-api-api"
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.000233 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerName="nova-api-api"
Feb 26 17:17:46 crc kubenswrapper[5033]: E0226 17:17:46.000250 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerName="nova-metadata-metadata"
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.000265 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerName="nova-metadata-metadata"
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.000603 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerName="nova-metadata-log"
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.000619 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerName="nova-api-api"
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.000633 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd10b14-5a18-46c3-b51a-1623eab7a022" containerName="nova-api-log"
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.000645 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6100fc2b-831e-4526-8ded-29fd902b3aa7" containerName="nova-metadata-metadata"
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.001960 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.003913 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.007462 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.049711 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-combined-ca-bundle\") pod \"dcd10b14-5a18-46c3-b51a-1623eab7a022\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") "
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.049964 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ghvb\" (UniqueName: \"kubernetes.io/projected/dcd10b14-5a18-46c3-b51a-1623eab7a022-kube-api-access-5ghvb\") pod \"dcd10b14-5a18-46c3-b51a-1623eab7a022\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") "
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.050068 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcd10b14-5a18-46c3-b51a-1623eab7a022-logs\") pod \"dcd10b14-5a18-46c3-b51a-1623eab7a022\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") "
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.050363 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-config-data\") pod \"dcd10b14-5a18-46c3-b51a-1623eab7a022\" (UID: \"dcd10b14-5a18-46c3-b51a-1623eab7a022\") "
Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.051510 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcd10b14-5a18-46c3-b51a-1623eab7a022-logs"
(OuterVolumeSpecName: "logs") pod "dcd10b14-5a18-46c3-b51a-1623eab7a022" (UID: "dcd10b14-5a18-46c3-b51a-1623eab7a022"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.055574 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd10b14-5a18-46c3-b51a-1623eab7a022-kube-api-access-5ghvb" (OuterVolumeSpecName: "kube-api-access-5ghvb") pod "dcd10b14-5a18-46c3-b51a-1623eab7a022" (UID: "dcd10b14-5a18-46c3-b51a-1623eab7a022"). InnerVolumeSpecName "kube-api-access-5ghvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.072068 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-config-data" (OuterVolumeSpecName: "config-data") pod "dcd10b14-5a18-46c3-b51a-1623eab7a022" (UID: "dcd10b14-5a18-46c3-b51a-1623eab7a022"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.075543 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcd10b14-5a18-46c3-b51a-1623eab7a022" (UID: "dcd10b14-5a18-46c3-b51a-1623eab7a022"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.151895 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x95dp\" (UniqueName: \"kubernetes.io/projected/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-kube-api-access-x95dp\") pod \"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.151964 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-logs\") pod \"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.152081 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-config-data\") pod \"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.152130 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.152199 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.152211 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ghvb\" 
(UniqueName: \"kubernetes.io/projected/dcd10b14-5a18-46c3-b51a-1623eab7a022-kube-api-access-5ghvb\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.152222 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcd10b14-5a18-46c3-b51a-1623eab7a022-logs\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.152231 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd10b14-5a18-46c3-b51a-1623eab7a022-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.253146 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-config-data\") pod \"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.253215 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.253282 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x95dp\" (UniqueName: \"kubernetes.io/projected/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-kube-api-access-x95dp\") pod \"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.253310 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-logs\") pod 
\"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.253701 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-logs\") pod \"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.257233 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.257737 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-config-data\") pod \"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.269471 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x95dp\" (UniqueName: \"kubernetes.io/projected/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-kube-api-access-x95dp\") pod \"nova-metadata-0\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") " pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.338990 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.817528 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.905636 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99a5fda7-2e18-405e-82d4-2fdb06ad5c26","Type":"ContainerStarted","Data":"518a864430f5de87ce137ca6931a4accece048e594ca04281b119a4f15ba4b92"} Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.908535 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcd10b14-5a18-46c3-b51a-1623eab7a022","Type":"ContainerDied","Data":"332c7c5b5745281e323321fa775cc68d4a1974c31e828170608a99471aaa8e2d"} Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.908569 5033 scope.go:117] "RemoveContainer" containerID="a017ab1ec3f640b9209ba666f4769084e138e86bd8978eb6725f77cf1ec7fdb3" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.908754 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.916644 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"96031ac0-f3a0-4e69-9cc3-12f595f19526","Type":"ContainerStarted","Data":"eeb44306b2c71cac2267b32b7323164273db5287568eb909a17fb339f26ec8cc"} Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.916692 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"96031ac0-f3a0-4e69-9cc3-12f595f19526","Type":"ContainerStarted","Data":"09fe42a7bcae5b3437b4200972fa81a105854340ff5b0e866b65511362bc2fd5"} Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.944840 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.94481607 podStartE2EDuration="2.94481607s" podCreationTimestamp="2026-02-26 17:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:46.936591079 +0000 UTC m=+5794.149189739" watchObservedRunningTime="2026-02-26 17:17:46.94481607 +0000 UTC m=+5794.157414720" Feb 26 17:17:46 crc kubenswrapper[5033]: I0226 17:17:46.955975 5033 scope.go:117] "RemoveContainer" containerID="570e9791f714be8a3c0ff940bf7c7e1098e9c5a09a6597fdd8fd14ddad1cada4" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.025914 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.041194 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.056021 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.057484 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.060907 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.063868 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.165717 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.165954 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghcss\" (UniqueName: \"kubernetes.io/projected/91e93f69-918c-477c-b6f1-a099de0e8081-kube-api-access-ghcss\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.166103 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-config-data\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.166272 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91e93f69-918c-477c-b6f1-a099de0e8081-logs\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.268494 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-config-data\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.268586 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91e93f69-918c-477c-b6f1-a099de0e8081-logs\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.268667 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.268834 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghcss\" (UniqueName: \"kubernetes.io/projected/91e93f69-918c-477c-b6f1-a099de0e8081-kube-api-access-ghcss\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.269093 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91e93f69-918c-477c-b6f1-a099de0e8081-logs\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.272589 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-config-data\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.275175 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.286181 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghcss\" (UniqueName: \"kubernetes.io/projected/91e93f69-918c-477c-b6f1-a099de0e8081-kube-api-access-ghcss\") pod \"nova-api-0\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") " pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.388995 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.649738 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6100fc2b-831e-4526-8ded-29fd902b3aa7" path="/var/lib/kubelet/pods/6100fc2b-831e-4526-8ded-29fd902b3aa7/volumes" Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.650720 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd10b14-5a18-46c3-b51a-1623eab7a022" path="/var/lib/kubelet/pods/dcd10b14-5a18-46c3-b51a-1623eab7a022/volumes" Feb 26 17:17:47 crc kubenswrapper[5033]: W0226 17:17:47.840646 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91e93f69_918c_477c_b6f1_a099de0e8081.slice/crio-29d850f90e9ecb5f6c7190a264c14b80063774cca15b57303e5df7ae4dc7c182 WatchSource:0}: Error finding container 29d850f90e9ecb5f6c7190a264c14b80063774cca15b57303e5df7ae4dc7c182: Status 404 returned error can't find the container with id 29d850f90e9ecb5f6c7190a264c14b80063774cca15b57303e5df7ae4dc7c182 Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.844454 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 
26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.926478 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99a5fda7-2e18-405e-82d4-2fdb06ad5c26","Type":"ContainerStarted","Data":"375b3aca0e4c4ac88a1e65e02e5982dc26ba7517d2bf51171f17f833b0306ac2"} Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.926515 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99a5fda7-2e18-405e-82d4-2fdb06ad5c26","Type":"ContainerStarted","Data":"3b53fcf968ac6cfc8e842e323e615d110fc8b8866d6d50296c6e506c0062a7d2"} Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.932067 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91e93f69-918c-477c-b6f1-a099de0e8081","Type":"ContainerStarted","Data":"29d850f90e9ecb5f6c7190a264c14b80063774cca15b57303e5df7ae4dc7c182"} Feb 26 17:17:47 crc kubenswrapper[5033]: I0226 17:17:47.954538 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.954514218 podStartE2EDuration="2.954514218s" podCreationTimestamp="2026-02-26 17:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:47.951564906 +0000 UTC m=+5795.164163696" watchObservedRunningTime="2026-02-26 17:17:47.954514218 +0000 UTC m=+5795.167112868" Feb 26 17:17:48 crc kubenswrapper[5033]: I0226 17:17:48.945912 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91e93f69-918c-477c-b6f1-a099de0e8081","Type":"ContainerStarted","Data":"c6eea805f59d3a0375029d5c1708601585eb15dbef44dd30539d72b7435b3dd0"} Feb 26 17:17:48 crc kubenswrapper[5033]: I0226 17:17:48.945985 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"91e93f69-918c-477c-b6f1-a099de0e8081","Type":"ContainerStarted","Data":"0e210b28565fc74cc85b71847bee6ca2ef327c7140062ebfc96129304458ae61"} Feb 26 17:17:48 crc kubenswrapper[5033]: I0226 17:17:48.969515 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.969496624 podStartE2EDuration="1.969496624s" podCreationTimestamp="2026-02-26 17:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:17:48.963805016 +0000 UTC m=+5796.176403676" watchObservedRunningTime="2026-02-26 17:17:48.969496624 +0000 UTC m=+5796.182095284" Feb 26 17:17:50 crc kubenswrapper[5033]: I0226 17:17:50.273542 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 17:17:51 crc kubenswrapper[5033]: I0226 17:17:51.339628 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 17:17:51 crc kubenswrapper[5033]: I0226 17:17:51.339698 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 17:17:55 crc kubenswrapper[5033]: I0226 17:17:55.273499 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 17:17:55 crc kubenswrapper[5033]: I0226 17:17:55.300017 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 17:17:55 crc kubenswrapper[5033]: I0226 17:17:55.628800 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:17:55 crc kubenswrapper[5033]: E0226 17:17:55.629232 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:17:56 crc kubenswrapper[5033]: I0226 17:17:56.036782 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 17:17:56 crc kubenswrapper[5033]: I0226 17:17:56.340544 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 17:17:56 crc kubenswrapper[5033]: I0226 17:17:56.340609 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 17:17:57 crc kubenswrapper[5033]: I0226 17:17:57.390197 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 17:17:57 crc kubenswrapper[5033]: I0226 17:17:57.390342 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 17:17:57 crc kubenswrapper[5033]: I0226 17:17:57.424524 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.107:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 17:17:57 crc kubenswrapper[5033]: I0226 17:17:57.424540 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.107:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 17:17:58 crc kubenswrapper[5033]: I0226 17:17:58.472454 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.108:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 17:17:58 crc kubenswrapper[5033]: I0226 17:17:58.472453 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.108:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 17:18:00 crc kubenswrapper[5033]: I0226 17:18:00.164997 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535438-2j87p"] Feb 26 17:18:00 crc kubenswrapper[5033]: I0226 17:18:00.168075 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535438-2j87p" Feb 26 17:18:00 crc kubenswrapper[5033]: I0226 17:18:00.170447 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:18:00 crc kubenswrapper[5033]: I0226 17:18:00.170770 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:18:00 crc kubenswrapper[5033]: I0226 17:18:00.172290 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:18:00 crc kubenswrapper[5033]: I0226 17:18:00.176070 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535438-2j87p"] Feb 26 17:18:00 crc kubenswrapper[5033]: I0226 17:18:00.197298 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p46n\" (UniqueName: \"kubernetes.io/projected/a64692f6-539c-4f00-8aeb-9987edf929b4-kube-api-access-7p46n\") pod \"auto-csr-approver-29535438-2j87p\" (UID: 
\"a64692f6-539c-4f00-8aeb-9987edf929b4\") " pod="openshift-infra/auto-csr-approver-29535438-2j87p" Feb 26 17:18:00 crc kubenswrapper[5033]: I0226 17:18:00.299444 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p46n\" (UniqueName: \"kubernetes.io/projected/a64692f6-539c-4f00-8aeb-9987edf929b4-kube-api-access-7p46n\") pod \"auto-csr-approver-29535438-2j87p\" (UID: \"a64692f6-539c-4f00-8aeb-9987edf929b4\") " pod="openshift-infra/auto-csr-approver-29535438-2j87p" Feb 26 17:18:00 crc kubenswrapper[5033]: I0226 17:18:00.335726 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p46n\" (UniqueName: \"kubernetes.io/projected/a64692f6-539c-4f00-8aeb-9987edf929b4-kube-api-access-7p46n\") pod \"auto-csr-approver-29535438-2j87p\" (UID: \"a64692f6-539c-4f00-8aeb-9987edf929b4\") " pod="openshift-infra/auto-csr-approver-29535438-2j87p" Feb 26 17:18:00 crc kubenswrapper[5033]: I0226 17:18:00.518307 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535438-2j87p" Feb 26 17:18:00 crc kubenswrapper[5033]: I0226 17:18:00.971111 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535438-2j87p"] Feb 26 17:18:00 crc kubenswrapper[5033]: W0226 17:18:00.974922 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64692f6_539c_4f00_8aeb_9987edf929b4.slice/crio-2e43aa0ca41d831302586ebe323625430666dae6acb89cfd0dd2baf3c5165390 WatchSource:0}: Error finding container 2e43aa0ca41d831302586ebe323625430666dae6acb89cfd0dd2baf3c5165390: Status 404 returned error can't find the container with id 2e43aa0ca41d831302586ebe323625430666dae6acb89cfd0dd2baf3c5165390 Feb 26 17:18:01 crc kubenswrapper[5033]: I0226 17:18:01.048824 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535438-2j87p" event={"ID":"a64692f6-539c-4f00-8aeb-9987edf929b4","Type":"ContainerStarted","Data":"2e43aa0ca41d831302586ebe323625430666dae6acb89cfd0dd2baf3c5165390"} Feb 26 17:18:03 crc kubenswrapper[5033]: I0226 17:18:03.132161 5033 generic.go:334] "Generic (PLEG): container finished" podID="a64692f6-539c-4f00-8aeb-9987edf929b4" containerID="7b3e9a078a6553640a455680af4da5f83837f942261c5dd428bd423b4afa1ece" exitCode=0 Feb 26 17:18:03 crc kubenswrapper[5033]: I0226 17:18:03.132417 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535438-2j87p" event={"ID":"a64692f6-539c-4f00-8aeb-9987edf929b4","Type":"ContainerDied","Data":"7b3e9a078a6553640a455680af4da5f83837f942261c5dd428bd423b4afa1ece"} Feb 26 17:18:04 crc kubenswrapper[5033]: I0226 17:18:04.472844 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535438-2j87p"
Feb 26 17:18:04 crc kubenswrapper[5033]: I0226 17:18:04.571151 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p46n\" (UniqueName: \"kubernetes.io/projected/a64692f6-539c-4f00-8aeb-9987edf929b4-kube-api-access-7p46n\") pod \"a64692f6-539c-4f00-8aeb-9987edf929b4\" (UID: \"a64692f6-539c-4f00-8aeb-9987edf929b4\") "
Feb 26 17:18:04 crc kubenswrapper[5033]: I0226 17:18:04.580312 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64692f6-539c-4f00-8aeb-9987edf929b4-kube-api-access-7p46n" (OuterVolumeSpecName: "kube-api-access-7p46n") pod "a64692f6-539c-4f00-8aeb-9987edf929b4" (UID: "a64692f6-539c-4f00-8aeb-9987edf929b4"). InnerVolumeSpecName "kube-api-access-7p46n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:18:04 crc kubenswrapper[5033]: I0226 17:18:04.673722 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p46n\" (UniqueName: \"kubernetes.io/projected/a64692f6-539c-4f00-8aeb-9987edf929b4-kube-api-access-7p46n\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:05 crc kubenswrapper[5033]: I0226 17:18:05.157073 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535438-2j87p" event={"ID":"a64692f6-539c-4f00-8aeb-9987edf929b4","Type":"ContainerDied","Data":"2e43aa0ca41d831302586ebe323625430666dae6acb89cfd0dd2baf3c5165390"}
Feb 26 17:18:05 crc kubenswrapper[5033]: I0226 17:18:05.157173 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e43aa0ca41d831302586ebe323625430666dae6acb89cfd0dd2baf3c5165390"
Feb 26 17:18:05 crc kubenswrapper[5033]: I0226 17:18:05.157130 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535438-2j87p"
Feb 26 17:18:05 crc kubenswrapper[5033]: I0226 17:18:05.563353 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535432-6jv8g"]
Feb 26 17:18:05 crc kubenswrapper[5033]: I0226 17:18:05.575004 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535432-6jv8g"]
Feb 26 17:18:05 crc kubenswrapper[5033]: I0226 17:18:05.638784 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6741c25-d626-4b56-9b5c-7018756a17c7" path="/var/lib/kubelet/pods/d6741c25-d626-4b56-9b5c-7018756a17c7/volumes"
Feb 26 17:18:06 crc kubenswrapper[5033]: I0226 17:18:06.341622 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 26 17:18:06 crc kubenswrapper[5033]: I0226 17:18:06.341684 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 26 17:18:06 crc kubenswrapper[5033]: I0226 17:18:06.345105 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 26 17:18:06 crc kubenswrapper[5033]: I0226 17:18:06.346518 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 26 17:18:06 crc kubenswrapper[5033]: I0226 17:18:06.628710 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"
Feb 26 17:18:06 crc kubenswrapper[5033]: E0226 17:18:06.629083 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.395337 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.395979 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.396513 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.396550 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.400958 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.406323 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.607162 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d89c65847-fknpw"]
Feb 26 17:18:07 crc kubenswrapper[5033]: E0226 17:18:07.611012 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64692f6-539c-4f00-8aeb-9987edf929b4" containerName="oc"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.611052 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64692f6-539c-4f00-8aeb-9987edf929b4" containerName="oc"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.611285 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64692f6-539c-4f00-8aeb-9987edf929b4" containerName="oc"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.612698 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.641009 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d89c65847-fknpw"]
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.730639 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-dns-svc\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.730914 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-config\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.731042 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h96qq\" (UniqueName: \"kubernetes.io/projected/1ef70c56-0643-4d6c-9209-6927f55a76f7-kube-api-access-h96qq\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.731136 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-nb\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.731194 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-sb\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.834068 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-dns-svc\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.835494 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-config\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.835564 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h96qq\" (UniqueName: \"kubernetes.io/projected/1ef70c56-0643-4d6c-9209-6927f55a76f7-kube-api-access-h96qq\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.835393 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-dns-svc\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.835681 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-nb\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.835718 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-sb\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.836396 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-config\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.836835 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-sb\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.837087 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-nb\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.864145 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h96qq\" (UniqueName: \"kubernetes.io/projected/1ef70c56-0643-4d6c-9209-6927f55a76f7-kube-api-access-h96qq\") pod \"dnsmasq-dns-d89c65847-fknpw\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:07 crc kubenswrapper[5033]: I0226 17:18:07.930417 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:08 crc kubenswrapper[5033]: I0226 17:18:08.415201 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d89c65847-fknpw"]
Feb 26 17:18:08 crc kubenswrapper[5033]: W0226 17:18:08.415880 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ef70c56_0643_4d6c_9209_6927f55a76f7.slice/crio-d291ae2241bd7f0d1253ecf329d62bce52b9430179f301f294d8c569abd71c50 WatchSource:0}: Error finding container d291ae2241bd7f0d1253ecf329d62bce52b9430179f301f294d8c569abd71c50: Status 404 returned error can't find the container with id d291ae2241bd7f0d1253ecf329d62bce52b9430179f301f294d8c569abd71c50
Feb 26 17:18:09 crc kubenswrapper[5033]: I0226 17:18:09.191516 5033 generic.go:334] "Generic (PLEG): container finished" podID="1ef70c56-0643-4d6c-9209-6927f55a76f7" containerID="28c720d27670c69dd2f640dd70e042b0f85b8ca4ca68e7d850a3a7309058013a" exitCode=0
Feb 26 17:18:09 crc kubenswrapper[5033]: I0226 17:18:09.191562 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d89c65847-fknpw" event={"ID":"1ef70c56-0643-4d6c-9209-6927f55a76f7","Type":"ContainerDied","Data":"28c720d27670c69dd2f640dd70e042b0f85b8ca4ca68e7d850a3a7309058013a"}
Feb 26 17:18:09 crc kubenswrapper[5033]: I0226 17:18:09.191816 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d89c65847-fknpw" event={"ID":"1ef70c56-0643-4d6c-9209-6927f55a76f7","Type":"ContainerStarted","Data":"d291ae2241bd7f0d1253ecf329d62bce52b9430179f301f294d8c569abd71c50"}
Feb 26 17:18:10 crc kubenswrapper[5033]: I0226 17:18:10.203833 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d89c65847-fknpw" event={"ID":"1ef70c56-0643-4d6c-9209-6927f55a76f7","Type":"ContainerStarted","Data":"3cd9fea3b6f4fdbd8dbe194e054f6bcae721e5c7b87351a303523d91ca20ce6f"}
Feb 26 17:18:10 crc kubenswrapper[5033]: I0226 17:18:10.204531 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:10 crc kubenswrapper[5033]: I0226 17:18:10.227331 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d89c65847-fknpw" podStartSLOduration=3.227304499 podStartE2EDuration="3.227304499s" podCreationTimestamp="2026-02-26 17:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:18:10.223520786 +0000 UTC m=+5817.436119446" watchObservedRunningTime="2026-02-26 17:18:10.227304499 +0000 UTC m=+5817.439903169"
Feb 26 17:18:17 crc kubenswrapper[5033]: I0226 17:18:17.932533 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d89c65847-fknpw"
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.021186 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c87bcb4cc-724jv"]
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.021468 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" podUID="40bae61a-5728-4f79-88c4-0a66ab71dff4" containerName="dnsmasq-dns" containerID="cri-o://039048c04e1cba311341614048edacb7f1c46b02800a4fee3518e701d866519e" gracePeriod=10
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.486880 5033 generic.go:334] "Generic (PLEG): container finished" podID="40bae61a-5728-4f79-88c4-0a66ab71dff4" containerID="039048c04e1cba311341614048edacb7f1c46b02800a4fee3518e701d866519e" exitCode=0
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.487279 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" event={"ID":"40bae61a-5728-4f79-88c4-0a66ab71dff4","Type":"ContainerDied","Data":"039048c04e1cba311341614048edacb7f1c46b02800a4fee3518e701d866519e"}
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.569065 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv"
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.699127 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-sb\") pod \"40bae61a-5728-4f79-88c4-0a66ab71dff4\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") "
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.699407 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-nb\") pod \"40bae61a-5728-4f79-88c4-0a66ab71dff4\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") "
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.699558 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-dns-svc\") pod \"40bae61a-5728-4f79-88c4-0a66ab71dff4\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") "
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.699732 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9tgc\" (UniqueName: \"kubernetes.io/projected/40bae61a-5728-4f79-88c4-0a66ab71dff4-kube-api-access-s9tgc\") pod \"40bae61a-5728-4f79-88c4-0a66ab71dff4\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") "
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.699996 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-config\") pod \"40bae61a-5728-4f79-88c4-0a66ab71dff4\" (UID: \"40bae61a-5728-4f79-88c4-0a66ab71dff4\") "
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.705446 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40bae61a-5728-4f79-88c4-0a66ab71dff4-kube-api-access-s9tgc" (OuterVolumeSpecName: "kube-api-access-s9tgc") pod "40bae61a-5728-4f79-88c4-0a66ab71dff4" (UID: "40bae61a-5728-4f79-88c4-0a66ab71dff4"). InnerVolumeSpecName "kube-api-access-s9tgc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.741407 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "40bae61a-5728-4f79-88c4-0a66ab71dff4" (UID: "40bae61a-5728-4f79-88c4-0a66ab71dff4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.742201 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-config" (OuterVolumeSpecName: "config") pod "40bae61a-5728-4f79-88c4-0a66ab71dff4" (UID: "40bae61a-5728-4f79-88c4-0a66ab71dff4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.750419 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "40bae61a-5728-4f79-88c4-0a66ab71dff4" (UID: "40bae61a-5728-4f79-88c4-0a66ab71dff4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.751864 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "40bae61a-5728-4f79-88c4-0a66ab71dff4" (UID: "40bae61a-5728-4f79-88c4-0a66ab71dff4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.802488 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.802823 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.802885 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9tgc\" (UniqueName: \"kubernetes.io/projected/40bae61a-5728-4f79-88c4-0a66ab71dff4-kube-api-access-s9tgc\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.802946 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-config\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:18 crc kubenswrapper[5033]: I0226 17:18:18.803016 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40bae61a-5728-4f79-88c4-0a66ab71dff4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:19 crc kubenswrapper[5033]: I0226 17:18:19.499796 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv" event={"ID":"40bae61a-5728-4f79-88c4-0a66ab71dff4","Type":"ContainerDied","Data":"99759a25d9e7b4a778ac66bf621475a0e9305c090e1e2f8cce79da5e17ab1c15"}
Feb 26 17:18:19 crc kubenswrapper[5033]: I0226 17:18:19.499850 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c87bcb4cc-724jv"
Feb 26 17:18:19 crc kubenswrapper[5033]: I0226 17:18:19.500204 5033 scope.go:117] "RemoveContainer" containerID="039048c04e1cba311341614048edacb7f1c46b02800a4fee3518e701d866519e"
Feb 26 17:18:19 crc kubenswrapper[5033]: I0226 17:18:19.543794 5033 scope.go:117] "RemoveContainer" containerID="89d63937d60761beba1c708088cff2a6b3934417617f6e0e403704085e9f1d0b"
Feb 26 17:18:19 crc kubenswrapper[5033]: I0226 17:18:19.551043 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c87bcb4cc-724jv"]
Feb 26 17:18:19 crc kubenswrapper[5033]: I0226 17:18:19.561240 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c87bcb4cc-724jv"]
Feb 26 17:18:19 crc kubenswrapper[5033]: I0226 17:18:19.629464 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"
Feb 26 17:18:19 crc kubenswrapper[5033]: E0226 17:18:19.630344 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:18:19 crc kubenswrapper[5033]: I0226 17:18:19.640582 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40bae61a-5728-4f79-88c4-0a66ab71dff4" path="/var/lib/kubelet/pods/40bae61a-5728-4f79-88c4-0a66ab71dff4/volumes"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.145326 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-f6ww6"]
Feb 26 17:18:20 crc kubenswrapper[5033]: E0226 17:18:20.145826 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bae61a-5728-4f79-88c4-0a66ab71dff4" containerName="init"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.145853 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bae61a-5728-4f79-88c4-0a66ab71dff4" containerName="init"
Feb 26 17:18:20 crc kubenswrapper[5033]: E0226 17:18:20.145875 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bae61a-5728-4f79-88c4-0a66ab71dff4" containerName="dnsmasq-dns"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.145884 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bae61a-5728-4f79-88c4-0a66ab71dff4" containerName="dnsmasq-dns"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.146127 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="40bae61a-5728-4f79-88c4-0a66ab71dff4" containerName="dnsmasq-dns"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.147496 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-f6ww6"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.169188 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-f6ww6"]
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.255578 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2cdc-account-create-update-9zcpl"]
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.256980 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2cdc-account-create-update-9zcpl"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.263218 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.269960 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2cdc-account-create-update-9zcpl"]
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.334764 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnjt7\" (UniqueName: \"kubernetes.io/projected/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-kube-api-access-qnjt7\") pod \"cinder-db-create-f6ww6\" (UID: \"dbd0439f-5d88-4e1f-bf1b-b0624a056a16\") " pod="openstack/cinder-db-create-f6ww6"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.335165 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-operator-scripts\") pod \"cinder-db-create-f6ww6\" (UID: \"dbd0439f-5d88-4e1f-bf1b-b0624a056a16\") " pod="openstack/cinder-db-create-f6ww6"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.436474 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e89e07-a139-4821-b77c-f39911e1fb5e-operator-scripts\") pod \"cinder-2cdc-account-create-update-9zcpl\" (UID: \"84e89e07-a139-4821-b77c-f39911e1fb5e\") " pod="openstack/cinder-2cdc-account-create-update-9zcpl"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.436557 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-operator-scripts\") pod \"cinder-db-create-f6ww6\" (UID: \"dbd0439f-5d88-4e1f-bf1b-b0624a056a16\") " pod="openstack/cinder-db-create-f6ww6"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.436709 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnjt7\" (UniqueName: \"kubernetes.io/projected/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-kube-api-access-qnjt7\") pod \"cinder-db-create-f6ww6\" (UID: \"dbd0439f-5d88-4e1f-bf1b-b0624a056a16\") " pod="openstack/cinder-db-create-f6ww6"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.436793 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k72k\" (UniqueName: \"kubernetes.io/projected/84e89e07-a139-4821-b77c-f39911e1fb5e-kube-api-access-5k72k\") pod \"cinder-2cdc-account-create-update-9zcpl\" (UID: \"84e89e07-a139-4821-b77c-f39911e1fb5e\") " pod="openstack/cinder-2cdc-account-create-update-9zcpl"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.437402 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-operator-scripts\") pod \"cinder-db-create-f6ww6\" (UID: \"dbd0439f-5d88-4e1f-bf1b-b0624a056a16\") " pod="openstack/cinder-db-create-f6ww6"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.453883 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnjt7\" (UniqueName: \"kubernetes.io/projected/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-kube-api-access-qnjt7\") pod \"cinder-db-create-f6ww6\" (UID: \"dbd0439f-5d88-4e1f-bf1b-b0624a056a16\") " pod="openstack/cinder-db-create-f6ww6"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.461848 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-f6ww6"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.538578 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k72k\" (UniqueName: \"kubernetes.io/projected/84e89e07-a139-4821-b77c-f39911e1fb5e-kube-api-access-5k72k\") pod \"cinder-2cdc-account-create-update-9zcpl\" (UID: \"84e89e07-a139-4821-b77c-f39911e1fb5e\") " pod="openstack/cinder-2cdc-account-create-update-9zcpl"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.538683 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e89e07-a139-4821-b77c-f39911e1fb5e-operator-scripts\") pod \"cinder-2cdc-account-create-update-9zcpl\" (UID: \"84e89e07-a139-4821-b77c-f39911e1fb5e\") " pod="openstack/cinder-2cdc-account-create-update-9zcpl"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.539688 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e89e07-a139-4821-b77c-f39911e1fb5e-operator-scripts\") pod \"cinder-2cdc-account-create-update-9zcpl\" (UID: \"84e89e07-a139-4821-b77c-f39911e1fb5e\") " pod="openstack/cinder-2cdc-account-create-update-9zcpl"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.563928 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k72k\" (UniqueName: \"kubernetes.io/projected/84e89e07-a139-4821-b77c-f39911e1fb5e-kube-api-access-5k72k\") pod \"cinder-2cdc-account-create-update-9zcpl\" (UID: \"84e89e07-a139-4821-b77c-f39911e1fb5e\") " pod="openstack/cinder-2cdc-account-create-update-9zcpl"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.589391 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2cdc-account-create-update-9zcpl"
Feb 26 17:18:20 crc kubenswrapper[5033]: I0226 17:18:20.927099 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-f6ww6"]
Feb 26 17:18:21 crc kubenswrapper[5033]: I0226 17:18:21.083611 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2cdc-account-create-update-9zcpl"]
Feb 26 17:18:21 crc kubenswrapper[5033]: W0226 17:18:21.089298 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84e89e07_a139_4821_b77c_f39911e1fb5e.slice/crio-6d648463812b39907b7a0787237af598439b7f6d38969a2996dfb5e134ccfe73 WatchSource:0}: Error finding container 6d648463812b39907b7a0787237af598439b7f6d38969a2996dfb5e134ccfe73: Status 404 returned error can't find the container with id 6d648463812b39907b7a0787237af598439b7f6d38969a2996dfb5e134ccfe73
Feb 26 17:18:21 crc kubenswrapper[5033]: I0226 17:18:21.519739 5033 generic.go:334] "Generic (PLEG): container finished" podID="dbd0439f-5d88-4e1f-bf1b-b0624a056a16" containerID="e357d2dc288499cdde9c3efa8a450c7394f722163c768dcd8a4d02d96005c819" exitCode=0
Feb 26 17:18:21 crc kubenswrapper[5033]: I0226 17:18:21.519838 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f6ww6" event={"ID":"dbd0439f-5d88-4e1f-bf1b-b0624a056a16","Type":"ContainerDied","Data":"e357d2dc288499cdde9c3efa8a450c7394f722163c768dcd8a4d02d96005c819"}
Feb 26 17:18:21 crc kubenswrapper[5033]: I0226 17:18:21.519929 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f6ww6" event={"ID":"dbd0439f-5d88-4e1f-bf1b-b0624a056a16","Type":"ContainerStarted","Data":"2506fd003fd522fe66b66953ab05ac20e64878b627d530ed41e2cd34653e0c4e"}
Feb 26 17:18:21 crc kubenswrapper[5033]: I0226 17:18:21.521189 5033 generic.go:334] "Generic (PLEG): container finished" podID="84e89e07-a139-4821-b77c-f39911e1fb5e" containerID="996b86f502df4fdfdf5b4f834aa2ad928df2d758d6ec7e94c15899d822732ce2" exitCode=0
Feb 26 17:18:21 crc kubenswrapper[5033]: I0226 17:18:21.521227 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2cdc-account-create-update-9zcpl" event={"ID":"84e89e07-a139-4821-b77c-f39911e1fb5e","Type":"ContainerDied","Data":"996b86f502df4fdfdf5b4f834aa2ad928df2d758d6ec7e94c15899d822732ce2"}
Feb 26 17:18:21 crc kubenswrapper[5033]: I0226 17:18:21.521247 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2cdc-account-create-update-9zcpl" event={"ID":"84e89e07-a139-4821-b77c-f39911e1fb5e","Type":"ContainerStarted","Data":"6d648463812b39907b7a0787237af598439b7f6d38969a2996dfb5e134ccfe73"}
Feb 26 17:18:22 crc kubenswrapper[5033]: I0226 17:18:22.940206 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-f6ww6"
Feb 26 17:18:22 crc kubenswrapper[5033]: I0226 17:18:22.945964 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2cdc-account-create-update-9zcpl"
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.090096 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k72k\" (UniqueName: \"kubernetes.io/projected/84e89e07-a139-4821-b77c-f39911e1fb5e-kube-api-access-5k72k\") pod \"84e89e07-a139-4821-b77c-f39911e1fb5e\" (UID: \"84e89e07-a139-4821-b77c-f39911e1fb5e\") "
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.090158 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e89e07-a139-4821-b77c-f39911e1fb5e-operator-scripts\") pod \"84e89e07-a139-4821-b77c-f39911e1fb5e\" (UID: \"84e89e07-a139-4821-b77c-f39911e1fb5e\") "
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.090362 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-operator-scripts\") pod \"dbd0439f-5d88-4e1f-bf1b-b0624a056a16\" (UID: \"dbd0439f-5d88-4e1f-bf1b-b0624a056a16\") "
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.090437 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnjt7\" (UniqueName: \"kubernetes.io/projected/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-kube-api-access-qnjt7\") pod \"dbd0439f-5d88-4e1f-bf1b-b0624a056a16\" (UID: \"dbd0439f-5d88-4e1f-bf1b-b0624a056a16\") "
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.090908 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbd0439f-5d88-4e1f-bf1b-b0624a056a16" (UID: "dbd0439f-5d88-4e1f-bf1b-b0624a056a16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.091287 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e89e07-a139-4821-b77c-f39911e1fb5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84e89e07-a139-4821-b77c-f39911e1fb5e" (UID: "84e89e07-a139-4821-b77c-f39911e1fb5e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.095389 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-kube-api-access-qnjt7" (OuterVolumeSpecName: "kube-api-access-qnjt7") pod "dbd0439f-5d88-4e1f-bf1b-b0624a056a16" (UID: "dbd0439f-5d88-4e1f-bf1b-b0624a056a16"). InnerVolumeSpecName "kube-api-access-qnjt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.095565 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e89e07-a139-4821-b77c-f39911e1fb5e-kube-api-access-5k72k" (OuterVolumeSpecName: "kube-api-access-5k72k") pod "84e89e07-a139-4821-b77c-f39911e1fb5e" (UID: "84e89e07-a139-4821-b77c-f39911e1fb5e"). InnerVolumeSpecName "kube-api-access-5k72k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.192527 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnjt7\" (UniqueName: \"kubernetes.io/projected/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-kube-api-access-qnjt7\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.192564 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k72k\" (UniqueName: \"kubernetes.io/projected/84e89e07-a139-4821-b77c-f39911e1fb5e-kube-api-access-5k72k\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.192573 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e89e07-a139-4821-b77c-f39911e1fb5e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.192581 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbd0439f-5d88-4e1f-bf1b-b0624a056a16-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.550622 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2cdc-account-create-update-9zcpl" event={"ID":"84e89e07-a139-4821-b77c-f39911e1fb5e","Type":"ContainerDied","Data":"6d648463812b39907b7a0787237af598439b7f6d38969a2996dfb5e134ccfe73"}
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.550817 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d648463812b39907b7a0787237af598439b7f6d38969a2996dfb5e134ccfe73"
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.550632 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2cdc-account-create-update-9zcpl"
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.552344 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f6ww6" event={"ID":"dbd0439f-5d88-4e1f-bf1b-b0624a056a16","Type":"ContainerDied","Data":"2506fd003fd522fe66b66953ab05ac20e64878b627d530ed41e2cd34653e0c4e"}
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.552418 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2506fd003fd522fe66b66953ab05ac20e64878b627d530ed41e2cd34653e0c4e"
Feb 26 17:18:23 crc kubenswrapper[5033]: I0226 17:18:23.552533 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-f6ww6"
Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.404090 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-8kn6s"]
Feb 26 17:18:25 crc kubenswrapper[5033]: E0226 17:18:25.404717 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e89e07-a139-4821-b77c-f39911e1fb5e" containerName="mariadb-account-create-update"
Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.404729 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e89e07-a139-4821-b77c-f39911e1fb5e" containerName="mariadb-account-create-update"
Feb 26 17:18:25 crc kubenswrapper[5033]: E0226 17:18:25.404743 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd0439f-5d88-4e1f-bf1b-b0624a056a16" containerName="mariadb-database-create"
Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.404749 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd0439f-5d88-4e1f-bf1b-b0624a056a16" containerName="mariadb-database-create"
Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.404906 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd0439f-5d88-4e1f-bf1b-b0624a056a16" containerName="mariadb-database-create"
Feb 26
17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.404927 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e89e07-a139-4821-b77c-f39911e1fb5e" containerName="mariadb-account-create-update" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.405506 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.408708 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.408755 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7b9xf" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.408939 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.428754 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8kn6s"] Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.536645 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-combined-ca-bundle\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.536722 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxp8w\" (UniqueName: \"kubernetes.io/projected/1238df11-1db7-4267-ac4b-5022d1c95276-kube-api-access-hxp8w\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.536781 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-db-sync-config-data\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.537059 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-config-data\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.537175 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-scripts\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.537247 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1238df11-1db7-4267-ac4b-5022d1c95276-etc-machine-id\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.639661 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxp8w\" (UniqueName: \"kubernetes.io/projected/1238df11-1db7-4267-ac4b-5022d1c95276-kube-api-access-hxp8w\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.639778 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-db-sync-config-data\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.639882 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-config-data\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.639944 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-scripts\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.639989 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1238df11-1db7-4267-ac4b-5022d1c95276-etc-machine-id\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.640162 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-combined-ca-bundle\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.640672 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1238df11-1db7-4267-ac4b-5022d1c95276-etc-machine-id\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " 
pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.645360 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-scripts\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.645361 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-db-sync-config-data\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.646131 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-config-data\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.649976 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-combined-ca-bundle\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.668315 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxp8w\" (UniqueName: \"kubernetes.io/projected/1238df11-1db7-4267-ac4b-5022d1c95276-kube-api-access-hxp8w\") pod \"cinder-db-sync-8kn6s\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:25 crc kubenswrapper[5033]: I0226 17:18:25.730489 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:26 crc kubenswrapper[5033]: I0226 17:18:26.165755 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8kn6s"] Feb 26 17:18:26 crc kubenswrapper[5033]: I0226 17:18:26.582694 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8kn6s" event={"ID":"1238df11-1db7-4267-ac4b-5022d1c95276","Type":"ContainerStarted","Data":"c8f3814b9d3760910c2c0929db67e0bcc574a6259f3409f999a0403fd5e5a8b3"} Feb 26 17:18:27 crc kubenswrapper[5033]: I0226 17:18:27.595886 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8kn6s" event={"ID":"1238df11-1db7-4267-ac4b-5022d1c95276","Type":"ContainerStarted","Data":"8dbd9a067f7457b455c91a64f6eea5771fc925d69648d01f9fb24dc077f3cced"} Feb 26 17:18:27 crc kubenswrapper[5033]: I0226 17:18:27.629616 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-8kn6s" podStartSLOduration=2.629596582 podStartE2EDuration="2.629596582s" podCreationTimestamp="2026-02-26 17:18:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:18:27.624830715 +0000 UTC m=+5834.837429365" watchObservedRunningTime="2026-02-26 17:18:27.629596582 +0000 UTC m=+5834.842195242" Feb 26 17:18:30 crc kubenswrapper[5033]: I0226 17:18:30.625520 5033 generic.go:334] "Generic (PLEG): container finished" podID="1238df11-1db7-4267-ac4b-5022d1c95276" containerID="8dbd9a067f7457b455c91a64f6eea5771fc925d69648d01f9fb24dc077f3cced" exitCode=0 Feb 26 17:18:30 crc kubenswrapper[5033]: I0226 17:18:30.625637 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8kn6s" event={"ID":"1238df11-1db7-4267-ac4b-5022d1c95276","Type":"ContainerDied","Data":"8dbd9a067f7457b455c91a64f6eea5771fc925d69648d01f9fb24dc077f3cced"} Feb 26 17:18:32 crc kubenswrapper[5033]: 
I0226 17:18:32.028777 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.164701 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-config-data\") pod \"1238df11-1db7-4267-ac4b-5022d1c95276\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.164854 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-combined-ca-bundle\") pod \"1238df11-1db7-4267-ac4b-5022d1c95276\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.164968 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxp8w\" (UniqueName: \"kubernetes.io/projected/1238df11-1db7-4267-ac4b-5022d1c95276-kube-api-access-hxp8w\") pod \"1238df11-1db7-4267-ac4b-5022d1c95276\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.165038 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1238df11-1db7-4267-ac4b-5022d1c95276-etc-machine-id\") pod \"1238df11-1db7-4267-ac4b-5022d1c95276\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.165108 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-db-sync-config-data\") pod \"1238df11-1db7-4267-ac4b-5022d1c95276\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.165180 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1238df11-1db7-4267-ac4b-5022d1c95276-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1238df11-1db7-4267-ac4b-5022d1c95276" (UID: "1238df11-1db7-4267-ac4b-5022d1c95276"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.165457 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-scripts\") pod \"1238df11-1db7-4267-ac4b-5022d1c95276\" (UID: \"1238df11-1db7-4267-ac4b-5022d1c95276\") " Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.167407 5033 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1238df11-1db7-4267-ac4b-5022d1c95276-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.172123 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1238df11-1db7-4267-ac4b-5022d1c95276" (UID: "1238df11-1db7-4267-ac4b-5022d1c95276"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.172180 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-scripts" (OuterVolumeSpecName: "scripts") pod "1238df11-1db7-4267-ac4b-5022d1c95276" (UID: "1238df11-1db7-4267-ac4b-5022d1c95276"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.172965 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1238df11-1db7-4267-ac4b-5022d1c95276-kube-api-access-hxp8w" (OuterVolumeSpecName: "kube-api-access-hxp8w") pod "1238df11-1db7-4267-ac4b-5022d1c95276" (UID: "1238df11-1db7-4267-ac4b-5022d1c95276"). InnerVolumeSpecName "kube-api-access-hxp8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.208706 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1238df11-1db7-4267-ac4b-5022d1c95276" (UID: "1238df11-1db7-4267-ac4b-5022d1c95276"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.213125 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-config-data" (OuterVolumeSpecName: "config-data") pod "1238df11-1db7-4267-ac4b-5022d1c95276" (UID: "1238df11-1db7-4267-ac4b-5022d1c95276"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.268493 5033 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.268537 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.268547 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.268556 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1238df11-1db7-4267-ac4b-5022d1c95276-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.268566 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxp8w\" (UniqueName: \"kubernetes.io/projected/1238df11-1db7-4267-ac4b-5022d1c95276-kube-api-access-hxp8w\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.651803 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8kn6s" event={"ID":"1238df11-1db7-4267-ac4b-5022d1c95276","Type":"ContainerDied","Data":"c8f3814b9d3760910c2c0929db67e0bcc574a6259f3409f999a0403fd5e5a8b3"} Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.652189 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8f3814b9d3760910c2c0929db67e0bcc574a6259f3409f999a0403fd5e5a8b3" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.651862 5033 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8kn6s" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.990721 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9d89f5b57-5xbrd"] Feb 26 17:18:32 crc kubenswrapper[5033]: E0226 17:18:32.991206 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1238df11-1db7-4267-ac4b-5022d1c95276" containerName="cinder-db-sync" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.991224 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1238df11-1db7-4267-ac4b-5022d1c95276" containerName="cinder-db-sync" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.991520 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1238df11-1db7-4267-ac4b-5022d1c95276" containerName="cinder-db-sync" Feb 26 17:18:32 crc kubenswrapper[5033]: I0226 17:18:32.993135 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.027052 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d89f5b57-5xbrd"] Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.086485 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-config\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.086551 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-dns-svc\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 
crc kubenswrapper[5033]: I0226 17:18:33.086650 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-sb\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.086712 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-nb\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.090817 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9l24\" (UniqueName: \"kubernetes.io/projected/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-kube-api-access-d9l24\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.170389 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.172157 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.173879 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.174412 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.175602 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7b9xf" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.175796 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.184723 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.192322 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-dns-svc\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.192430 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-sb\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.192473 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-nb\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " 
pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.192512 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9l24\" (UniqueName: \"kubernetes.io/projected/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-kube-api-access-d9l24\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.192588 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-config\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.193379 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-sb\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.194013 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-dns-svc\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.195804 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-nb\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 
17:18:33.199522 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-config\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.214326 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9l24\" (UniqueName: \"kubernetes.io/projected/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-kube-api-access-d9l24\") pod \"dnsmasq-dns-9d89f5b57-5xbrd\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.294351 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7e95af8-71df-49bd-8639-de3007ccc459-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.294405 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e95af8-71df-49bd-8639-de3007ccc459-logs\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.294435 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tnrz\" (UniqueName: \"kubernetes.io/projected/c7e95af8-71df-49bd-8639-de3007ccc459-kube-api-access-4tnrz\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.294494 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data-custom\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.294511 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.294530 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.294551 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-scripts\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.322933 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.395993 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7e95af8-71df-49bd-8639-de3007ccc459-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.396060 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e95af8-71df-49bd-8639-de3007ccc459-logs\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.396099 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tnrz\" (UniqueName: \"kubernetes.io/projected/c7e95af8-71df-49bd-8639-de3007ccc459-kube-api-access-4tnrz\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.396151 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7e95af8-71df-49bd-8639-de3007ccc459-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.396185 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data-custom\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.396251 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.396350 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.396413 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-scripts\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.396946 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e95af8-71df-49bd-8639-de3007ccc459-logs\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.400538 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-scripts\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.401302 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.401997 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data-custom\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.408514 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.429502 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tnrz\" (UniqueName: \"kubernetes.io/projected/c7e95af8-71df-49bd-8639-de3007ccc459-kube-api-access-4tnrz\") pod \"cinder-api-0\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.491417 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.646047 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:18:33 crc kubenswrapper[5033]: E0226 17:18:33.646231 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.795434 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d89f5b57-5xbrd"] Feb 26 17:18:33 crc kubenswrapper[5033]: I0226 17:18:33.968332 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 17:18:34 crc kubenswrapper[5033]: I0226 17:18:34.333821 5033 scope.go:117] "RemoveContainer" containerID="820baf1ca6542c49ede7e348e7648f6b27e8a1cd6cef94820ed817fb47c87abc" Feb 26 17:18:34 crc kubenswrapper[5033]: I0226 17:18:34.708715 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c7e95af8-71df-49bd-8639-de3007ccc459","Type":"ContainerStarted","Data":"16f0d35c56b1ce43dc20f81ce2503582eedae6096a6825bbdd0537b921e3f87b"} Feb 26 17:18:34 crc kubenswrapper[5033]: I0226 17:18:34.710505 5033 generic.go:334] "Generic (PLEG): container finished" podID="a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" containerID="a775950506db4d562be7ca64cb610991e2cf7b6c88149398e8f1b575f8d7d96d" exitCode=0 Feb 26 17:18:34 crc kubenswrapper[5033]: I0226 17:18:34.710559 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" 
event={"ID":"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a","Type":"ContainerDied","Data":"a775950506db4d562be7ca64cb610991e2cf7b6c88149398e8f1b575f8d7d96d"} Feb 26 17:18:34 crc kubenswrapper[5033]: I0226 17:18:34.710583 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" event={"ID":"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a","Type":"ContainerStarted","Data":"cb0de5de7a95a36db625e271f9a9f8028f816ebe541e35d3c9a53d635e007ae4"} Feb 26 17:18:35 crc kubenswrapper[5033]: I0226 17:18:35.723806 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c7e95af8-71df-49bd-8639-de3007ccc459","Type":"ContainerStarted","Data":"1f2d786f168748fdbe0f8ee2b4862967a699848d1e2e43790573b71ab7a3cefd"} Feb 26 17:18:35 crc kubenswrapper[5033]: I0226 17:18:35.725519 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c7e95af8-71df-49bd-8639-de3007ccc459","Type":"ContainerStarted","Data":"dd0ff1c953a23b10a8da8fa12693876185e8f46928148f2114271d977c075db0"} Feb 26 17:18:35 crc kubenswrapper[5033]: I0226 17:18:35.725980 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 26 17:18:35 crc kubenswrapper[5033]: I0226 17:18:35.727182 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" event={"ID":"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a","Type":"ContainerStarted","Data":"076e0b99b0ccd5a7ca168e4d519c3b2ca48c5c75482054f1d6f8dfb9f47a6ba4"} Feb 26 17:18:35 crc kubenswrapper[5033]: I0226 17:18:35.727453 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:35 crc kubenswrapper[5033]: I0226 17:18:35.756193 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.756172299 podStartE2EDuration="2.756172299s" podCreationTimestamp="2026-02-26 17:18:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:18:35.746684597 +0000 UTC m=+5842.959283287" watchObservedRunningTime="2026-02-26 17:18:35.756172299 +0000 UTC m=+5842.968770959" Feb 26 17:18:35 crc kubenswrapper[5033]: I0226 17:18:35.781559 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" podStartSLOduration=3.781533078 podStartE2EDuration="3.781533078s" podCreationTimestamp="2026-02-26 17:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:18:35.775420408 +0000 UTC m=+5842.988019128" watchObservedRunningTime="2026-02-26 17:18:35.781533078 +0000 UTC m=+5842.994131768" Feb 26 17:18:43 crc kubenswrapper[5033]: I0226 17:18:43.325441 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:18:43 crc kubenswrapper[5033]: I0226 17:18:43.417271 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d89c65847-fknpw"] Feb 26 17:18:43 crc kubenswrapper[5033]: I0226 17:18:43.417649 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d89c65847-fknpw" podUID="1ef70c56-0643-4d6c-9209-6927f55a76f7" containerName="dnsmasq-dns" containerID="cri-o://3cd9fea3b6f4fdbd8dbe194e054f6bcae721e5c7b87351a303523d91ca20ce6f" gracePeriod=10 Feb 26 17:18:43 crc kubenswrapper[5033]: I0226 17:18:43.827715 5033 generic.go:334] "Generic (PLEG): container finished" podID="1ef70c56-0643-4d6c-9209-6927f55a76f7" containerID="3cd9fea3b6f4fdbd8dbe194e054f6bcae721e5c7b87351a303523d91ca20ce6f" exitCode=0 Feb 26 17:18:43 crc kubenswrapper[5033]: I0226 17:18:43.827751 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d89c65847-fknpw" 
event={"ID":"1ef70c56-0643-4d6c-9209-6927f55a76f7","Type":"ContainerDied","Data":"3cd9fea3b6f4fdbd8dbe194e054f6bcae721e5c7b87351a303523d91ca20ce6f"} Feb 26 17:18:43 crc kubenswrapper[5033]: I0226 17:18:43.951820 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d89c65847-fknpw" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.031211 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-sb\") pod \"1ef70c56-0643-4d6c-9209-6927f55a76f7\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.031319 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-dns-svc\") pod \"1ef70c56-0643-4d6c-9209-6927f55a76f7\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.031402 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h96qq\" (UniqueName: \"kubernetes.io/projected/1ef70c56-0643-4d6c-9209-6927f55a76f7-kube-api-access-h96qq\") pod \"1ef70c56-0643-4d6c-9209-6927f55a76f7\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.031420 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-nb\") pod \"1ef70c56-0643-4d6c-9209-6927f55a76f7\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.031520 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-config\") 
pod \"1ef70c56-0643-4d6c-9209-6927f55a76f7\" (UID: \"1ef70c56-0643-4d6c-9209-6927f55a76f7\") " Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.045869 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef70c56-0643-4d6c-9209-6927f55a76f7-kube-api-access-h96qq" (OuterVolumeSpecName: "kube-api-access-h96qq") pod "1ef70c56-0643-4d6c-9209-6927f55a76f7" (UID: "1ef70c56-0643-4d6c-9209-6927f55a76f7"). InnerVolumeSpecName "kube-api-access-h96qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.079781 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1ef70c56-0643-4d6c-9209-6927f55a76f7" (UID: "1ef70c56-0643-4d6c-9209-6927f55a76f7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.085879 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1ef70c56-0643-4d6c-9209-6927f55a76f7" (UID: "1ef70c56-0643-4d6c-9209-6927f55a76f7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.096048 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1ef70c56-0643-4d6c-9209-6927f55a76f7" (UID: "1ef70c56-0643-4d6c-9209-6927f55a76f7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.098230 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-config" (OuterVolumeSpecName: "config") pod "1ef70c56-0643-4d6c-9209-6927f55a76f7" (UID: "1ef70c56-0643-4d6c-9209-6927f55a76f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.133827 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.133862 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.133873 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h96qq\" (UniqueName: \"kubernetes.io/projected/1ef70c56-0643-4d6c-9209-6927f55a76f7-kube-api-access-h96qq\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.133883 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.133893 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef70c56-0643-4d6c-9209-6927f55a76f7-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.839435 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d89c65847-fknpw" 
event={"ID":"1ef70c56-0643-4d6c-9209-6927f55a76f7","Type":"ContainerDied","Data":"d291ae2241bd7f0d1253ecf329d62bce52b9430179f301f294d8c569abd71c50"} Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.839764 5033 scope.go:117] "RemoveContainer" containerID="3cd9fea3b6f4fdbd8dbe194e054f6bcae721e5c7b87351a303523d91ca20ce6f" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.839479 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d89c65847-fknpw" Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.886682 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d89c65847-fknpw"] Feb 26 17:18:44 crc kubenswrapper[5033]: I0226 17:18:44.894997 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d89c65847-fknpw"] Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.146173 5033 scope.go:117] "RemoveContainer" containerID="28c720d27670c69dd2f640dd70e042b0f85b8ca4ca68e7d850a3a7309058013a" Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.628121 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:18:45 crc kubenswrapper[5033]: E0226 17:18:45.628567 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.638119 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef70c56-0643-4d6c-9209-6927f55a76f7" path="/var/lib/kubelet/pods/1ef70c56-0643-4d6c-9209-6927f55a76f7/volumes" Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 
17:18:45.646557 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.646772 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="59148331-e81c-4ca3-aeb1-702fe06efa3a" containerName="nova-cell0-conductor-conductor" containerID="cri-o://53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3" gracePeriod=30 Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.662777 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.663072 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-log" containerID="cri-o://0e210b28565fc74cc85b71847bee6ca2ef327c7140062ebfc96129304458ae61" gracePeriod=30 Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.663139 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-api" containerID="cri-o://c6eea805f59d3a0375029d5c1708601585eb15dbef44dd30539d72b7435b3dd0" gracePeriod=30 Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.677601 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.677830 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ec4cbe35-7184-45d4-b140-df2a2bac4020" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7" gracePeriod=30 Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.685891 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] 
Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.686471 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-log" containerID="cri-o://3b53fcf968ac6cfc8e842e323e615d110fc8b8866d6d50296c6e506c0062a7d2" gracePeriod=30 Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.686518 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-metadata" containerID="cri-o://375b3aca0e4c4ac88a1e65e02e5982dc26ba7517d2bf51171f17f833b0306ac2" gracePeriod=30 Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.703171 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.703420 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="96031ac0-f3a0-4e69-9cc3-12f595f19526" containerName="nova-scheduler-scheduler" containerID="cri-o://eeb44306b2c71cac2267b32b7323164273db5287568eb909a17fb339f26ec8cc" gracePeriod=30 Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.733349 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.733542 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f5a830b4-a5a6-4a74-b3d9-4672daadf5d7" containerName="nova-cell1-conductor-conductor" containerID="cri-o://9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3" gracePeriod=30 Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.851994 5033 generic.go:334] "Generic (PLEG): container finished" podID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerID="3b53fcf968ac6cfc8e842e323e615d110fc8b8866d6d50296c6e506c0062a7d2" 
exitCode=143 Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.852056 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99a5fda7-2e18-405e-82d4-2fdb06ad5c26","Type":"ContainerDied","Data":"3b53fcf968ac6cfc8e842e323e615d110fc8b8866d6d50296c6e506c0062a7d2"} Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.854981 5033 generic.go:334] "Generic (PLEG): container finished" podID="91e93f69-918c-477c-b6f1-a099de0e8081" containerID="0e210b28565fc74cc85b71847bee6ca2ef327c7140062ebfc96129304458ae61" exitCode=143 Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.855054 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91e93f69-918c-477c-b6f1-a099de0e8081","Type":"ContainerDied","Data":"0e210b28565fc74cc85b71847bee6ca2ef327c7140062ebfc96129304458ae61"} Feb 26 17:18:45 crc kubenswrapper[5033]: I0226 17:18:45.869472 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.102608 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="ec4cbe35-7184-45d4-b140-df2a2bac4020" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.217.1.98:6080/vnc_lite.html\": dial tcp 10.217.1.98:6080: connect: connection refused" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.615927 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.678738 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-combined-ca-bundle\") pod \"ec4cbe35-7184-45d4-b140-df2a2bac4020\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.678975 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfs44\" (UniqueName: \"kubernetes.io/projected/ec4cbe35-7184-45d4-b140-df2a2bac4020-kube-api-access-wfs44\") pod \"ec4cbe35-7184-45d4-b140-df2a2bac4020\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.679077 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-config-data\") pod \"ec4cbe35-7184-45d4-b140-df2a2bac4020\" (UID: \"ec4cbe35-7184-45d4-b140-df2a2bac4020\") " Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.689652 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec4cbe35-7184-45d4-b140-df2a2bac4020-kube-api-access-wfs44" (OuterVolumeSpecName: "kube-api-access-wfs44") pod "ec4cbe35-7184-45d4-b140-df2a2bac4020" (UID: "ec4cbe35-7184-45d4-b140-df2a2bac4020"). InnerVolumeSpecName "kube-api-access-wfs44". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.718373 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec4cbe35-7184-45d4-b140-df2a2bac4020" (UID: "ec4cbe35-7184-45d4-b140-df2a2bac4020"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.720109 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-config-data" (OuterVolumeSpecName: "config-data") pod "ec4cbe35-7184-45d4-b140-df2a2bac4020" (UID: "ec4cbe35-7184-45d4-b140-df2a2bac4020"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.781184 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.781224 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfs44\" (UniqueName: \"kubernetes.io/projected/ec4cbe35-7184-45d4-b140-df2a2bac4020-kube-api-access-wfs44\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.781236 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec4cbe35-7184-45d4-b140-df2a2bac4020-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.876771 5033 generic.go:334] "Generic (PLEG): container finished" podID="ec4cbe35-7184-45d4-b140-df2a2bac4020" containerID="90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7" exitCode=0 Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.876819 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ec4cbe35-7184-45d4-b140-df2a2bac4020","Type":"ContainerDied","Data":"90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7"} Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.876838 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.876850 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ec4cbe35-7184-45d4-b140-df2a2bac4020","Type":"ContainerDied","Data":"cc8a5e0e1e2d59c6d5449520bc2d0844ecdc78ea2061fca4472583634543fe28"} Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.876873 5033 scope.go:117] "RemoveContainer" containerID="90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.937604 5033 scope.go:117] "RemoveContainer" containerID="90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7" Feb 26 17:18:46 crc kubenswrapper[5033]: E0226 17:18:46.941731 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7\": container with ID starting with 90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7 not found: ID does not exist" containerID="90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.941790 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7"} err="failed to get container status \"90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7\": rpc error: code = NotFound desc = could not find container \"90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7\": container with ID starting with 90fb6b1ec7c8ea82d23ceac89f442a579af0fd88be0054a4eafa33d9bba18ac7 not found: ID does not exist" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.955279 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 
17:18:46.967646 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.983059 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 17:18:46 crc kubenswrapper[5033]: E0226 17:18:46.983587 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef70c56-0643-4d6c-9209-6927f55a76f7" containerName="dnsmasq-dns" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.983609 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef70c56-0643-4d6c-9209-6927f55a76f7" containerName="dnsmasq-dns" Feb 26 17:18:46 crc kubenswrapper[5033]: E0226 17:18:46.983626 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec4cbe35-7184-45d4-b140-df2a2bac4020" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.983637 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec4cbe35-7184-45d4-b140-df2a2bac4020" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 17:18:46 crc kubenswrapper[5033]: E0226 17:18:46.983655 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef70c56-0643-4d6c-9209-6927f55a76f7" containerName="init" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.983664 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef70c56-0643-4d6c-9209-6927f55a76f7" containerName="init" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.983883 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec4cbe35-7184-45d4-b140-df2a2bac4020" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.983914 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef70c56-0643-4d6c-9209-6927f55a76f7" containerName="dnsmasq-dns" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.984668 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.987616 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 26 17:18:46 crc kubenswrapper[5033]: I0226 17:18:46.994057 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.088016 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56249f2d-f5a1-42a1-b994-1569b5c7b5cb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"56249f2d-f5a1-42a1-b994-1569b5c7b5cb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.088094 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56249f2d-f5a1-42a1-b994-1569b5c7b5cb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"56249f2d-f5a1-42a1-b994-1569b5c7b5cb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.088275 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxnnj\" (UniqueName: \"kubernetes.io/projected/56249f2d-f5a1-42a1-b994-1569b5c7b5cb-kube-api-access-rxnnj\") pod \"nova-cell1-novncproxy-0\" (UID: \"56249f2d-f5a1-42a1-b994-1569b5c7b5cb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.190032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56249f2d-f5a1-42a1-b994-1569b5c7b5cb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"56249f2d-f5a1-42a1-b994-1569b5c7b5cb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 
17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.190155 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxnnj\" (UniqueName: \"kubernetes.io/projected/56249f2d-f5a1-42a1-b994-1569b5c7b5cb-kube-api-access-rxnnj\") pod \"nova-cell1-novncproxy-0\" (UID: \"56249f2d-f5a1-42a1-b994-1569b5c7b5cb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.190214 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56249f2d-f5a1-42a1-b994-1569b5c7b5cb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"56249f2d-f5a1-42a1-b994-1569b5c7b5cb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.199535 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56249f2d-f5a1-42a1-b994-1569b5c7b5cb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"56249f2d-f5a1-42a1-b994-1569b5c7b5cb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.199618 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56249f2d-f5a1-42a1-b994-1569b5c7b5cb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"56249f2d-f5a1-42a1-b994-1569b5c7b5cb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.217559 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxnnj\" (UniqueName: \"kubernetes.io/projected/56249f2d-f5a1-42a1-b994-1569b5c7b5cb-kube-api-access-rxnnj\") pod \"nova-cell1-novncproxy-0\" (UID: \"56249f2d-f5a1-42a1-b994-1569b5c7b5cb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.298647 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.654275 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec4cbe35-7184-45d4-b140-df2a2bac4020" path="/var/lib/kubelet/pods/ec4cbe35-7184-45d4-b140-df2a2bac4020/volumes" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.830473 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.840753 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.903101 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-config-data\") pod \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.903190 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cshb8\" (UniqueName: \"kubernetes.io/projected/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-kube-api-access-cshb8\") pod \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.903450 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-combined-ca-bundle\") pod \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\" (UID: \"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7\") " Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.907746 5033 generic.go:334] "Generic (PLEG): container finished" podID="f5a830b4-a5a6-4a74-b3d9-4672daadf5d7" containerID="9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3" exitCode=0 Feb 26 
17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.907809 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.907884 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7","Type":"ContainerDied","Data":"9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3"} Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.907943 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f5a830b4-a5a6-4a74-b3d9-4672daadf5d7","Type":"ContainerDied","Data":"89bb0fc3dad0d006a3668222d0d76f2f1cf4074040bd312975731d521cf9ba73"} Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.907969 5033 scope.go:117] "RemoveContainer" containerID="9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.909329 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-kube-api-access-cshb8" (OuterVolumeSpecName: "kube-api-access-cshb8") pod "f5a830b4-a5a6-4a74-b3d9-4672daadf5d7" (UID: "f5a830b4-a5a6-4a74-b3d9-4672daadf5d7"). InnerVolumeSpecName "kube-api-access-cshb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.913572 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"56249f2d-f5a1-42a1-b994-1569b5c7b5cb","Type":"ContainerStarted","Data":"44fe9b0a40f1391bf8f4dd9e6bfc3938183fb5d1d66b8311a91a089452d527c6"} Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.942670 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-config-data" (OuterVolumeSpecName: "config-data") pod "f5a830b4-a5a6-4a74-b3d9-4672daadf5d7" (UID: "f5a830b4-a5a6-4a74-b3d9-4672daadf5d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:18:47 crc kubenswrapper[5033]: I0226 17:18:47.955922 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5a830b4-a5a6-4a74-b3d9-4672daadf5d7" (UID: "f5a830b4-a5a6-4a74-b3d9-4672daadf5d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.006115 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.006430 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.006452 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cshb8\" (UniqueName: \"kubernetes.io/projected/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7-kube-api-access-cshb8\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.009858 5033 scope.go:117] "RemoveContainer" containerID="9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3" Feb 26 17:18:48 crc kubenswrapper[5033]: E0226 17:18:48.010324 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3\": container with ID starting with 9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3 not found: ID does not exist" containerID="9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.010377 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3"} err="failed to get container status \"9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3\": rpc error: code = NotFound desc = could not find container \"9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3\": container with ID 
starting with 9a907f663326c1f0e4bc471b1798cb5e209e5533c4a8983141aeedb28db771d3 not found: ID does not exist" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.262021 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.273928 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.288190 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 17:18:48 crc kubenswrapper[5033]: E0226 17:18:48.288650 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a830b4-a5a6-4a74-b3d9-4672daadf5d7" containerName="nova-cell1-conductor-conductor" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.288671 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a830b4-a5a6-4a74-b3d9-4672daadf5d7" containerName="nova-cell1-conductor-conductor" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.288927 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a830b4-a5a6-4a74-b3d9-4672daadf5d7" containerName="nova-cell1-conductor-conductor" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.289703 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.296515 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.298387 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.413856 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.413956 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc95t\" (UniqueName: \"kubernetes.io/projected/98ee9213-7874-4f09-8b6f-5b4361f24147-kube-api-access-mc95t\") pod \"nova-cell1-conductor-0\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.414071 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.515660 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:48 crc 
kubenswrapper[5033]: I0226 17:18:48.515753 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.515828 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc95t\" (UniqueName: \"kubernetes.io/projected/98ee9213-7874-4f09-8b6f-5b4361f24147-kube-api-access-mc95t\") pod \"nova-cell1-conductor-0\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.522392 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.522448 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.539487 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc95t\" (UniqueName: \"kubernetes.io/projected/98ee9213-7874-4f09-8b6f-5b4361f24147-kube-api-access-mc95t\") pod \"nova-cell1-conductor-0\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.624115 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.690543 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.822747 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-combined-ca-bundle\") pod \"59148331-e81c-4ca3-aeb1-702fe06efa3a\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.823055 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pcxd\" (UniqueName: \"kubernetes.io/projected/59148331-e81c-4ca3-aeb1-702fe06efa3a-kube-api-access-9pcxd\") pod \"59148331-e81c-4ca3-aeb1-702fe06efa3a\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.823200 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-config-data\") pod \"59148331-e81c-4ca3-aeb1-702fe06efa3a\" (UID: \"59148331-e81c-4ca3-aeb1-702fe06efa3a\") " Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.834164 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59148331-e81c-4ca3-aeb1-702fe06efa3a-kube-api-access-9pcxd" (OuterVolumeSpecName: "kube-api-access-9pcxd") pod "59148331-e81c-4ca3-aeb1-702fe06efa3a" (UID: "59148331-e81c-4ca3-aeb1-702fe06efa3a"). InnerVolumeSpecName "kube-api-access-9pcxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.844661 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.108:8774/\": read tcp 10.217.0.2:51576->10.217.1.108:8774: read: connection reset by peer" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.844992 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.108:8774/\": read tcp 10.217.0.2:51564->10.217.1.108:8774: read: connection reset by peer" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.857977 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.107:8775/\": read tcp 10.217.0.2:32922->10.217.1.107:8775: read: connection reset by peer" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.858050 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.107:8775/\": read tcp 10.217.0.2:32930->10.217.1.107:8775: read: connection reset by peer" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.874587 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-config-data" (OuterVolumeSpecName: "config-data") pod "59148331-e81c-4ca3-aeb1-702fe06efa3a" (UID: "59148331-e81c-4ca3-aeb1-702fe06efa3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.889985 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59148331-e81c-4ca3-aeb1-702fe06efa3a" (UID: "59148331-e81c-4ca3-aeb1-702fe06efa3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.925219 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.925289 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59148331-e81c-4ca3-aeb1-702fe06efa3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.925306 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pcxd\" (UniqueName: \"kubernetes.io/projected/59148331-e81c-4ca3-aeb1-702fe06efa3a-kube-api-access-9pcxd\") on node \"crc\" DevicePath \"\"" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.937890 5033 generic.go:334] "Generic (PLEG): container finished" podID="59148331-e81c-4ca3-aeb1-702fe06efa3a" containerID="53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3" exitCode=0 Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.937942 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"59148331-e81c-4ca3-aeb1-702fe06efa3a","Type":"ContainerDied","Data":"53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3"} Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.937964 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-0" event={"ID":"59148331-e81c-4ca3-aeb1-702fe06efa3a","Type":"ContainerDied","Data":"070a151dec086da4715253a937784889cb1ad35a6578e556e56bb1596b99955b"} Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.937981 5033 scope.go:117] "RemoveContainer" containerID="53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.938064 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.950571 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"56249f2d-f5a1-42a1-b994-1569b5c7b5cb","Type":"ContainerStarted","Data":"6ceca5f5c2efde7bb9f8fcc1dbaf9f2d99b662302cdc7435e62be3f4e45313eb"} Feb 26 17:18:48 crc kubenswrapper[5033]: I0226 17:18:48.982704 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.98268509 podStartE2EDuration="2.98268509s" podCreationTimestamp="2026-02-26 17:18:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:18:48.971338324 +0000 UTC m=+5856.183936974" watchObservedRunningTime="2026-02-26 17:18:48.98268509 +0000 UTC m=+5856.195283740" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.051943 5033 scope.go:117] "RemoveContainer" containerID="53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3" Feb 26 17:18:49 crc kubenswrapper[5033]: E0226 17:18:49.053407 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3\": container with ID starting with 53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3 not found: ID does not exist" 
containerID="53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.053446 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3"} err="failed to get container status \"53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3\": rpc error: code = NotFound desc = could not find container \"53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3\": container with ID starting with 53ccefc2cfff0279e8c6f6128ac360b025e5a5f281acb89a909c250636be0ce3 not found: ID does not exist" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.056921 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.070317 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.081154 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 17:18:49 crc kubenswrapper[5033]: E0226 17:18:49.081616 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59148331-e81c-4ca3-aeb1-702fe06efa3a" containerName="nova-cell0-conductor-conductor" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.081629 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="59148331-e81c-4ca3-aeb1-702fe06efa3a" containerName="nova-cell0-conductor-conductor" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.081861 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="59148331-e81c-4ca3-aeb1-702fe06efa3a" containerName="nova-cell0-conductor-conductor" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.082555 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.085858 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.105636 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.128732 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"229c2ab4-2387-4058-b422-f10ad64325bf\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.128809 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"229c2ab4-2387-4058-b422-f10ad64325bf\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.128986 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxvmb\" (UniqueName: \"kubernetes.io/projected/229c2ab4-2387-4058-b422-f10ad64325bf-kube-api-access-lxvmb\") pod \"nova-cell0-conductor-0\" (UID: \"229c2ab4-2387-4058-b422-f10ad64325bf\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.170790 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.231794 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"229c2ab4-2387-4058-b422-f10ad64325bf\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.231884 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxvmb\" (UniqueName: \"kubernetes.io/projected/229c2ab4-2387-4058-b422-f10ad64325bf-kube-api-access-lxvmb\") pod \"nova-cell0-conductor-0\" (UID: \"229c2ab4-2387-4058-b422-f10ad64325bf\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.232040 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"229c2ab4-2387-4058-b422-f10ad64325bf\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.238097 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"229c2ab4-2387-4058-b422-f10ad64325bf\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.241314 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"229c2ab4-2387-4058-b422-f10ad64325bf\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.283130 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxvmb\" (UniqueName: \"kubernetes.io/projected/229c2ab4-2387-4058-b422-f10ad64325bf-kube-api-access-lxvmb\") pod \"nova-cell0-conductor-0\" (UID: 
\"229c2ab4-2387-4058-b422-f10ad64325bf\") " pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.444563 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.641558 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59148331-e81c-4ca3-aeb1-702fe06efa3a" path="/var/lib/kubelet/pods/59148331-e81c-4ca3-aeb1-702fe06efa3a/volumes" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.649048 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5a830b4-a5a6-4a74-b3d9-4672daadf5d7" path="/var/lib/kubelet/pods/f5a830b4-a5a6-4a74-b3d9-4672daadf5d7/volumes" Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.938622 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.976694 5033 generic.go:334] "Generic (PLEG): container finished" podID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerID="375b3aca0e4c4ac88a1e65e02e5982dc26ba7517d2bf51171f17f833b0306ac2" exitCode=0 Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.976748 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99a5fda7-2e18-405e-82d4-2fdb06ad5c26","Type":"ContainerDied","Data":"375b3aca0e4c4ac88a1e65e02e5982dc26ba7517d2bf51171f17f833b0306ac2"} Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.992143 5033 generic.go:334] "Generic (PLEG): container finished" podID="91e93f69-918c-477c-b6f1-a099de0e8081" containerID="c6eea805f59d3a0375029d5c1708601585eb15dbef44dd30539d72b7435b3dd0" exitCode=0 Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.992229 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"91e93f69-918c-477c-b6f1-a099de0e8081","Type":"ContainerDied","Data":"c6eea805f59d3a0375029d5c1708601585eb15dbef44dd30539d72b7435b3dd0"}
Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.992254 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91e93f69-918c-477c-b6f1-a099de0e8081","Type":"ContainerDied","Data":"29d850f90e9ecb5f6c7190a264c14b80063774cca15b57303e5df7ae4dc7c182"}
Feb 26 17:18:49 crc kubenswrapper[5033]: I0226 17:18:49.992292 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29d850f90e9ecb5f6c7190a264c14b80063774cca15b57303e5df7ae4dc7c182"
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.012029 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"98ee9213-7874-4f09-8b6f-5b4361f24147","Type":"ContainerStarted","Data":"68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc"}
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.012090 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"98ee9213-7874-4f09-8b6f-5b4361f24147","Type":"ContainerStarted","Data":"53d8d45194b746ee54964b2bb8180e54fd080427c2b78e4144e4aac6b9c6078c"}
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.012108 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.036644 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.036623388 podStartE2EDuration="2.036623388s" podCreationTimestamp="2026-02-26 17:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:18:50.027327261 +0000 UTC m=+5857.239925911" watchObservedRunningTime="2026-02-26 17:18:50.036623388 +0000 UTC m=+5857.249222038"
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.115778 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.122765 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.166504 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x95dp\" (UniqueName: \"kubernetes.io/projected/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-kube-api-access-x95dp\") pod \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") "
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.166658 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91e93f69-918c-477c-b6f1-a099de0e8081-logs\") pod \"91e93f69-918c-477c-b6f1-a099de0e8081\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") "
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.166709 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-config-data\") pod \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") "
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.166759 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghcss\" (UniqueName: \"kubernetes.io/projected/91e93f69-918c-477c-b6f1-a099de0e8081-kube-api-access-ghcss\") pod \"91e93f69-918c-477c-b6f1-a099de0e8081\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") "
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.166786 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-logs\") pod \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") "
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.166816 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-combined-ca-bundle\") pod \"91e93f69-918c-477c-b6f1-a099de0e8081\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") "
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.166847 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-config-data\") pod \"91e93f69-918c-477c-b6f1-a099de0e8081\" (UID: \"91e93f69-918c-477c-b6f1-a099de0e8081\") "
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.166871 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-combined-ca-bundle\") pod \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\" (UID: \"99a5fda7-2e18-405e-82d4-2fdb06ad5c26\") "
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.167209 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e93f69-918c-477c-b6f1-a099de0e8081-logs" (OuterVolumeSpecName: "logs") pod "91e93f69-918c-477c-b6f1-a099de0e8081" (UID: "91e93f69-918c-477c-b6f1-a099de0e8081"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.167244 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-logs" (OuterVolumeSpecName: "logs") pod "99a5fda7-2e18-405e-82d4-2fdb06ad5c26" (UID: "99a5fda7-2e18-405e-82d4-2fdb06ad5c26"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.167733 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-logs\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.167754 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91e93f69-918c-477c-b6f1-a099de0e8081-logs\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.170883 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-kube-api-access-x95dp" (OuterVolumeSpecName: "kube-api-access-x95dp") pod "99a5fda7-2e18-405e-82d4-2fdb06ad5c26" (UID: "99a5fda7-2e18-405e-82d4-2fdb06ad5c26"). InnerVolumeSpecName "kube-api-access-x95dp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.179458 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e93f69-918c-477c-b6f1-a099de0e8081-kube-api-access-ghcss" (OuterVolumeSpecName: "kube-api-access-ghcss") pod "91e93f69-918c-477c-b6f1-a099de0e8081" (UID: "91e93f69-918c-477c-b6f1-a099de0e8081"). InnerVolumeSpecName "kube-api-access-ghcss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.269453 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghcss\" (UniqueName: \"kubernetes.io/projected/91e93f69-918c-477c-b6f1-a099de0e8081-kube-api-access-ghcss\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.269487 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x95dp\" (UniqueName: \"kubernetes.io/projected/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-kube-api-access-x95dp\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.271210 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99a5fda7-2e18-405e-82d4-2fdb06ad5c26" (UID: "99a5fda7-2e18-405e-82d4-2fdb06ad5c26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.275671 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-config-data" (OuterVolumeSpecName: "config-data") pod "91e93f69-918c-477c-b6f1-a099de0e8081" (UID: "91e93f69-918c-477c-b6f1-a099de0e8081"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:18:50 crc kubenswrapper[5033]: E0226 17:18:50.277046 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eeb44306b2c71cac2267b32b7323164273db5287568eb909a17fb339f26ec8cc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 17:18:50 crc kubenswrapper[5033]: E0226 17:18:50.278449 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eeb44306b2c71cac2267b32b7323164273db5287568eb909a17fb339f26ec8cc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.279345 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-config-data" (OuterVolumeSpecName: "config-data") pod "99a5fda7-2e18-405e-82d4-2fdb06ad5c26" (UID: "99a5fda7-2e18-405e-82d4-2fdb06ad5c26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:18:50 crc kubenswrapper[5033]: E0226 17:18:50.279960 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eeb44306b2c71cac2267b32b7323164273db5287568eb909a17fb339f26ec8cc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 26 17:18:50 crc kubenswrapper[5033]: E0226 17:18:50.280034 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="96031ac0-f3a0-4e69-9cc3-12f595f19526" containerName="nova-scheduler-scheduler"
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.281098 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91e93f69-918c-477c-b6f1-a099de0e8081" (UID: "91e93f69-918c-477c-b6f1-a099de0e8081"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.370763 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.370979 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.370990 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91e93f69-918c-477c-b6f1-a099de0e8081-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:50 crc kubenswrapper[5033]: I0226 17:18:50.370998 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a5fda7-2e18-405e-82d4-2fdb06ad5c26-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.019418 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"229c2ab4-2387-4058-b422-f10ad64325bf","Type":"ContainerStarted","Data":"27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3"}
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.019489 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"229c2ab4-2387-4058-b422-f10ad64325bf","Type":"ContainerStarted","Data":"987c509b3c299ff2a56c2c6bb40ecdbdca2dc1e510ebbfb138a2c8d7f6c0a382"}
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.019544 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.021664 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.025339 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99a5fda7-2e18-405e-82d4-2fdb06ad5c26","Type":"ContainerDied","Data":"518a864430f5de87ce137ca6931a4accece048e594ca04281b119a4f15ba4b92"}
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.025442 5033 scope.go:117] "RemoveContainer" containerID="375b3aca0e4c4ac88a1e65e02e5982dc26ba7517d2bf51171f17f833b0306ac2"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.025454 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.038783 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.038750431 podStartE2EDuration="2.038750431s" podCreationTimestamp="2026-02-26 17:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:18:51.038568257 +0000 UTC m=+5858.251166907" watchObservedRunningTime="2026-02-26 17:18:51.038750431 +0000 UTC m=+5858.251349081"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.054416 5033 scope.go:117] "RemoveContainer" containerID="3b53fcf968ac6cfc8e842e323e615d110fc8b8866d6d50296c6e506c0062a7d2"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.071322 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.091205 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.107158 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 26 17:18:51 crc kubenswrapper[5033]: E0226 17:18:51.107591 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-metadata"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.107611 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-metadata"
Feb 26 17:18:51 crc kubenswrapper[5033]: E0226 17:18:51.107627 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-log"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.107637 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-log"
Feb 26 17:18:51 crc kubenswrapper[5033]: E0226 17:18:51.107656 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-log"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.107665 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-log"
Feb 26 17:18:51 crc kubenswrapper[5033]: E0226 17:18:51.107683 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-api"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.107689 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-api"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.107871 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-api"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.107888 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-metadata"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.107904 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" containerName="nova-api-log"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.107917 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" containerName="nova-metadata-log"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.109029 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.115073 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.117570 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.143323 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.151848 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.160317 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.161799 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.167082 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.186177 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b24d004-b104-4214-a60d-dfc3a02c0110-logs\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.186312 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.186339 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-config-data\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.186362 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfgx4\" (UniqueName: \"kubernetes.io/projected/8b24d004-b104-4214-a60d-dfc3a02c0110-kube-api-access-xfgx4\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.190293 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.287870 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-config-data\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.287933 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78wzc\" (UniqueName: \"kubernetes.io/projected/dbb98341-a8eb-4bed-94ed-c70daa525734-kube-api-access-78wzc\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.287960 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbb98341-a8eb-4bed-94ed-c70daa525734-logs\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.287986 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.288002 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.288018 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-config-data\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.288036 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfgx4\" (UniqueName: \"kubernetes.io/projected/8b24d004-b104-4214-a60d-dfc3a02c0110-kube-api-access-xfgx4\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.288092 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b24d004-b104-4214-a60d-dfc3a02c0110-logs\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.288557 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b24d004-b104-4214-a60d-dfc3a02c0110-logs\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.295017 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.295090 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-config-data\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.303503 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfgx4\" (UniqueName: \"kubernetes.io/projected/8b24d004-b104-4214-a60d-dfc3a02c0110-kube-api-access-xfgx4\") pod \"nova-api-0\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.389147 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-config-data\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.389206 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78wzc\" (UniqueName: \"kubernetes.io/projected/dbb98341-a8eb-4bed-94ed-c70daa525734-kube-api-access-78wzc\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.389233 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbb98341-a8eb-4bed-94ed-c70daa525734-logs\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.389268 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.389824 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbb98341-a8eb-4bed-94ed-c70daa525734-logs\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.393850 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-config-data\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.397820 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.409116 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78wzc\" (UniqueName: \"kubernetes.io/projected/dbb98341-a8eb-4bed-94ed-c70daa525734-kube-api-access-78wzc\") pod \"nova-metadata-0\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.431292 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.483037 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.644544 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e93f69-918c-477c-b6f1-a099de0e8081" path="/var/lib/kubelet/pods/91e93f69-918c-477c-b6f1-a099de0e8081/volumes"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.645602 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a5fda7-2e18-405e-82d4-2fdb06ad5c26" path="/var/lib/kubelet/pods/99a5fda7-2e18-405e-82d4-2fdb06ad5c26/volumes"
Feb 26 17:18:51 crc kubenswrapper[5033]: I0226 17:18:51.969941 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 17:18:52 crc kubenswrapper[5033]: I0226 17:18:52.043623 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b24d004-b104-4214-a60d-dfc3a02c0110","Type":"ContainerStarted","Data":"0f6732edcd752b96edec5a6c3937b80948f621316ae9626386b4670628986b4f"}
Feb 26 17:18:52 crc kubenswrapper[5033]: I0226 17:18:52.060623 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 17:18:52 crc kubenswrapper[5033]: W0226 17:18:52.063563 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbb98341_a8eb_4bed_94ed_c70daa525734.slice/crio-3c70690149d1608f112373bfa8efdb0d39482a2ce8a0ea5d097ce22b05dc385e WatchSource:0}: Error finding container 3c70690149d1608f112373bfa8efdb0d39482a2ce8a0ea5d097ce22b05dc385e: Status 404 returned error can't find the container with id 3c70690149d1608f112373bfa8efdb0d39482a2ce8a0ea5d097ce22b05dc385e
Feb 26 17:18:52 crc kubenswrapper[5033]: I0226 17:18:52.299753 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 17:18:53 crc kubenswrapper[5033]: I0226 17:18:53.064980 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbb98341-a8eb-4bed-94ed-c70daa525734","Type":"ContainerStarted","Data":"44eda056281b096957a0d89db6856d952952d71fddfa9946b5125358e77a3722"}
Feb 26 17:18:53 crc kubenswrapper[5033]: I0226 17:18:53.065242 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbb98341-a8eb-4bed-94ed-c70daa525734","Type":"ContainerStarted","Data":"c1ebd84ed8a59c6a3d9052f764e3fd0bfea732cc448fc55d840de3366cf78e85"}
Feb 26 17:18:53 crc kubenswrapper[5033]: I0226 17:18:53.065252 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbb98341-a8eb-4bed-94ed-c70daa525734","Type":"ContainerStarted","Data":"3c70690149d1608f112373bfa8efdb0d39482a2ce8a0ea5d097ce22b05dc385e"}
Feb 26 17:18:53 crc kubenswrapper[5033]: I0226 17:18:53.067759 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b24d004-b104-4214-a60d-dfc3a02c0110","Type":"ContainerStarted","Data":"cd5bb9ba709f7c053bfba60d1708532c94c2591e5dcb65ee3855c293b02abb70"}
Feb 26 17:18:53 crc kubenswrapper[5033]: I0226 17:18:53.067802 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b24d004-b104-4214-a60d-dfc3a02c0110","Type":"ContainerStarted","Data":"c2479315d0e5d67819319cc183d9efdf68960a1b9cf36223c878fd2494275e5a"}
Feb 26 17:18:53 crc kubenswrapper[5033]: I0226 17:18:53.090212 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.090190469 podStartE2EDuration="2.090190469s" podCreationTimestamp="2026-02-26 17:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:18:53.083769041 +0000 UTC m=+5860.296367691" watchObservedRunningTime="2026-02-26 17:18:53.090190469 +0000 UTC m=+5860.302789139"
Feb 26 17:18:53 crc kubenswrapper[5033]: I0226 17:18:53.107969 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.107947422 podStartE2EDuration="2.107947422s" podCreationTimestamp="2026-02-26 17:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:18:53.100451729 +0000 UTC m=+5860.313050389" watchObservedRunningTime="2026-02-26 17:18:53.107947422 +0000 UTC m=+5860.320546072"
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.078480 5033 generic.go:334] "Generic (PLEG): container finished" podID="96031ac0-f3a0-4e69-9cc3-12f595f19526" containerID="eeb44306b2c71cac2267b32b7323164273db5287568eb909a17fb339f26ec8cc" exitCode=0
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.078580 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"96031ac0-f3a0-4e69-9cc3-12f595f19526","Type":"ContainerDied","Data":"eeb44306b2c71cac2267b32b7323164273db5287568eb909a17fb339f26ec8cc"}
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.356914 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.451964 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-combined-ca-bundle\") pod \"96031ac0-f3a0-4e69-9cc3-12f595f19526\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") "
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.452026 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65bxr\" (UniqueName: \"kubernetes.io/projected/96031ac0-f3a0-4e69-9cc3-12f595f19526-kube-api-access-65bxr\") pod \"96031ac0-f3a0-4e69-9cc3-12f595f19526\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") "
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.452091 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-config-data\") pod \"96031ac0-f3a0-4e69-9cc3-12f595f19526\" (UID: \"96031ac0-f3a0-4e69-9cc3-12f595f19526\") "
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.471504 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96031ac0-f3a0-4e69-9cc3-12f595f19526-kube-api-access-65bxr" (OuterVolumeSpecName: "kube-api-access-65bxr") pod "96031ac0-f3a0-4e69-9cc3-12f595f19526" (UID: "96031ac0-f3a0-4e69-9cc3-12f595f19526"). InnerVolumeSpecName "kube-api-access-65bxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.515108 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96031ac0-f3a0-4e69-9cc3-12f595f19526" (UID: "96031ac0-f3a0-4e69-9cc3-12f595f19526"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.518534 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-config-data" (OuterVolumeSpecName: "config-data") pod "96031ac0-f3a0-4e69-9cc3-12f595f19526" (UID: "96031ac0-f3a0-4e69-9cc3-12f595f19526"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.556533 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.556586 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65bxr\" (UniqueName: \"kubernetes.io/projected/96031ac0-f3a0-4e69-9cc3-12f595f19526-kube-api-access-65bxr\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:54 crc kubenswrapper[5033]: I0226 17:18:54.556600 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96031ac0-f3a0-4e69-9cc3-12f595f19526-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.089586 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"96031ac0-f3a0-4e69-9cc3-12f595f19526","Type":"ContainerDied","Data":"09fe42a7bcae5b3437b4200972fa81a105854340ff5b0e866b65511362bc2fd5"}
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.090152 5033 scope.go:117] "RemoveContainer" containerID="eeb44306b2c71cac2267b32b7323164273db5287568eb909a17fb339f26ec8cc"
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.090320 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.141449 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.156394 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.188425 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 17:18:55 crc kubenswrapper[5033]: E0226 17:18:55.189592 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96031ac0-f3a0-4e69-9cc3-12f595f19526" containerName="nova-scheduler-scheduler"
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.189623 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="96031ac0-f3a0-4e69-9cc3-12f595f19526" containerName="nova-scheduler-scheduler"
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.190155 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="96031ac0-f3a0-4e69-9cc3-12f595f19526" containerName="nova-scheduler-scheduler"
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.191440 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.194318 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.201645 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.292106 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhz5v\" (UniqueName: \"kubernetes.io/projected/6dbd74aa-2791-4724-946a-21b175fdedd6-kube-api-access-dhz5v\") pod \"nova-scheduler-0\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") " pod="openstack/nova-scheduler-0"
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.292528 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-config-data\") pod \"nova-scheduler-0\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") " pod="openstack/nova-scheduler-0"
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.292699 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") " pod="openstack/nova-scheduler-0"
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.394665 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhz5v\" (UniqueName: \"kubernetes.io/projected/6dbd74aa-2791-4724-946a-21b175fdedd6-kube-api-access-dhz5v\") pod \"nova-scheduler-0\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") " pod="openstack/nova-scheduler-0"
Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.394771 5033
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-config-data\") pod \"nova-scheduler-0\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") " pod="openstack/nova-scheduler-0" Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.394812 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") " pod="openstack/nova-scheduler-0" Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.402856 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") " pod="openstack/nova-scheduler-0" Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.403040 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-config-data\") pod \"nova-scheduler-0\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") " pod="openstack/nova-scheduler-0" Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.412359 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhz5v\" (UniqueName: \"kubernetes.io/projected/6dbd74aa-2791-4724-946a-21b175fdedd6-kube-api-access-dhz5v\") pod \"nova-scheduler-0\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") " pod="openstack/nova-scheduler-0" Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.511572 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 17:18:55 crc kubenswrapper[5033]: I0226 17:18:55.641452 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96031ac0-f3a0-4e69-9cc3-12f595f19526" path="/var/lib/kubelet/pods/96031ac0-f3a0-4e69-9cc3-12f595f19526/volumes" Feb 26 17:18:55 crc kubenswrapper[5033]: W0226 17:18:55.993528 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dbd74aa_2791_4724_946a_21b175fdedd6.slice/crio-741aedc47f559a1d5ef80384c91df0e3e604c64417bf1dd1c18a38c0a9d328d8 WatchSource:0}: Error finding container 741aedc47f559a1d5ef80384c91df0e3e604c64417bf1dd1c18a38c0a9d328d8: Status 404 returned error can't find the container with id 741aedc47f559a1d5ef80384c91df0e3e604c64417bf1dd1c18a38c0a9d328d8 Feb 26 17:18:56 crc kubenswrapper[5033]: I0226 17:18:56.002737 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 17:18:56 crc kubenswrapper[5033]: I0226 17:18:56.103536 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6dbd74aa-2791-4724-946a-21b175fdedd6","Type":"ContainerStarted","Data":"741aedc47f559a1d5ef80384c91df0e3e604c64417bf1dd1c18a38c0a9d328d8"} Feb 26 17:18:56 crc kubenswrapper[5033]: I0226 17:18:56.483745 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 17:18:56 crc kubenswrapper[5033]: I0226 17:18:56.484089 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 17:18:57 crc kubenswrapper[5033]: I0226 17:18:57.116190 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6dbd74aa-2791-4724-946a-21b175fdedd6","Type":"ContainerStarted","Data":"28cb606fda414c24a338470fdf12da2bbb4e68d9c9794d8b44e088c1b467cb96"} Feb 26 17:18:57 crc kubenswrapper[5033]: I0226 17:18:57.138560 
5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.138540992 podStartE2EDuration="2.138540992s" podCreationTimestamp="2026-02-26 17:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:18:57.128153249 +0000 UTC m=+5864.340751899" watchObservedRunningTime="2026-02-26 17:18:57.138540992 +0000 UTC m=+5864.351139642" Feb 26 17:18:57 crc kubenswrapper[5033]: I0226 17:18:57.299591 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:57 crc kubenswrapper[5033]: I0226 17:18:57.313117 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:58 crc kubenswrapper[5033]: I0226 17:18:58.140121 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 26 17:18:58 crc kubenswrapper[5033]: I0226 17:18:58.662647 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 26 17:18:59 crc kubenswrapper[5033]: I0226 17:18:59.472896 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 26 17:19:00 crc kubenswrapper[5033]: I0226 17:19:00.511811 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 17:19:00 crc kubenswrapper[5033]: I0226 17:19:00.628673 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:19:00 crc kubenswrapper[5033]: E0226 17:19:00.628982 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:19:01 crc kubenswrapper[5033]: I0226 17:19:01.432534 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 17:19:01 crc kubenswrapper[5033]: I0226 17:19:01.432629 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 17:19:01 crc kubenswrapper[5033]: I0226 17:19:01.484458 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 17:19:01 crc kubenswrapper[5033]: I0226 17:19:01.484519 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 17:19:02 crc kubenswrapper[5033]: I0226 17:19:02.514471 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.119:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 17:19:02 crc kubenswrapper[5033]: I0226 17:19:02.514470 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.119:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 17:19:02 crc kubenswrapper[5033]: I0226 17:19:02.596545 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.120:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 
17:19:02 crc kubenswrapper[5033]: I0226 17:19:02.596563 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.120:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.342672 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.345189 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.347771 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.355240 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.450064 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbd2m\" (UniqueName: \"kubernetes.io/projected/97630434-3923-458d-83df-4ad754063580-kube-api-access-mbd2m\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.450508 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.450544 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-scripts\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.450594 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97630434-3923-458d-83df-4ad754063580-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.450625 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.450664 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.551764 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97630434-3923-458d-83df-4ad754063580-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.551819 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.551861 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.551908 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbd2m\" (UniqueName: \"kubernetes.io/projected/97630434-3923-458d-83df-4ad754063580-kube-api-access-mbd2m\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.551958 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.551964 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97630434-3923-458d-83df-4ad754063580-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.551985 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-scripts\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 
17:19:03.558787 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.561398 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-scripts\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.568189 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.568292 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.568958 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbd2m\" (UniqueName: \"kubernetes.io/projected/97630434-3923-458d-83df-4ad754063580-kube-api-access-mbd2m\") pod \"cinder-scheduler-0\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " pod="openstack/cinder-scheduler-0" Feb 26 17:19:03 crc kubenswrapper[5033]: I0226 17:19:03.672994 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 17:19:04 crc kubenswrapper[5033]: I0226 17:19:04.088864 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 17:19:04 crc kubenswrapper[5033]: I0226 17:19:04.187347 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97630434-3923-458d-83df-4ad754063580","Type":"ContainerStarted","Data":"5925e2dd68346ecbf25b7acf9427434dd5bc749115b7c8910d88e02230c1f708"} Feb 26 17:19:04 crc kubenswrapper[5033]: I0226 17:19:04.705829 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 17:19:04 crc kubenswrapper[5033]: I0226 17:19:04.706422 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c7e95af8-71df-49bd-8639-de3007ccc459" containerName="cinder-api-log" containerID="cri-o://dd0ff1c953a23b10a8da8fa12693876185e8f46928148f2114271d977c075db0" gracePeriod=30 Feb 26 17:19:04 crc kubenswrapper[5033]: I0226 17:19:04.707086 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c7e95af8-71df-49bd-8639-de3007ccc459" containerName="cinder-api" containerID="cri-o://1f2d786f168748fdbe0f8ee2b4862967a699848d1e2e43790573b71ab7a3cefd" gracePeriod=30 Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.144230 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.146240 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.148939 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.166460 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.202911 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.202951 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.202977 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.202996 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 
17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203017 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-run\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203062 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b9c460f2-2f7e-496f-b148-de398d33f18e-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203089 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203108 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-sys\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203148 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203170 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljzdb\" (UniqueName: \"kubernetes.io/projected/b9c460f2-2f7e-496f-b148-de398d33f18e-kube-api-access-ljzdb\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203186 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203199 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203228 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203249 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203284 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-dev\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.203318 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.211238 5033 generic.go:334] "Generic (PLEG): container finished" podID="c7e95af8-71df-49bd-8639-de3007ccc459" containerID="dd0ff1c953a23b10a8da8fa12693876185e8f46928148f2114271d977c075db0" exitCode=143 Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.211320 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c7e95af8-71df-49bd-8639-de3007ccc459","Type":"ContainerDied","Data":"dd0ff1c953a23b10a8da8fa12693876185e8f46928148f2114271d977c075db0"} Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.214190 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97630434-3923-458d-83df-4ad754063580","Type":"ContainerStarted","Data":"2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc"} Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305212 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: 
I0226 17:19:05.305273 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljzdb\" (UniqueName: \"kubernetes.io/projected/b9c460f2-2f7e-496f-b148-de398d33f18e-kube-api-access-ljzdb\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305302 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305332 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305364 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305386 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305402 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" 
(UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-dev\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305427 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305455 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305473 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305493 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305498 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-lib-modules\") pod 
\"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305512 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305549 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305609 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-run\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305638 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305740 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b9c460f2-2f7e-496f-b148-de398d33f18e-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305827 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305858 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-sys\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.305999 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-sys\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.306038 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.306105 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-dev\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.306132 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-var-lib-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.306162 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-run\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.306653 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.306801 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b9c460f2-2f7e-496f-b148-de398d33f18e-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.309680 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.310879 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 
17:19:05.312248 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b9c460f2-2f7e-496f-b148-de398d33f18e-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.312908 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.314780 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c460f2-2f7e-496f-b148-de398d33f18e-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.326566 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljzdb\" (UniqueName: \"kubernetes.io/projected/b9c460f2-2f7e-496f-b148-de398d33f18e-kube-api-access-ljzdb\") pod \"cinder-volume-volume1-0\" (UID: \"b9c460f2-2f7e-496f-b148-de398d33f18e\") " pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.469012 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.513756 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.540578 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.934545 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.936421 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.942307 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 26 17:19:05 crc kubenswrapper[5033]: I0226 17:19:05.955707 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023051 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-etc-nvme\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023122 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-scripts\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023152 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-sys\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023195 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-config-data-custom\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023218 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023238 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023344 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023380 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/df14a279-eaa8-4702-976b-ab92229cd218-ceph\") pod \"cinder-backup-0\" (UID: 
\"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023414 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwhqx\" (UniqueName: \"kubernetes.io/projected/df14a279-eaa8-4702-976b-ab92229cd218-kube-api-access-wwhqx\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023437 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023466 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023488 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-run\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023517 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 
17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023539 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-config-data\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023552 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-dev\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.023606 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-lib-modules\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.106939 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 26 17:19:06 crc kubenswrapper[5033]: W0226 17:19:06.110374 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9c460f2_2f7e_496f_b148_de398d33f18e.slice/crio-4c2b8f631428089a25a215368a86716640be842bf28a3710b6a941db49e8dff0 WatchSource:0}: Error finding container 4c2b8f631428089a25a215368a86716640be842bf28a3710b6a941db49e8dff0: Status 404 returned error can't find the container with id 4c2b8f631428089a25a215368a86716640be842bf28a3710b6a941db49e8dff0 Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.124927 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-config-data-custom\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.124984 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125007 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125067 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125113 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/df14a279-eaa8-4702-976b-ab92229cd218-ceph\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125161 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwhqx\" (UniqueName: \"kubernetes.io/projected/df14a279-eaa8-4702-976b-ab92229cd218-kube-api-access-wwhqx\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 
17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125192 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125229 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125274 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-run\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125305 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125335 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-config-data\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125355 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-dev\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125390 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-lib-modules\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125410 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-etc-nvme\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125432 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-scripts\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125458 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-sys\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.125548 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-sys\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.126078 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.126150 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.126173 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.126430 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.126478 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.126518 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-run\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " 
pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.126547 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-lib-modules\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.126538 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-etc-nvme\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.126610 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df14a279-eaa8-4702-976b-ab92229cd218-dev\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.130746 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-scripts\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.130916 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/df14a279-eaa8-4702-976b-ab92229cd218-ceph\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.132089 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-config-data-custom\") pod 
\"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.132487 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-config-data\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.139122 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df14a279-eaa8-4702-976b-ab92229cd218-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.145882 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwhqx\" (UniqueName: \"kubernetes.io/projected/df14a279-eaa8-4702-976b-ab92229cd218-kube-api-access-wwhqx\") pod \"cinder-backup-0\" (UID: \"df14a279-eaa8-4702-976b-ab92229cd218\") " pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.227902 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97630434-3923-458d-83df-4ad754063580","Type":"ContainerStarted","Data":"e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635"} Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.230490 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"b9c460f2-2f7e-496f-b148-de398d33f18e","Type":"ContainerStarted","Data":"4c2b8f631428089a25a215368a86716640be842bf28a3710b6a941db49e8dff0"} Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.251391 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.251376415 
podStartE2EDuration="3.251376415s" podCreationTimestamp="2026-02-26 17:19:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:19:06.247951001 +0000 UTC m=+5873.460549651" watchObservedRunningTime="2026-02-26 17:19:06.251376415 +0000 UTC m=+5873.463975065" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.260625 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.265238 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 17:19:06 crc kubenswrapper[5033]: I0226 17:19:06.881628 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 26 17:19:07 crc kubenswrapper[5033]: I0226 17:19:07.247125 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"df14a279-eaa8-4702-976b-ab92229cd218","Type":"ContainerStarted","Data":"b36fb0d72fc2bcaa15e6c1d9f694a7be03ba72c1ea87997b70e2bf2a99b25411"} Feb 26 17:19:07 crc kubenswrapper[5033]: I0226 17:19:07.250399 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"b9c460f2-2f7e-496f-b148-de398d33f18e","Type":"ContainerStarted","Data":"5ae09c5f1e0fd986cf64fe9f884f22d491832f5ea530737c51d8f7bbf6b1f931"} Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.262372 5033 generic.go:334] "Generic (PLEG): container finished" podID="c7e95af8-71df-49bd-8639-de3007ccc459" containerID="1f2d786f168748fdbe0f8ee2b4862967a699848d1e2e43790573b71ab7a3cefd" exitCode=0 Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.262440 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"c7e95af8-71df-49bd-8639-de3007ccc459","Type":"ContainerDied","Data":"1f2d786f168748fdbe0f8ee2b4862967a699848d1e2e43790573b71ab7a3cefd"} Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.262949 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c7e95af8-71df-49bd-8639-de3007ccc459","Type":"ContainerDied","Data":"16f0d35c56b1ce43dc20f81ce2503582eedae6096a6825bbdd0537b921e3f87b"} Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.262972 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16f0d35c56b1ce43dc20f81ce2503582eedae6096a6825bbdd0537b921e3f87b" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.267546 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"b9c460f2-2f7e-496f-b148-de398d33f18e","Type":"ContainerStarted","Data":"baf807fcaf8cd41854db93d9ea8a96cf0b2a7492a97f30825e0d92481ffd87c2"} Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.305472 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.488271626 podStartE2EDuration="3.305457017s" podCreationTimestamp="2026-02-26 17:19:05 +0000 UTC" firstStartedPulling="2026-02-26 17:19:06.111809009 +0000 UTC m=+5873.324407659" lastFinishedPulling="2026-02-26 17:19:06.9289944 +0000 UTC m=+5874.141593050" observedRunningTime="2026-02-26 17:19:08.304080464 +0000 UTC m=+5875.516679114" watchObservedRunningTime="2026-02-26 17:19:08.305457017 +0000 UTC m=+5875.518055667" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.325272 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.369716 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e95af8-71df-49bd-8639-de3007ccc459-logs\") pod \"c7e95af8-71df-49bd-8639-de3007ccc459\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.369756 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data-custom\") pod \"c7e95af8-71df-49bd-8639-de3007ccc459\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.369775 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-scripts\") pod \"c7e95af8-71df-49bd-8639-de3007ccc459\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.369802 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data\") pod \"c7e95af8-71df-49bd-8639-de3007ccc459\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.369882 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tnrz\" (UniqueName: \"kubernetes.io/projected/c7e95af8-71df-49bd-8639-de3007ccc459-kube-api-access-4tnrz\") pod \"c7e95af8-71df-49bd-8639-de3007ccc459\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.369955 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c7e95af8-71df-49bd-8639-de3007ccc459-etc-machine-id\") pod \"c7e95af8-71df-49bd-8639-de3007ccc459\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.370012 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-combined-ca-bundle\") pod \"c7e95af8-71df-49bd-8639-de3007ccc459\" (UID: \"c7e95af8-71df-49bd-8639-de3007ccc459\") " Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.370509 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7e95af8-71df-49bd-8639-de3007ccc459-logs" (OuterVolumeSpecName: "logs") pod "c7e95af8-71df-49bd-8639-de3007ccc459" (UID: "c7e95af8-71df-49bd-8639-de3007ccc459"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.370926 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7e95af8-71df-49bd-8639-de3007ccc459-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c7e95af8-71df-49bd-8639-de3007ccc459" (UID: "c7e95af8-71df-49bd-8639-de3007ccc459"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.375127 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c7e95af8-71df-49bd-8639-de3007ccc459" (UID: "c7e95af8-71df-49bd-8639-de3007ccc459"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.380877 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-scripts" (OuterVolumeSpecName: "scripts") pod "c7e95af8-71df-49bd-8639-de3007ccc459" (UID: "c7e95af8-71df-49bd-8639-de3007ccc459"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.389935 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e95af8-71df-49bd-8639-de3007ccc459-kube-api-access-4tnrz" (OuterVolumeSpecName: "kube-api-access-4tnrz") pod "c7e95af8-71df-49bd-8639-de3007ccc459" (UID: "c7e95af8-71df-49bd-8639-de3007ccc459"). InnerVolumeSpecName "kube-api-access-4tnrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.406002 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7e95af8-71df-49bd-8639-de3007ccc459" (UID: "c7e95af8-71df-49bd-8639-de3007ccc459"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.461316 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data" (OuterVolumeSpecName: "config-data") pod "c7e95af8-71df-49bd-8639-de3007ccc459" (UID: "c7e95af8-71df-49bd-8639-de3007ccc459"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.472408 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tnrz\" (UniqueName: \"kubernetes.io/projected/c7e95af8-71df-49bd-8639-de3007ccc459-kube-api-access-4tnrz\") on node \"crc\" DevicePath \"\"" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.472462 5033 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7e95af8-71df-49bd-8639-de3007ccc459-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.472480 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.472498 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e95af8-71df-49bd-8639-de3007ccc459-logs\") on node \"crc\" DevicePath \"\"" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.472519 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.472533 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.472548 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e95af8-71df-49bd-8639-de3007ccc459-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:19:08 crc kubenswrapper[5033]: I0226 17:19:08.673968 5033 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.280873 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"df14a279-eaa8-4702-976b-ab92229cd218","Type":"ContainerStarted","Data":"c46c77440353d7fa971dbba09934d99c3424b50dddc43fab24c6895566dc08ae"} Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.281544 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"df14a279-eaa8-4702-976b-ab92229cd218","Type":"ContainerStarted","Data":"5377e1d213184eab50a57b9adbcedd60ba0ccb9318c4644fd5874e0f7cc52d06"} Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.280892 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.319662 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.570418691 podStartE2EDuration="4.319632464s" podCreationTimestamp="2026-02-26 17:19:05 +0000 UTC" firstStartedPulling="2026-02-26 17:19:06.878671632 +0000 UTC m=+5874.091270282" lastFinishedPulling="2026-02-26 17:19:08.627885405 +0000 UTC m=+5875.840484055" observedRunningTime="2026-02-26 17:19:09.312201473 +0000 UTC m=+5876.524800133" watchObservedRunningTime="2026-02-26 17:19:09.319632464 +0000 UTC m=+5876.532231154" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.344576 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.357393 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.365768 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 26 17:19:09 crc kubenswrapper[5033]: E0226 17:19:09.366486 5033 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e95af8-71df-49bd-8639-de3007ccc459" containerName="cinder-api" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.366516 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e95af8-71df-49bd-8639-de3007ccc459" containerName="cinder-api" Feb 26 17:19:09 crc kubenswrapper[5033]: E0226 17:19:09.366616 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e95af8-71df-49bd-8639-de3007ccc459" containerName="cinder-api-log" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.366637 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e95af8-71df-49bd-8639-de3007ccc459" containerName="cinder-api-log" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.366960 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e95af8-71df-49bd-8639-de3007ccc459" containerName="cinder-api" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.367011 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e95af8-71df-49bd-8639-de3007ccc459" containerName="cinder-api-log" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.368710 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.371316 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.405422 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.492214 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.492313 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-logs\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.492341 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.492506 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.492721 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2bcj\" (UniqueName: \"kubernetes.io/projected/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-kube-api-access-c2bcj\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.492834 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-scripts\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.492947 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-config-data\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.595326 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.595428 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.595856 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-logs\") pod \"cinder-api-0\" (UID: 
\"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.596611 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-logs\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.596719 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.597824 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.597924 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2bcj\" (UniqueName: \"kubernetes.io/projected/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-kube-api-access-c2bcj\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.597998 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-scripts\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.598055 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-config-data\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.604950 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.605226 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-scripts\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.605731 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.615531 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-config-data\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.626082 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2bcj\" (UniqueName: \"kubernetes.io/projected/ed9da2b7-a018-43d7-a0a1-c6166adc3c3f-kube-api-access-c2bcj\") pod \"cinder-api-0\" (UID: \"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f\") " pod="openstack/cinder-api-0" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.675819 5033 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e95af8-71df-49bd-8639-de3007ccc459" path="/var/lib/kubelet/pods/c7e95af8-71df-49bd-8639-de3007ccc459/volumes" Feb 26 17:19:09 crc kubenswrapper[5033]: I0226 17:19:09.690365 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 17:19:10 crc kubenswrapper[5033]: I0226 17:19:10.182741 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 17:19:10 crc kubenswrapper[5033]: I0226 17:19:10.295088 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f","Type":"ContainerStarted","Data":"8be47a17c23dc9335897726638c8577f978b7cfe82cf6bc240333ccb89e0cc64"} Feb 26 17:19:10 crc kubenswrapper[5033]: I0226 17:19:10.471941 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:11 crc kubenswrapper[5033]: I0226 17:19:11.261431 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 26 17:19:11 crc kubenswrapper[5033]: I0226 17:19:11.307618 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f","Type":"ContainerStarted","Data":"c086832e58259cf8681aa9fd05482ee29490eda73c7a6d416193a33fa175d9fd"} Feb 26 17:19:11 crc kubenswrapper[5033]: I0226 17:19:11.437124 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 17:19:11 crc kubenswrapper[5033]: I0226 17:19:11.437701 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 17:19:11 crc kubenswrapper[5033]: I0226 17:19:11.441806 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 17:19:11 crc kubenswrapper[5033]: I0226 17:19:11.443542 
5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 17:19:11 crc kubenswrapper[5033]: I0226 17:19:11.488906 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 17:19:11 crc kubenswrapper[5033]: I0226 17:19:11.489025 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 17:19:11 crc kubenswrapper[5033]: I0226 17:19:11.491213 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 17:19:11 crc kubenswrapper[5033]: I0226 17:19:11.491643 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 17:19:11 crc kubenswrapper[5033]: I0226 17:19:11.628206 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:19:11 crc kubenswrapper[5033]: E0226 17:19:11.628656 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:19:12 crc kubenswrapper[5033]: I0226 17:19:12.318372 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ed9da2b7-a018-43d7-a0a1-c6166adc3c3f","Type":"ContainerStarted","Data":"e940d198a43cb8636ce08b2aa9372d2271014b0433eaabf25f30cf2c8e297afb"} Feb 26 17:19:12 crc kubenswrapper[5033]: I0226 17:19:12.318789 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 17:19:12 crc kubenswrapper[5033]: I0226 17:19:12.319352 5033 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 26 17:19:12 crc kubenswrapper[5033]: I0226 17:19:12.323815 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 17:19:12 crc kubenswrapper[5033]: I0226 17:19:12.362894 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.362863042 podStartE2EDuration="3.362863042s" podCreationTimestamp="2026-02-26 17:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:19:12.34349898 +0000 UTC m=+5879.556097710" watchObservedRunningTime="2026-02-26 17:19:12.362863042 +0000 UTC m=+5879.575461732" Feb 26 17:19:13 crc kubenswrapper[5033]: I0226 17:19:13.926851 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 26 17:19:14 crc kubenswrapper[5033]: I0226 17:19:14.000651 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 17:19:14 crc kubenswrapper[5033]: I0226 17:19:14.336606 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="97630434-3923-458d-83df-4ad754063580" containerName="cinder-scheduler" containerID="cri-o://2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc" gracePeriod=30 Feb 26 17:19:14 crc kubenswrapper[5033]: I0226 17:19:14.336710 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="97630434-3923-458d-83df-4ad754063580" containerName="probe" containerID="cri-o://e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635" gracePeriod=30 Feb 26 17:19:15 crc kubenswrapper[5033]: I0226 17:19:15.354386 5033 generic.go:334] "Generic (PLEG): container finished" podID="97630434-3923-458d-83df-4ad754063580" 
containerID="e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635" exitCode=0 Feb 26 17:19:15 crc kubenswrapper[5033]: I0226 17:19:15.354436 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97630434-3923-458d-83df-4ad754063580","Type":"ContainerDied","Data":"e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635"} Feb 26 17:19:15 crc kubenswrapper[5033]: I0226 17:19:15.746697 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.353623 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.366162 5033 generic.go:334] "Generic (PLEG): container finished" podID="97630434-3923-458d-83df-4ad754063580" containerID="2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc" exitCode=0 Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.366208 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97630434-3923-458d-83df-4ad754063580","Type":"ContainerDied","Data":"2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc"} Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.366277 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"97630434-3923-458d-83df-4ad754063580","Type":"ContainerDied","Data":"5925e2dd68346ecbf25b7acf9427434dd5bc749115b7c8910d88e02230c1f708"} Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.366297 5033 scope.go:117] "RemoveContainer" containerID="e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635" Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.366317 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.408554 5033 scope.go:117] "RemoveContainer" containerID="2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc" Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.445104 5033 scope.go:117] "RemoveContainer" containerID="e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635" Feb 26 17:19:16 crc kubenswrapper[5033]: E0226 17:19:16.449289 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635\": container with ID starting with e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635 not found: ID does not exist" containerID="e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635" Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.449332 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635"} err="failed to get container status \"e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635\": rpc error: code = NotFound desc = could not find container \"e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635\": container with ID starting with e78e0bafc6747c6b62659444d369b11ddf2a82f08d84e359b535433ba3f2e635 not found: ID does not exist" Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.449357 5033 scope.go:117] "RemoveContainer" containerID="2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc" Feb 26 17:19:16 crc kubenswrapper[5033]: E0226 17:19:16.452593 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc\": container with ID starting with 
2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc not found: ID does not exist" containerID="2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc" Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.452629 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc"} err="failed to get container status \"2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc\": rpc error: code = NotFound desc = could not find container \"2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc\": container with ID starting with 2cc00ad6f38164f0eb64967c54d00f16fa75f5ede7ac7c5faa5b5dbbe75167fc not found: ID does not exist" Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.495468 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.537571 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data\") pod \"97630434-3923-458d-83df-4ad754063580\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.537628 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-combined-ca-bundle\") pod \"97630434-3923-458d-83df-4ad754063580\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.537830 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data-custom\") pod \"97630434-3923-458d-83df-4ad754063580\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " 
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.537878 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97630434-3923-458d-83df-4ad754063580-etc-machine-id\") pod \"97630434-3923-458d-83df-4ad754063580\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.537940 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-scripts\") pod \"97630434-3923-458d-83df-4ad754063580\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.538045 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbd2m\" (UniqueName: \"kubernetes.io/projected/97630434-3923-458d-83df-4ad754063580-kube-api-access-mbd2m\") pod \"97630434-3923-458d-83df-4ad754063580\" (UID: \"97630434-3923-458d-83df-4ad754063580\") " Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.539836 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97630434-3923-458d-83df-4ad754063580-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "97630434-3923-458d-83df-4ad754063580" (UID: "97630434-3923-458d-83df-4ad754063580"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.544624 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-scripts" (OuterVolumeSpecName: "scripts") pod "97630434-3923-458d-83df-4ad754063580" (UID: "97630434-3923-458d-83df-4ad754063580"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.565816 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "97630434-3923-458d-83df-4ad754063580" (UID: "97630434-3923-458d-83df-4ad754063580"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.576356 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97630434-3923-458d-83df-4ad754063580-kube-api-access-mbd2m" (OuterVolumeSpecName: "kube-api-access-mbd2m") pod "97630434-3923-458d-83df-4ad754063580" (UID: "97630434-3923-458d-83df-4ad754063580"). InnerVolumeSpecName "kube-api-access-mbd2m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.636911 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97630434-3923-458d-83df-4ad754063580" (UID: "97630434-3923-458d-83df-4ad754063580"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.645479 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.645518 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbd2m\" (UniqueName: \"kubernetes.io/projected/97630434-3923-458d-83df-4ad754063580-kube-api-access-mbd2m\") on node \"crc\" DevicePath \"\""
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.645529 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.645539 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.645548 5033 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/97630434-3923-458d-83df-4ad754063580-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.654674 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data" (OuterVolumeSpecName: "config-data") pod "97630434-3923-458d-83df-4ad754063580" (UID: "97630434-3923-458d-83df-4ad754063580"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.712207 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.719958 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.728890 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 26 17:19:16 crc kubenswrapper[5033]: E0226 17:19:16.729346 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97630434-3923-458d-83df-4ad754063580" containerName="cinder-scheduler"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.729362 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="97630434-3923-458d-83df-4ad754063580" containerName="cinder-scheduler"
Feb 26 17:19:16 crc kubenswrapper[5033]: E0226 17:19:16.729382 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97630434-3923-458d-83df-4ad754063580" containerName="probe"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.729389 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="97630434-3923-458d-83df-4ad754063580" containerName="probe"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.729573 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="97630434-3923-458d-83df-4ad754063580" containerName="cinder-scheduler"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.729593 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="97630434-3923-458d-83df-4ad754063580" containerName="probe"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.730554 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.733210 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.738821 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.747421 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97630434-3923-458d-83df-4ad754063580-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.848779 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.848837 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd7f43ba-d95d-48ef-9788-b25bb345f972-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.848862 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-scripts\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.848884 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffb6g\" (UniqueName: \"kubernetes.io/projected/dd7f43ba-d95d-48ef-9788-b25bb345f972-kube-api-access-ffb6g\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.848944 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-config-data\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.848995 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.952408 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.952461 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd7f43ba-d95d-48ef-9788-b25bb345f972-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.952478 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-scripts\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.952498 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffb6g\" (UniqueName: \"kubernetes.io/projected/dd7f43ba-d95d-48ef-9788-b25bb345f972-kube-api-access-ffb6g\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.952553 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-config-data\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.952584 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.955062 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd7f43ba-d95d-48ef-9788-b25bb345f972-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.956068 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.959308 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-scripts\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.959402 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.968097 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd7f43ba-d95d-48ef-9788-b25bb345f972-config-data\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:16 crc kubenswrapper[5033]: I0226 17:19:16.981632 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffb6g\" (UniqueName: \"kubernetes.io/projected/dd7f43ba-d95d-48ef-9788-b25bb345f972-kube-api-access-ffb6g\") pod \"cinder-scheduler-0\" (UID: \"dd7f43ba-d95d-48ef-9788-b25bb345f972\") " pod="openstack/cinder-scheduler-0"
Feb 26 17:19:17 crc kubenswrapper[5033]: I0226 17:19:17.046030 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 26 17:19:17 crc kubenswrapper[5033]: I0226 17:19:17.573142 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 26 17:19:17 crc kubenswrapper[5033]: I0226 17:19:17.644621 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97630434-3923-458d-83df-4ad754063580" path="/var/lib/kubelet/pods/97630434-3923-458d-83df-4ad754063580/volumes"
Feb 26 17:19:18 crc kubenswrapper[5033]: I0226 17:19:18.388792 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd7f43ba-d95d-48ef-9788-b25bb345f972","Type":"ContainerStarted","Data":"a2170e99aefb997bd6196b75c1e96f380bd72fdc023b18e65c835128fda7ebd6"}
Feb 26 17:19:18 crc kubenswrapper[5033]: I0226 17:19:18.389131 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd7f43ba-d95d-48ef-9788-b25bb345f972","Type":"ContainerStarted","Data":"f0992ab10b7f38ac5d543e3e580cf55d11128cfbd90d1b630e5bba82f695c2e5"}
Feb 26 17:19:19 crc kubenswrapper[5033]: I0226 17:19:19.405578 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd7f43ba-d95d-48ef-9788-b25bb345f972","Type":"ContainerStarted","Data":"d316672732defd27a9aae1a38096de0c849edcb6224f305f9f3a6909fb2e2ff2"}
Feb 26 17:19:19 crc kubenswrapper[5033]: I0226 17:19:19.460517 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.460494042 podStartE2EDuration="3.460494042s" podCreationTimestamp="2026-02-26 17:19:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:19:19.437671306 +0000 UTC m=+5886.650269996" watchObservedRunningTime="2026-02-26 17:19:19.460494042 +0000 UTC m=+5886.673092712"
Feb 26 17:19:21 crc kubenswrapper[5033]: I0226 17:19:21.519686 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 26 17:19:22 crc kubenswrapper[5033]: I0226 17:19:22.049029 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 26 17:19:26 crc kubenswrapper[5033]: I0226 17:19:26.627714 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"
Feb 26 17:19:26 crc kubenswrapper[5033]: E0226 17:19:26.628394 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:19:27 crc kubenswrapper[5033]: I0226 17:19:27.385996 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 26 17:19:41 crc kubenswrapper[5033]: I0226 17:19:41.629776 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"
Feb 26 17:19:41 crc kubenswrapper[5033]: E0226 17:19:41.630923 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:19:56 crc kubenswrapper[5033]: I0226 17:19:56.628347 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"
Feb 26 17:19:56 crc kubenswrapper[5033]: E0226 17:19:56.629075 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:20:00 crc kubenswrapper[5033]: I0226 17:20:00.157901 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535440-q7pvf"]
Feb 26 17:20:00 crc kubenswrapper[5033]: I0226 17:20:00.160204 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535440-q7pvf"
Feb 26 17:20:00 crc kubenswrapper[5033]: I0226 17:20:00.162999 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 17:20:00 crc kubenswrapper[5033]: I0226 17:20:00.163029 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 17:20:00 crc kubenswrapper[5033]: I0226 17:20:00.176209 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 17:20:00 crc kubenswrapper[5033]: I0226 17:20:00.183154 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535440-q7pvf"]
Feb 26 17:20:00 crc kubenswrapper[5033]: I0226 17:20:00.228075 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67v98\" (UniqueName: \"kubernetes.io/projected/627dac39-736b-4c35-b933-2ea5be748f72-kube-api-access-67v98\") pod \"auto-csr-approver-29535440-q7pvf\" (UID: \"627dac39-736b-4c35-b933-2ea5be748f72\") " pod="openshift-infra/auto-csr-approver-29535440-q7pvf"
Feb 26 17:20:00 crc kubenswrapper[5033]: I0226 17:20:00.331061 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67v98\" (UniqueName: \"kubernetes.io/projected/627dac39-736b-4c35-b933-2ea5be748f72-kube-api-access-67v98\") pod \"auto-csr-approver-29535440-q7pvf\" (UID: \"627dac39-736b-4c35-b933-2ea5be748f72\") " pod="openshift-infra/auto-csr-approver-29535440-q7pvf"
Feb 26 17:20:00 crc kubenswrapper[5033]: I0226 17:20:00.366748 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67v98\" (UniqueName: \"kubernetes.io/projected/627dac39-736b-4c35-b933-2ea5be748f72-kube-api-access-67v98\") pod \"auto-csr-approver-29535440-q7pvf\" (UID: \"627dac39-736b-4c35-b933-2ea5be748f72\") " pod="openshift-infra/auto-csr-approver-29535440-q7pvf"
Feb 26 17:20:00 crc kubenswrapper[5033]: I0226 17:20:00.492409 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535440-q7pvf"
Feb 26 17:20:01 crc kubenswrapper[5033]: I0226 17:20:01.025726 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535440-q7pvf"]
Feb 26 17:20:01 crc kubenswrapper[5033]: I0226 17:20:01.887859 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535440-q7pvf" event={"ID":"627dac39-736b-4c35-b933-2ea5be748f72","Type":"ContainerStarted","Data":"6f43c6c2e06f285c25e2713e67ff60337c95fa38c2ef855d025471095f3eb7bc"}
Feb 26 17:20:03 crc kubenswrapper[5033]: I0226 17:20:03.916162 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535440-q7pvf" event={"ID":"627dac39-736b-4c35-b933-2ea5be748f72","Type":"ContainerStarted","Data":"03aa3c9e49fa89a9e21e8ebd98bc118f285f1deb72cf429daa6b468dd1f12189"}
Feb 26 17:20:03 crc kubenswrapper[5033]: I0226 17:20:03.944463 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535440-q7pvf" podStartSLOduration=1.412670719 podStartE2EDuration="3.944414327s" podCreationTimestamp="2026-02-26 17:20:00 +0000 UTC" firstStartedPulling="2026-02-26 17:20:01.04263179 +0000 UTC m=+5928.255230440" lastFinishedPulling="2026-02-26 17:20:03.574375358 +0000 UTC m=+5930.786974048" observedRunningTime="2026-02-26 17:20:03.93837754 +0000 UTC m=+5931.150976210" watchObservedRunningTime="2026-02-26 17:20:03.944414327 +0000 UTC m=+5931.157012987"
Feb 26 17:20:04 crc kubenswrapper[5033]: I0226 17:20:04.927355 5033 generic.go:334] "Generic (PLEG): container finished" podID="627dac39-736b-4c35-b933-2ea5be748f72" containerID="03aa3c9e49fa89a9e21e8ebd98bc118f285f1deb72cf429daa6b468dd1f12189" exitCode=0
Feb 26 17:20:04 crc kubenswrapper[5033]: I0226 17:20:04.927547 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535440-q7pvf" event={"ID":"627dac39-736b-4c35-b933-2ea5be748f72","Type":"ContainerDied","Data":"03aa3c9e49fa89a9e21e8ebd98bc118f285f1deb72cf429daa6b468dd1f12189"}
Feb 26 17:20:06 crc kubenswrapper[5033]: I0226 17:20:06.376734 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535440-q7pvf"
Feb 26 17:20:06 crc kubenswrapper[5033]: I0226 17:20:06.458540 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67v98\" (UniqueName: \"kubernetes.io/projected/627dac39-736b-4c35-b933-2ea5be748f72-kube-api-access-67v98\") pod \"627dac39-736b-4c35-b933-2ea5be748f72\" (UID: \"627dac39-736b-4c35-b933-2ea5be748f72\") "
Feb 26 17:20:06 crc kubenswrapper[5033]: I0226 17:20:06.466224 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627dac39-736b-4c35-b933-2ea5be748f72-kube-api-access-67v98" (OuterVolumeSpecName: "kube-api-access-67v98") pod "627dac39-736b-4c35-b933-2ea5be748f72" (UID: "627dac39-736b-4c35-b933-2ea5be748f72"). InnerVolumeSpecName "kube-api-access-67v98". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:20:06 crc kubenswrapper[5033]: I0226 17:20:06.560611 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67v98\" (UniqueName: \"kubernetes.io/projected/627dac39-736b-4c35-b933-2ea5be748f72-kube-api-access-67v98\") on node \"crc\" DevicePath \"\""
Feb 26 17:20:06 crc kubenswrapper[5033]: I0226 17:20:06.766504 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535434-fl7rv"]
Feb 26 17:20:06 crc kubenswrapper[5033]: I0226 17:20:06.785796 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535434-fl7rv"]
Feb 26 17:20:06 crc kubenswrapper[5033]: I0226 17:20:06.950849 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535440-q7pvf" event={"ID":"627dac39-736b-4c35-b933-2ea5be748f72","Type":"ContainerDied","Data":"6f43c6c2e06f285c25e2713e67ff60337c95fa38c2ef855d025471095f3eb7bc"}
Feb 26 17:20:06 crc kubenswrapper[5033]: I0226 17:20:06.950887 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f43c6c2e06f285c25e2713e67ff60337c95fa38c2ef855d025471095f3eb7bc"
Feb 26 17:20:06 crc kubenswrapper[5033]: I0226 17:20:06.950952 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535440-q7pvf"
Feb 26 17:20:07 crc kubenswrapper[5033]: I0226 17:20:07.648345 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb28b135-15da-4647-91f8-f4b506248277" path="/var/lib/kubelet/pods/bb28b135-15da-4647-91f8-f4b506248277/volumes"
Feb 26 17:20:11 crc kubenswrapper[5033]: I0226 17:20:11.629317 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"
Feb 26 17:20:11 crc kubenswrapper[5033]: E0226 17:20:11.630862 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.332135 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x75n7"]
Feb 26 17:20:25 crc kubenswrapper[5033]: E0226 17:20:25.333929 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627dac39-736b-4c35-b933-2ea5be748f72" containerName="oc"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.333955 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="627dac39-736b-4c35-b933-2ea5be748f72" containerName="oc"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.334414 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="627dac39-736b-4c35-b933-2ea5be748f72" containerName="oc"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.338326 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.365454 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x75n7"]
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.454108 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-utilities\") pod \"redhat-operators-x75n7\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") " pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.454189 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmm6\" (UniqueName: \"kubernetes.io/projected/feb28d80-6a71-4dc2-918d-1c45b7433a70-kube-api-access-hhmm6\") pod \"redhat-operators-x75n7\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") " pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.454327 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-catalog-content\") pod \"redhat-operators-x75n7\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") " pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.556911 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-catalog-content\") pod \"redhat-operators-x75n7\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") " pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.557089 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-utilities\") pod \"redhat-operators-x75n7\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") " pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.557123 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmm6\" (UniqueName: \"kubernetes.io/projected/feb28d80-6a71-4dc2-918d-1c45b7433a70-kube-api-access-hhmm6\") pod \"redhat-operators-x75n7\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") " pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.557752 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-catalog-content\") pod \"redhat-operators-x75n7\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") " pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.557821 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-utilities\") pod \"redhat-operators-x75n7\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") " pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.580863 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmm6\" (UniqueName: \"kubernetes.io/projected/feb28d80-6a71-4dc2-918d-1c45b7433a70-kube-api-access-hhmm6\") pod \"redhat-operators-x75n7\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") " pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.629008 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"
Feb 26 17:20:25 crc kubenswrapper[5033]: E0226 17:20:25.629272 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:20:25 crc kubenswrapper[5033]: I0226 17:20:25.673877 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:26 crc kubenswrapper[5033]: I0226 17:20:26.125605 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x75n7"]
Feb 26 17:20:26 crc kubenswrapper[5033]: I0226 17:20:26.146655 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x75n7" event={"ID":"feb28d80-6a71-4dc2-918d-1c45b7433a70","Type":"ContainerStarted","Data":"81957a8d9b3786bb0e4778cbf2393351a5b97581b475359f3121bdabb0d477e3"}
Feb 26 17:20:27 crc kubenswrapper[5033]: I0226 17:20:27.159737 5033 generic.go:334] "Generic (PLEG): container finished" podID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerID="a3509169ad879be78f44dce5b110e3cfd1dadbc860b36f3569d9c4d6b92ed58e" exitCode=0
Feb 26 17:20:27 crc kubenswrapper[5033]: I0226 17:20:27.159863 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x75n7" event={"ID":"feb28d80-6a71-4dc2-918d-1c45b7433a70","Type":"ContainerDied","Data":"a3509169ad879be78f44dce5b110e3cfd1dadbc860b36f3569d9c4d6b92ed58e"}
Feb 26 17:20:29 crc kubenswrapper[5033]: I0226 17:20:29.179435 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x75n7" event={"ID":"feb28d80-6a71-4dc2-918d-1c45b7433a70","Type":"ContainerStarted","Data":"1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f"}
Feb 26 17:20:31 crc kubenswrapper[5033]: I0226 17:20:31.204310 5033 generic.go:334] "Generic (PLEG): container finished" podID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerID="1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f" exitCode=0
Feb 26 17:20:31 crc kubenswrapper[5033]: I0226 17:20:31.204422 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x75n7" event={"ID":"feb28d80-6a71-4dc2-918d-1c45b7433a70","Type":"ContainerDied","Data":"1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f"}
Feb 26 17:20:32 crc kubenswrapper[5033]: I0226 17:20:32.213847 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x75n7" event={"ID":"feb28d80-6a71-4dc2-918d-1c45b7433a70","Type":"ContainerStarted","Data":"f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1"}
Feb 26 17:20:32 crc kubenswrapper[5033]: I0226 17:20:32.243422 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x75n7" podStartSLOduration=2.735650588 podStartE2EDuration="7.243404959s" podCreationTimestamp="2026-02-26 17:20:25 +0000 UTC" firstStartedPulling="2026-02-26 17:20:27.162457972 +0000 UTC m=+5954.375056652" lastFinishedPulling="2026-02-26 17:20:31.670212373 +0000 UTC m=+5958.882811023" observedRunningTime="2026-02-26 17:20:32.235167709 +0000 UTC m=+5959.447766359" watchObservedRunningTime="2026-02-26 17:20:32.243404959 +0000 UTC m=+5959.456003609"
Feb 26 17:20:34 crc kubenswrapper[5033]: I0226 17:20:34.831658 5033 scope.go:117] "RemoveContainer" containerID="edc4c1d88408b545e866f039fc6d8b7782062ad0015b713375f3348c9a476acf"
Feb 26 17:20:34 crc kubenswrapper[5033]: I0226 17:20:34.859775 5033 scope.go:117] "RemoveContainer" containerID="0a12391a322497c8a26f19e7ef6f88f74a3800349201679c3156620f0bd217b9"
Feb 26 17:20:34 crc kubenswrapper[5033]: I0226 17:20:34.942001 5033 scope.go:117] "RemoveContainer" containerID="f59d32c4cba61d1471666e03d7a62dcde01129f8519a25ceac421d86f7e99f57"
Feb 26 17:20:35 crc kubenswrapper[5033]: I0226 17:20:35.674353 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:35 crc kubenswrapper[5033]: I0226 17:20:35.674692 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:36 crc kubenswrapper[5033]: I0226 17:20:36.628997 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"
Feb 26 17:20:36 crc kubenswrapper[5033]: E0226 17:20:36.629443 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:20:36 crc kubenswrapper[5033]: I0226 17:20:36.730600 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x75n7" podUID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerName="registry-server" probeResult="failure" output=<
Feb 26 17:20:36 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s
Feb 26 17:20:36 crc kubenswrapper[5033]: >
Feb 26 17:20:45 crc kubenswrapper[5033]: I0226 17:20:45.760150 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:45 crc kubenswrapper[5033]: I0226 17:20:45.839630 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:46 crc kubenswrapper[5033]: I0226 17:20:46.006166 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x75n7"]
Feb 26 17:20:47 crc kubenswrapper[5033]: I0226 17:20:47.376425 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x75n7" podUID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerName="registry-server" containerID="cri-o://f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1" gracePeriod=2
Feb 26 17:20:47 crc kubenswrapper[5033]: I0226 17:20:47.824430 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x75n7"
Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.011812 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-utilities\") pod \"feb28d80-6a71-4dc2-918d-1c45b7433a70\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") "
Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.011907 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-catalog-content\") pod \"feb28d80-6a71-4dc2-918d-1c45b7433a70\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") "
Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.012129 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhmm6\" (UniqueName: \"kubernetes.io/projected/feb28d80-6a71-4dc2-918d-1c45b7433a70-kube-api-access-hhmm6\") pod \"feb28d80-6a71-4dc2-918d-1c45b7433a70\" (UID: \"feb28d80-6a71-4dc2-918d-1c45b7433a70\") "
Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.014236 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-utilities" (OuterVolumeSpecName: "utilities") pod "feb28d80-6a71-4dc2-918d-1c45b7433a70" (UID: "feb28d80-6a71-4dc2-918d-1c45b7433a70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.018039 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb28d80-6a71-4dc2-918d-1c45b7433a70-kube-api-access-hhmm6" (OuterVolumeSpecName: "kube-api-access-hhmm6") pod "feb28d80-6a71-4dc2-918d-1c45b7433a70" (UID: "feb28d80-6a71-4dc2-918d-1c45b7433a70"). InnerVolumeSpecName "kube-api-access-hhmm6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.117003 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhmm6\" (UniqueName: \"kubernetes.io/projected/feb28d80-6a71-4dc2-918d-1c45b7433a70-kube-api-access-hhmm6\") on node \"crc\" DevicePath \"\""
Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.117038 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.134639 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "feb28d80-6a71-4dc2-918d-1c45b7433a70" (UID: "feb28d80-6a71-4dc2-918d-1c45b7433a70"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.218443 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb28d80-6a71-4dc2-918d-1c45b7433a70-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.388629 5033 generic.go:334] "Generic (PLEG): container finished" podID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerID="f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1" exitCode=0 Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.388701 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x75n7" event={"ID":"feb28d80-6a71-4dc2-918d-1c45b7433a70","Type":"ContainerDied","Data":"f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1"} Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.388736 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x75n7" Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.388766 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x75n7" event={"ID":"feb28d80-6a71-4dc2-918d-1c45b7433a70","Type":"ContainerDied","Data":"81957a8d9b3786bb0e4778cbf2393351a5b97581b475359f3121bdabb0d477e3"} Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.388790 5033 scope.go:117] "RemoveContainer" containerID="f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1" Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.421169 5033 scope.go:117] "RemoveContainer" containerID="1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f" Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.440059 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x75n7"] Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.450976 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x75n7"] Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.484456 5033 scope.go:117] "RemoveContainer" containerID="a3509169ad879be78f44dce5b110e3cfd1dadbc860b36f3569d9c4d6b92ed58e" Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.522669 5033 scope.go:117] "RemoveContainer" containerID="f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1" Feb 26 17:20:48 crc kubenswrapper[5033]: E0226 17:20:48.523967 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1\": container with ID starting with f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1 not found: ID does not exist" containerID="f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1" Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.523999 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1"} err="failed to get container status \"f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1\": rpc error: code = NotFound desc = could not find container \"f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1\": container with ID starting with f22c1d0530d8159387caf16410850e76015034e036f9f7207df1b794ee35a4a1 not found: ID does not exist" Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.524022 5033 scope.go:117] "RemoveContainer" containerID="1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f" Feb 26 17:20:48 crc kubenswrapper[5033]: E0226 17:20:48.524589 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f\": container with ID starting with 1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f not found: ID does not exist" containerID="1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f" Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.524649 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f"} err="failed to get container status \"1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f\": rpc error: code = NotFound desc = could not find container \"1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f\": container with ID starting with 1403b769ddf6742d63d2fa1872b402e08d450013cd9fed233c92ca0a305d9d7f not found: ID does not exist" Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.524670 5033 scope.go:117] "RemoveContainer" containerID="a3509169ad879be78f44dce5b110e3cfd1dadbc860b36f3569d9c4d6b92ed58e" Feb 26 17:20:48 crc kubenswrapper[5033]: E0226 
17:20:48.525024 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3509169ad879be78f44dce5b110e3cfd1dadbc860b36f3569d9c4d6b92ed58e\": container with ID starting with a3509169ad879be78f44dce5b110e3cfd1dadbc860b36f3569d9c4d6b92ed58e not found: ID does not exist" containerID="a3509169ad879be78f44dce5b110e3cfd1dadbc860b36f3569d9c4d6b92ed58e" Feb 26 17:20:48 crc kubenswrapper[5033]: I0226 17:20:48.525090 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3509169ad879be78f44dce5b110e3cfd1dadbc860b36f3569d9c4d6b92ed58e"} err="failed to get container status \"a3509169ad879be78f44dce5b110e3cfd1dadbc860b36f3569d9c4d6b92ed58e\": rpc error: code = NotFound desc = could not find container \"a3509169ad879be78f44dce5b110e3cfd1dadbc860b36f3569d9c4d6b92ed58e\": container with ID starting with a3509169ad879be78f44dce5b110e3cfd1dadbc860b36f3569d9c4d6b92ed58e not found: ID does not exist" Feb 26 17:20:49 crc kubenswrapper[5033]: I0226 17:20:49.643849 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb28d80-6a71-4dc2-918d-1c45b7433a70" path="/var/lib/kubelet/pods/feb28d80-6a71-4dc2-918d-1c45b7433a70/volumes" Feb 26 17:20:51 crc kubenswrapper[5033]: I0226 17:20:51.630361 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:20:51 crc kubenswrapper[5033]: E0226 17:20:51.631513 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.682371 
5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-shkdl"] Feb 26 17:20:54 crc kubenswrapper[5033]: E0226 17:20:54.683468 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerName="extract-content" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.683505 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerName="extract-content" Feb 26 17:20:54 crc kubenswrapper[5033]: E0226 17:20:54.683554 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerName="extract-utilities" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.683575 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerName="extract-utilities" Feb 26 17:20:54 crc kubenswrapper[5033]: E0226 17:20:54.683637 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerName="registry-server" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.683652 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerName="registry-server" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.684019 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb28d80-6a71-4dc2-918d-1c45b7433a70" containerName="registry-server" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.706879 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-shkdl"] Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.707045 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.838578 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89bpz\" (UniqueName: \"kubernetes.io/projected/9e62a972-154f-4512-8fd9-8892364aa02e-kube-api-access-89bpz\") pod \"community-operators-shkdl\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.838996 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-utilities\") pod \"community-operators-shkdl\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.839107 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-catalog-content\") pod \"community-operators-shkdl\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.940691 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-catalog-content\") pod \"community-operators-shkdl\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.940790 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89bpz\" (UniqueName: \"kubernetes.io/projected/9e62a972-154f-4512-8fd9-8892364aa02e-kube-api-access-89bpz\") pod 
\"community-operators-shkdl\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.940854 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-utilities\") pod \"community-operators-shkdl\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.941442 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-utilities\") pod \"community-operators-shkdl\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.941495 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-catalog-content\") pod \"community-operators-shkdl\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:20:54 crc kubenswrapper[5033]: I0226 17:20:54.963747 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89bpz\" (UniqueName: \"kubernetes.io/projected/9e62a972-154f-4512-8fd9-8892364aa02e-kube-api-access-89bpz\") pod \"community-operators-shkdl\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:20:55 crc kubenswrapper[5033]: I0226 17:20:55.030816 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:20:55 crc kubenswrapper[5033]: I0226 17:20:55.607009 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-shkdl"] Feb 26 17:20:56 crc kubenswrapper[5033]: I0226 17:20:56.479738 5033 generic.go:334] "Generic (PLEG): container finished" podID="9e62a972-154f-4512-8fd9-8892364aa02e" containerID="89b55ac11840c6cdf02ccec74ee54776dace1dfba6df41c7fcedd26e3727b458" exitCode=0 Feb 26 17:20:56 crc kubenswrapper[5033]: I0226 17:20:56.480193 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shkdl" event={"ID":"9e62a972-154f-4512-8fd9-8892364aa02e","Type":"ContainerDied","Data":"89b55ac11840c6cdf02ccec74ee54776dace1dfba6df41c7fcedd26e3727b458"} Feb 26 17:20:56 crc kubenswrapper[5033]: I0226 17:20:56.480230 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shkdl" event={"ID":"9e62a972-154f-4512-8fd9-8892364aa02e","Type":"ContainerStarted","Data":"e09683a2e83ccca11d5f7de18cd0ceec41de5fd744cececc668955b2544c9a79"} Feb 26 17:20:58 crc kubenswrapper[5033]: I0226 17:20:58.507247 5033 generic.go:334] "Generic (PLEG): container finished" podID="9e62a972-154f-4512-8fd9-8892364aa02e" containerID="42022bed01303b80794bbb3c3b52cc574dde5c9df902abd8f0a33afed9f3a7d7" exitCode=0 Feb 26 17:20:58 crc kubenswrapper[5033]: I0226 17:20:58.507312 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shkdl" event={"ID":"9e62a972-154f-4512-8fd9-8892364aa02e","Type":"ContainerDied","Data":"42022bed01303b80794bbb3c3b52cc574dde5c9df902abd8f0a33afed9f3a7d7"} Feb 26 17:20:59 crc kubenswrapper[5033]: I0226 17:20:59.560364 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shkdl" 
event={"ID":"9e62a972-154f-4512-8fd9-8892364aa02e","Type":"ContainerStarted","Data":"11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c"} Feb 26 17:20:59 crc kubenswrapper[5033]: I0226 17:20:59.594004 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-shkdl" podStartSLOduration=3.102596939 podStartE2EDuration="5.593984622s" podCreationTimestamp="2026-02-26 17:20:54 +0000 UTC" firstStartedPulling="2026-02-26 17:20:56.482686262 +0000 UTC m=+5983.695284902" lastFinishedPulling="2026-02-26 17:20:58.974073935 +0000 UTC m=+5986.186672585" observedRunningTime="2026-02-26 17:20:59.587675198 +0000 UTC m=+5986.800273858" watchObservedRunningTime="2026-02-26 17:20:59.593984622 +0000 UTC m=+5986.806583272" Feb 26 17:21:02 crc kubenswrapper[5033]: I0226 17:21:02.627720 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:21:02 crc kubenswrapper[5033]: E0226 17:21:02.628210 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.148393 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kgzlq"] Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.150234 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.152923 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-98rsg" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.153140 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.173642 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kgzlq"] Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.183946 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tvbth"] Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.192039 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.220378 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe83e329-75c3-41af-a979-dd9aa57873b1-var-run\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.220438 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe83e329-75c3-41af-a979-dd9aa57873b1-scripts\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.220702 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe83e329-75c3-41af-a979-dd9aa57873b1-var-run-ovn\") pod \"ovn-controller-kgzlq\" (UID: 
\"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.220766 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe83e329-75c3-41af-a979-dd9aa57873b1-var-log-ovn\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.220822 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9zxn\" (UniqueName: \"kubernetes.io/projected/fe83e329-75c3-41af-a979-dd9aa57873b1-kube-api-access-d9zxn\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.223154 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tvbth"] Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.323631 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlc4n\" (UniqueName: \"kubernetes.io/projected/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-kube-api-access-jlc4n\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.323738 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe83e329-75c3-41af-a979-dd9aa57873b1-var-run\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.323795 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-var-lib\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.323916 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe83e329-75c3-41af-a979-dd9aa57873b1-scripts\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.323986 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-var-log\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.324195 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe83e329-75c3-41af-a979-dd9aa57873b1-var-run-ovn\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.324224 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-etc-ovs\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.324283 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe83e329-75c3-41af-a979-dd9aa57873b1-var-log-ovn\") pod \"ovn-controller-kgzlq\" (UID: 
\"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.324326 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9zxn\" (UniqueName: \"kubernetes.io/projected/fe83e329-75c3-41af-a979-dd9aa57873b1-kube-api-access-d9zxn\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.324397 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-scripts\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.324402 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe83e329-75c3-41af-a979-dd9aa57873b1-var-run-ovn\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.324409 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe83e329-75c3-41af-a979-dd9aa57873b1-var-log-ovn\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.324526 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-var-run\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: 
I0226 17:21:03.325906 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe83e329-75c3-41af-a979-dd9aa57873b1-scripts\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.326308 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe83e329-75c3-41af-a979-dd9aa57873b1-var-run\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.345111 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9zxn\" (UniqueName: \"kubernetes.io/projected/fe83e329-75c3-41af-a979-dd9aa57873b1-kube-api-access-d9zxn\") pod \"ovn-controller-kgzlq\" (UID: \"fe83e329-75c3-41af-a979-dd9aa57873b1\") " pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.426680 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-var-lib\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.426733 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-var-log\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.426793 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-etc-ovs\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.426828 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-scripts\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.426856 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-var-run\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.426899 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlc4n\" (UniqueName: \"kubernetes.io/projected/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-kube-api-access-jlc4n\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.427092 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-var-lib\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.427199 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-etc-ovs\") pod \"ovn-controller-ovs-tvbth\" (UID: 
\"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.427290 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-var-log\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.427351 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-var-run\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.429422 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-scripts\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.443335 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlc4n\" (UniqueName: \"kubernetes.io/projected/562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0-kube-api-access-jlc4n\") pod \"ovn-controller-ovs-tvbth\" (UID: \"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0\") " pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.477972 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kgzlq" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.510909 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:03 crc kubenswrapper[5033]: I0226 17:21:03.980221 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kgzlq"] Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.430461 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tvbth"] Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.625804 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-d66r6"] Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.626978 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.627430 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tvbth" event={"ID":"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0","Type":"ContainerStarted","Data":"785412b9c948181c7700d593ab821e97f5c2463863a2a671d5813aabdcf7558f"} Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.629540 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.633237 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kgzlq" event={"ID":"fe83e329-75c3-41af-a979-dd9aa57873b1","Type":"ContainerStarted","Data":"8c25181d72d60716c0ffd1f0886c011a8fbf137fc803f0aa26a302ca712339f5"} Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.633352 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kgzlq" event={"ID":"fe83e329-75c3-41af-a979-dd9aa57873b1","Type":"ContainerStarted","Data":"3621ab2d83333237263a8c2b554eb6fe1c99e5b1528300326c023e1ee3b9b852"} Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.633719 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-kgzlq" 
Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.649310 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-d66r6"] Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.675675 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-kgzlq" podStartSLOduration=1.67565738 podStartE2EDuration="1.67565738s" podCreationTimestamp="2026-02-26 17:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:21:04.656420511 +0000 UTC m=+5991.869019161" watchObservedRunningTime="2026-02-26 17:21:04.67565738 +0000 UTC m=+5991.888256030" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.768605 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5367270d-f5c1-40a7-b55d-dd016e41b3b5-ovn-rundir\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.768688 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5367270d-f5c1-40a7-b55d-dd016e41b3b5-ovs-rundir\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.768765 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5367270d-f5c1-40a7-b55d-dd016e41b3b5-config\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.768868 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv4lh\" (UniqueName: \"kubernetes.io/projected/5367270d-f5c1-40a7-b55d-dd016e41b3b5-kube-api-access-pv4lh\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.870439 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv4lh\" (UniqueName: \"kubernetes.io/projected/5367270d-f5c1-40a7-b55d-dd016e41b3b5-kube-api-access-pv4lh\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.870568 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5367270d-f5c1-40a7-b55d-dd016e41b3b5-ovn-rundir\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.870605 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5367270d-f5c1-40a7-b55d-dd016e41b3b5-ovs-rundir\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.870655 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5367270d-f5c1-40a7-b55d-dd016e41b3b5-config\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.870924 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5367270d-f5c1-40a7-b55d-dd016e41b3b5-ovs-rundir\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.870949 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5367270d-f5c1-40a7-b55d-dd016e41b3b5-ovn-rundir\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.871586 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5367270d-f5c1-40a7-b55d-dd016e41b3b5-config\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:04 crc kubenswrapper[5033]: I0226 17:21:04.891992 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv4lh\" (UniqueName: \"kubernetes.io/projected/5367270d-f5c1-40a7-b55d-dd016e41b3b5-kube-api-access-pv4lh\") pod \"ovn-controller-metrics-d66r6\" (UID: \"5367270d-f5c1-40a7-b55d-dd016e41b3b5\") " pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.018362 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-d66r6" Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.031561 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.031601 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.067347 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-pjksv"] Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.078331 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-af50-account-create-update-gjwkt"] Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.087278 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-af50-account-create-update-gjwkt"] Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.091061 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.101579 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-pjksv"] Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.530743 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-d66r6"] Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.647778 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75798cbd-85b7-4424-83fc-3f542dcc8820" path="/var/lib/kubelet/pods/75798cbd-85b7-4424-83fc-3f542dcc8820/volumes" Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.648648 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b951c3-accd-47c9-aa64-efe6365d3022" path="/var/lib/kubelet/pods/c5b951c3-accd-47c9-aa64-efe6365d3022/volumes" Feb 26 17:21:05 crc 
kubenswrapper[5033]: I0226 17:21:05.658645 5033 generic.go:334] "Generic (PLEG): container finished" podID="562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0" containerID="8b55719676a090e52b63b7a095aff2a8fc33b4df8a04a5e3bd86e4bebd89a078" exitCode=0 Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.658737 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tvbth" event={"ID":"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0","Type":"ContainerDied","Data":"8b55719676a090e52b63b7a095aff2a8fc33b4df8a04a5e3bd86e4bebd89a078"} Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.667041 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d66r6" event={"ID":"5367270d-f5c1-40a7-b55d-dd016e41b3b5","Type":"ContainerStarted","Data":"139fabcaab0b58d4d027a67ea3c73b41b8270c9004d7d89b4b3480da161303ad"} Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.743480 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:21:05 crc kubenswrapper[5033]: I0226 17:21:05.791447 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-shkdl"] Feb 26 17:21:06 crc kubenswrapper[5033]: I0226 17:21:06.680638 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d66r6" event={"ID":"5367270d-f5c1-40a7-b55d-dd016e41b3b5","Type":"ContainerStarted","Data":"387e670c1e348ed2852139685da3a582cb15972a52ce7b6f48728021fefca812"} Feb 26 17:21:06 crc kubenswrapper[5033]: I0226 17:21:06.684199 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tvbth" event={"ID":"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0","Type":"ContainerStarted","Data":"4fb995da8c80a4c96fc4f8f9694f3139960f9434217bd44c8fbab9cdb8cb948f"} Feb 26 17:21:06 crc kubenswrapper[5033]: I0226 17:21:06.684222 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-tvbth" event={"ID":"562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0","Type":"ContainerStarted","Data":"bfd20671d05b5aa1fa4e0da5c69496e2d02a1a68b6582e52017cffb935efd272"} Feb 26 17:21:06 crc kubenswrapper[5033]: I0226 17:21:06.684235 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:06 crc kubenswrapper[5033]: I0226 17:21:06.684268 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tvbth" Feb 26 17:21:06 crc kubenswrapper[5033]: I0226 17:21:06.714557 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-d66r6" podStartSLOduration=2.714533101 podStartE2EDuration="2.714533101s" podCreationTimestamp="2026-02-26 17:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:21:06.697205109 +0000 UTC m=+5993.909803769" watchObservedRunningTime="2026-02-26 17:21:06.714533101 +0000 UTC m=+5993.927131751" Feb 26 17:21:06 crc kubenswrapper[5033]: I0226 17:21:06.756641 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tvbth" podStartSLOduration=3.756618468 podStartE2EDuration="3.756618468s" podCreationTimestamp="2026-02-26 17:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:21:06.736801324 +0000 UTC m=+5993.949399984" watchObservedRunningTime="2026-02-26 17:21:06.756618468 +0000 UTC m=+5993.969217118" Feb 26 17:21:07 crc kubenswrapper[5033]: I0226 17:21:07.693620 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-shkdl" podUID="9e62a972-154f-4512-8fd9-8892364aa02e" containerName="registry-server" 
containerID="cri-o://11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c" gracePeriod=2 Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.204398 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.365187 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-catalog-content\") pod \"9e62a972-154f-4512-8fd9-8892364aa02e\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.365377 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89bpz\" (UniqueName: \"kubernetes.io/projected/9e62a972-154f-4512-8fd9-8892364aa02e-kube-api-access-89bpz\") pod \"9e62a972-154f-4512-8fd9-8892364aa02e\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.365451 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-utilities\") pod \"9e62a972-154f-4512-8fd9-8892364aa02e\" (UID: \"9e62a972-154f-4512-8fd9-8892364aa02e\") " Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.366179 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-utilities" (OuterVolumeSpecName: "utilities") pod "9e62a972-154f-4512-8fd9-8892364aa02e" (UID: "9e62a972-154f-4512-8fd9-8892364aa02e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.377234 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e62a972-154f-4512-8fd9-8892364aa02e-kube-api-access-89bpz" (OuterVolumeSpecName: "kube-api-access-89bpz") pod "9e62a972-154f-4512-8fd9-8892364aa02e" (UID: "9e62a972-154f-4512-8fd9-8892364aa02e"). InnerVolumeSpecName "kube-api-access-89bpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.437183 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e62a972-154f-4512-8fd9-8892364aa02e" (UID: "9e62a972-154f-4512-8fd9-8892364aa02e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.467328 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89bpz\" (UniqueName: \"kubernetes.io/projected/9e62a972-154f-4512-8fd9-8892364aa02e-kube-api-access-89bpz\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.467366 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.467378 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e62a972-154f-4512-8fd9-8892364aa02e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.711614 5033 generic.go:334] "Generic (PLEG): container finished" podID="9e62a972-154f-4512-8fd9-8892364aa02e" 
containerID="11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c" exitCode=0 Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.711718 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-shkdl" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.711833 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shkdl" event={"ID":"9e62a972-154f-4512-8fd9-8892364aa02e","Type":"ContainerDied","Data":"11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c"} Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.711948 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shkdl" event={"ID":"9e62a972-154f-4512-8fd9-8892364aa02e","Type":"ContainerDied","Data":"e09683a2e83ccca11d5f7de18cd0ceec41de5fd744cececc668955b2544c9a79"} Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.711988 5033 scope.go:117] "RemoveContainer" containerID="11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.742629 5033 scope.go:117] "RemoveContainer" containerID="42022bed01303b80794bbb3c3b52cc574dde5c9df902abd8f0a33afed9f3a7d7" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.784241 5033 scope.go:117] "RemoveContainer" containerID="89b55ac11840c6cdf02ccec74ee54776dace1dfba6df41c7fcedd26e3727b458" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.787660 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-shkdl"] Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.801400 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-shkdl"] Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.832282 5033 scope.go:117] "RemoveContainer" containerID="11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c" Feb 26 
17:21:08 crc kubenswrapper[5033]: E0226 17:21:08.832812 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c\": container with ID starting with 11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c not found: ID does not exist" containerID="11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.832911 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c"} err="failed to get container status \"11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c\": rpc error: code = NotFound desc = could not find container \"11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c\": container with ID starting with 11a9bb1699bdf4d7f46fc430e9b9a0e97a548cf9fb0680329a2862bf385f275c not found: ID does not exist" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.833018 5033 scope.go:117] "RemoveContainer" containerID="42022bed01303b80794bbb3c3b52cc574dde5c9df902abd8f0a33afed9f3a7d7" Feb 26 17:21:08 crc kubenswrapper[5033]: E0226 17:21:08.833540 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42022bed01303b80794bbb3c3b52cc574dde5c9df902abd8f0a33afed9f3a7d7\": container with ID starting with 42022bed01303b80794bbb3c3b52cc574dde5c9df902abd8f0a33afed9f3a7d7 not found: ID does not exist" containerID="42022bed01303b80794bbb3c3b52cc574dde5c9df902abd8f0a33afed9f3a7d7" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.833594 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42022bed01303b80794bbb3c3b52cc574dde5c9df902abd8f0a33afed9f3a7d7"} err="failed to get container status 
\"42022bed01303b80794bbb3c3b52cc574dde5c9df902abd8f0a33afed9f3a7d7\": rpc error: code = NotFound desc = could not find container \"42022bed01303b80794bbb3c3b52cc574dde5c9df902abd8f0a33afed9f3a7d7\": container with ID starting with 42022bed01303b80794bbb3c3b52cc574dde5c9df902abd8f0a33afed9f3a7d7 not found: ID does not exist" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.833626 5033 scope.go:117] "RemoveContainer" containerID="89b55ac11840c6cdf02ccec74ee54776dace1dfba6df41c7fcedd26e3727b458" Feb 26 17:21:08 crc kubenswrapper[5033]: E0226 17:21:08.833937 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89b55ac11840c6cdf02ccec74ee54776dace1dfba6df41c7fcedd26e3727b458\": container with ID starting with 89b55ac11840c6cdf02ccec74ee54776dace1dfba6df41c7fcedd26e3727b458 not found: ID does not exist" containerID="89b55ac11840c6cdf02ccec74ee54776dace1dfba6df41c7fcedd26e3727b458" Feb 26 17:21:08 crc kubenswrapper[5033]: I0226 17:21:08.834010 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b55ac11840c6cdf02ccec74ee54776dace1dfba6df41c7fcedd26e3727b458"} err="failed to get container status \"89b55ac11840c6cdf02ccec74ee54776dace1dfba6df41c7fcedd26e3727b458\": rpc error: code = NotFound desc = could not find container \"89b55ac11840c6cdf02ccec74ee54776dace1dfba6df41c7fcedd26e3727b458\": container with ID starting with 89b55ac11840c6cdf02ccec74ee54776dace1dfba6df41c7fcedd26e3727b458 not found: ID does not exist" Feb 26 17:21:09 crc kubenswrapper[5033]: I0226 17:21:09.643297 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e62a972-154f-4512-8fd9-8892364aa02e" path="/var/lib/kubelet/pods/9e62a972-154f-4512-8fd9-8892364aa02e/volumes" Feb 26 17:21:12 crc kubenswrapper[5033]: I0226 17:21:12.043836 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4zlph"] Feb 26 17:21:12 crc 
kubenswrapper[5033]: I0226 17:21:12.057311 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4zlph"] Feb 26 17:21:13 crc kubenswrapper[5033]: I0226 17:21:13.672289 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e138959-75c1-4a17-b437-07e9cb38b285" path="/var/lib/kubelet/pods/7e138959-75c1-4a17-b437-07e9cb38b285/volumes" Feb 26 17:21:15 crc kubenswrapper[5033]: I0226 17:21:15.627948 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:21:15 crc kubenswrapper[5033]: E0226 17:21:15.628701 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:21:26 crc kubenswrapper[5033]: I0226 17:21:26.037047 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mpzk7"] Feb 26 17:21:26 crc kubenswrapper[5033]: I0226 17:21:26.048727 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mpzk7"] Feb 26 17:21:26 crc kubenswrapper[5033]: I0226 17:21:26.793933 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-k6nsk"] Feb 26 17:21:26 crc kubenswrapper[5033]: E0226 17:21:26.794793 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e62a972-154f-4512-8fd9-8892364aa02e" containerName="registry-server" Feb 26 17:21:26 crc kubenswrapper[5033]: I0226 17:21:26.795331 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e62a972-154f-4512-8fd9-8892364aa02e" containerName="registry-server" Feb 26 17:21:26 crc kubenswrapper[5033]: E0226 17:21:26.795393 
5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e62a972-154f-4512-8fd9-8892364aa02e" containerName="extract-utilities" Feb 26 17:21:26 crc kubenswrapper[5033]: I0226 17:21:26.795410 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e62a972-154f-4512-8fd9-8892364aa02e" containerName="extract-utilities" Feb 26 17:21:26 crc kubenswrapper[5033]: E0226 17:21:26.795429 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e62a972-154f-4512-8fd9-8892364aa02e" containerName="extract-content" Feb 26 17:21:26 crc kubenswrapper[5033]: I0226 17:21:26.795443 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e62a972-154f-4512-8fd9-8892364aa02e" containerName="extract-content" Feb 26 17:21:26 crc kubenswrapper[5033]: I0226 17:21:26.795708 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e62a972-154f-4512-8fd9-8892364aa02e" containerName="registry-server" Feb 26 17:21:26 crc kubenswrapper[5033]: I0226 17:21:26.796871 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-k6nsk" Feb 26 17:21:26 crc kubenswrapper[5033]: I0226 17:21:26.816230 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-k6nsk"] Feb 26 17:21:26 crc kubenswrapper[5033]: I0226 17:21:26.972651 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2239b58b-057e-49c2-8af4-1831f198a780-operator-scripts\") pod \"octavia-db-create-k6nsk\" (UID: \"2239b58b-057e-49c2-8af4-1831f198a780\") " pod="openstack/octavia-db-create-k6nsk" Feb 26 17:21:26 crc kubenswrapper[5033]: I0226 17:21:26.973033 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rt2m\" (UniqueName: \"kubernetes.io/projected/2239b58b-057e-49c2-8af4-1831f198a780-kube-api-access-2rt2m\") pod \"octavia-db-create-k6nsk\" (UID: \"2239b58b-057e-49c2-8af4-1831f198a780\") " pod="openstack/octavia-db-create-k6nsk" Feb 26 17:21:27 crc kubenswrapper[5033]: I0226 17:21:27.074480 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rt2m\" (UniqueName: \"kubernetes.io/projected/2239b58b-057e-49c2-8af4-1831f198a780-kube-api-access-2rt2m\") pod \"octavia-db-create-k6nsk\" (UID: \"2239b58b-057e-49c2-8af4-1831f198a780\") " pod="openstack/octavia-db-create-k6nsk" Feb 26 17:21:27 crc kubenswrapper[5033]: I0226 17:21:27.074650 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2239b58b-057e-49c2-8af4-1831f198a780-operator-scripts\") pod \"octavia-db-create-k6nsk\" (UID: \"2239b58b-057e-49c2-8af4-1831f198a780\") " pod="openstack/octavia-db-create-k6nsk" Feb 26 17:21:27 crc kubenswrapper[5033]: I0226 17:21:27.075392 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2239b58b-057e-49c2-8af4-1831f198a780-operator-scripts\") pod \"octavia-db-create-k6nsk\" (UID: \"2239b58b-057e-49c2-8af4-1831f198a780\") " pod="openstack/octavia-db-create-k6nsk" Feb 26 17:21:27 crc kubenswrapper[5033]: I0226 17:21:27.095579 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rt2m\" (UniqueName: \"kubernetes.io/projected/2239b58b-057e-49c2-8af4-1831f198a780-kube-api-access-2rt2m\") pod \"octavia-db-create-k6nsk\" (UID: \"2239b58b-057e-49c2-8af4-1831f198a780\") " pod="openstack/octavia-db-create-k6nsk" Feb 26 17:21:27 crc kubenswrapper[5033]: I0226 17:21:27.133321 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-k6nsk" Feb 26 17:21:27 crc kubenswrapper[5033]: I0226 17:21:27.626671 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-k6nsk"] Feb 26 17:21:27 crc kubenswrapper[5033]: I0226 17:21:27.639089 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="520a1008-f450-4296-9c63-b8d869095196" path="/var/lib/kubelet/pods/520a1008-f450-4296-9c63-b8d869095196/volumes" Feb 26 17:21:27 crc kubenswrapper[5033]: I0226 17:21:27.938672 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-k6nsk" event={"ID":"2239b58b-057e-49c2-8af4-1831f198a780","Type":"ContainerStarted","Data":"fa683db8ff4cb1baad9ad3fa63b1f0c099d8dc9559ebce37cb20e43c039e2470"} Feb 26 17:21:27 crc kubenswrapper[5033]: I0226 17:21:27.939033 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-k6nsk" event={"ID":"2239b58b-057e-49c2-8af4-1831f198a780","Type":"ContainerStarted","Data":"94670088dd2a5b486ef3f75fdaedc95d67a921329293102f02c1ed63c65991ef"} Feb 26 17:21:27 crc kubenswrapper[5033]: I0226 17:21:27.960916 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-create-k6nsk" 
podStartSLOduration=1.9608937229999999 podStartE2EDuration="1.960893723s" podCreationTimestamp="2026-02-26 17:21:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:21:27.954918216 +0000 UTC m=+6015.167516866" watchObservedRunningTime="2026-02-26 17:21:27.960893723 +0000 UTC m=+6015.173492363" Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.122703 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-3b95-account-create-update-25g8v"] Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.124160 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-3b95-account-create-update-25g8v" Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.126306 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.137109 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-3b95-account-create-update-25g8v"] Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.297567 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrzm6\" (UniqueName: \"kubernetes.io/projected/8545ace8-57f4-47fa-9512-ce48b7998050-kube-api-access-jrzm6\") pod \"octavia-3b95-account-create-update-25g8v\" (UID: \"8545ace8-57f4-47fa-9512-ce48b7998050\") " pod="openstack/octavia-3b95-account-create-update-25g8v" Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.297637 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8545ace8-57f4-47fa-9512-ce48b7998050-operator-scripts\") pod \"octavia-3b95-account-create-update-25g8v\" (UID: \"8545ace8-57f4-47fa-9512-ce48b7998050\") " pod="openstack/octavia-3b95-account-create-update-25g8v" Feb 26 
17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.399634 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrzm6\" (UniqueName: \"kubernetes.io/projected/8545ace8-57f4-47fa-9512-ce48b7998050-kube-api-access-jrzm6\") pod \"octavia-3b95-account-create-update-25g8v\" (UID: \"8545ace8-57f4-47fa-9512-ce48b7998050\") " pod="openstack/octavia-3b95-account-create-update-25g8v" Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.399764 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8545ace8-57f4-47fa-9512-ce48b7998050-operator-scripts\") pod \"octavia-3b95-account-create-update-25g8v\" (UID: \"8545ace8-57f4-47fa-9512-ce48b7998050\") " pod="openstack/octavia-3b95-account-create-update-25g8v" Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.401178 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8545ace8-57f4-47fa-9512-ce48b7998050-operator-scripts\") pod \"octavia-3b95-account-create-update-25g8v\" (UID: \"8545ace8-57f4-47fa-9512-ce48b7998050\") " pod="openstack/octavia-3b95-account-create-update-25g8v" Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.424097 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrzm6\" (UniqueName: \"kubernetes.io/projected/8545ace8-57f4-47fa-9512-ce48b7998050-kube-api-access-jrzm6\") pod \"octavia-3b95-account-create-update-25g8v\" (UID: \"8545ace8-57f4-47fa-9512-ce48b7998050\") " pod="openstack/octavia-3b95-account-create-update-25g8v" Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.462352 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-3b95-account-create-update-25g8v" Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.627679 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:21:28 crc kubenswrapper[5033]: E0226 17:21:28.628290 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.935166 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-3b95-account-create-update-25g8v"] Feb 26 17:21:28 crc kubenswrapper[5033]: W0226 17:21:28.936825 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8545ace8_57f4_47fa_9512_ce48b7998050.slice/crio-59736131ae0a3348b461dbf411b7bded23e85c06ccb21e5f8505110508f117ea WatchSource:0}: Error finding container 59736131ae0a3348b461dbf411b7bded23e85c06ccb21e5f8505110508f117ea: Status 404 returned error can't find the container with id 59736131ae0a3348b461dbf411b7bded23e85c06ccb21e5f8505110508f117ea Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.951555 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-3b95-account-create-update-25g8v" event={"ID":"8545ace8-57f4-47fa-9512-ce48b7998050","Type":"ContainerStarted","Data":"59736131ae0a3348b461dbf411b7bded23e85c06ccb21e5f8505110508f117ea"} Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.953421 5033 generic.go:334] "Generic (PLEG): container finished" podID="2239b58b-057e-49c2-8af4-1831f198a780" 
containerID="fa683db8ff4cb1baad9ad3fa63b1f0c099d8dc9559ebce37cb20e43c039e2470" exitCode=0 Feb 26 17:21:28 crc kubenswrapper[5033]: I0226 17:21:28.953523 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-k6nsk" event={"ID":"2239b58b-057e-49c2-8af4-1831f198a780","Type":"ContainerDied","Data":"fa683db8ff4cb1baad9ad3fa63b1f0c099d8dc9559ebce37cb20e43c039e2470"} Feb 26 17:21:29 crc kubenswrapper[5033]: I0226 17:21:29.965641 5033 generic.go:334] "Generic (PLEG): container finished" podID="8545ace8-57f4-47fa-9512-ce48b7998050" containerID="e000ca1d86c14205abcae6a92631b771520e3ed0773a2cb4d0ac17dd0504d69e" exitCode=0 Feb 26 17:21:29 crc kubenswrapper[5033]: I0226 17:21:29.965715 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-3b95-account-create-update-25g8v" event={"ID":"8545ace8-57f4-47fa-9512-ce48b7998050","Type":"ContainerDied","Data":"e000ca1d86c14205abcae6a92631b771520e3ed0773a2cb4d0ac17dd0504d69e"} Feb 26 17:21:30 crc kubenswrapper[5033]: I0226 17:21:30.358940 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-k6nsk" Feb 26 17:21:30 crc kubenswrapper[5033]: I0226 17:21:30.543219 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rt2m\" (UniqueName: \"kubernetes.io/projected/2239b58b-057e-49c2-8af4-1831f198a780-kube-api-access-2rt2m\") pod \"2239b58b-057e-49c2-8af4-1831f198a780\" (UID: \"2239b58b-057e-49c2-8af4-1831f198a780\") " Feb 26 17:21:30 crc kubenswrapper[5033]: I0226 17:21:30.543696 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2239b58b-057e-49c2-8af4-1831f198a780-operator-scripts\") pod \"2239b58b-057e-49c2-8af4-1831f198a780\" (UID: \"2239b58b-057e-49c2-8af4-1831f198a780\") " Feb 26 17:21:30 crc kubenswrapper[5033]: I0226 17:21:30.544751 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2239b58b-057e-49c2-8af4-1831f198a780-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2239b58b-057e-49c2-8af4-1831f198a780" (UID: "2239b58b-057e-49c2-8af4-1831f198a780"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:21:30 crc kubenswrapper[5033]: I0226 17:21:30.551495 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2239b58b-057e-49c2-8af4-1831f198a780-kube-api-access-2rt2m" (OuterVolumeSpecName: "kube-api-access-2rt2m") pod "2239b58b-057e-49c2-8af4-1831f198a780" (UID: "2239b58b-057e-49c2-8af4-1831f198a780"). InnerVolumeSpecName "kube-api-access-2rt2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:21:30 crc kubenswrapper[5033]: I0226 17:21:30.647079 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rt2m\" (UniqueName: \"kubernetes.io/projected/2239b58b-057e-49c2-8af4-1831f198a780-kube-api-access-2rt2m\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:30 crc kubenswrapper[5033]: I0226 17:21:30.647112 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2239b58b-057e-49c2-8af4-1831f198a780-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:30 crc kubenswrapper[5033]: I0226 17:21:30.982654 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-k6nsk" Feb 26 17:21:30 crc kubenswrapper[5033]: I0226 17:21:30.982633 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-k6nsk" event={"ID":"2239b58b-057e-49c2-8af4-1831f198a780","Type":"ContainerDied","Data":"94670088dd2a5b486ef3f75fdaedc95d67a921329293102f02c1ed63c65991ef"} Feb 26 17:21:30 crc kubenswrapper[5033]: I0226 17:21:30.982855 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94670088dd2a5b486ef3f75fdaedc95d67a921329293102f02c1ed63c65991ef" Feb 26 17:21:31 crc kubenswrapper[5033]: I0226 17:21:31.363133 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-3b95-account-create-update-25g8v" Feb 26 17:21:31 crc kubenswrapper[5033]: I0226 17:21:31.465072 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8545ace8-57f4-47fa-9512-ce48b7998050-operator-scripts\") pod \"8545ace8-57f4-47fa-9512-ce48b7998050\" (UID: \"8545ace8-57f4-47fa-9512-ce48b7998050\") " Feb 26 17:21:31 crc kubenswrapper[5033]: I0226 17:21:31.465166 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrzm6\" (UniqueName: \"kubernetes.io/projected/8545ace8-57f4-47fa-9512-ce48b7998050-kube-api-access-jrzm6\") pod \"8545ace8-57f4-47fa-9512-ce48b7998050\" (UID: \"8545ace8-57f4-47fa-9512-ce48b7998050\") " Feb 26 17:21:31 crc kubenswrapper[5033]: I0226 17:21:31.465529 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8545ace8-57f4-47fa-9512-ce48b7998050-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8545ace8-57f4-47fa-9512-ce48b7998050" (UID: "8545ace8-57f4-47fa-9512-ce48b7998050"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:21:31 crc kubenswrapper[5033]: I0226 17:21:31.465974 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8545ace8-57f4-47fa-9512-ce48b7998050-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:31 crc kubenswrapper[5033]: I0226 17:21:31.476527 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8545ace8-57f4-47fa-9512-ce48b7998050-kube-api-access-jrzm6" (OuterVolumeSpecName: "kube-api-access-jrzm6") pod "8545ace8-57f4-47fa-9512-ce48b7998050" (UID: "8545ace8-57f4-47fa-9512-ce48b7998050"). InnerVolumeSpecName "kube-api-access-jrzm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:21:31 crc kubenswrapper[5033]: I0226 17:21:31.567767 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrzm6\" (UniqueName: \"kubernetes.io/projected/8545ace8-57f4-47fa-9512-ce48b7998050-kube-api-access-jrzm6\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:31 crc kubenswrapper[5033]: I0226 17:21:31.996902 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-3b95-account-create-update-25g8v" event={"ID":"8545ace8-57f4-47fa-9512-ce48b7998050","Type":"ContainerDied","Data":"59736131ae0a3348b461dbf411b7bded23e85c06ccb21e5f8505110508f117ea"} Feb 26 17:21:31 crc kubenswrapper[5033]: I0226 17:21:31.996940 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59736131ae0a3348b461dbf411b7bded23e85c06ccb21e5f8505110508f117ea" Feb 26 17:21:31 crc kubenswrapper[5033]: I0226 17:21:31.996976 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-3b95-account-create-update-25g8v" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.340628 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-qmq7p"] Feb 26 17:21:33 crc kubenswrapper[5033]: E0226 17:21:33.341503 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2239b58b-057e-49c2-8af4-1831f198a780" containerName="mariadb-database-create" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.341522 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2239b58b-057e-49c2-8af4-1831f198a780" containerName="mariadb-database-create" Feb 26 17:21:33 crc kubenswrapper[5033]: E0226 17:21:33.341546 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8545ace8-57f4-47fa-9512-ce48b7998050" containerName="mariadb-account-create-update" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.341554 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8545ace8-57f4-47fa-9512-ce48b7998050" containerName="mariadb-account-create-update" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.341765 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8545ace8-57f4-47fa-9512-ce48b7998050" containerName="mariadb-account-create-update" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.341799 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2239b58b-057e-49c2-8af4-1831f198a780" containerName="mariadb-database-create" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.342553 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-qmq7p" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.354790 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-qmq7p"] Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.502336 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-operator-scripts\") pod \"octavia-persistence-db-create-qmq7p\" (UID: \"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2\") " pod="openstack/octavia-persistence-db-create-qmq7p" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.502542 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptjvn\" (UniqueName: \"kubernetes.io/projected/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-kube-api-access-ptjvn\") pod \"octavia-persistence-db-create-qmq7p\" (UID: \"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2\") " pod="openstack/octavia-persistence-db-create-qmq7p" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.604972 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-operator-scripts\") pod \"octavia-persistence-db-create-qmq7p\" (UID: \"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2\") " pod="openstack/octavia-persistence-db-create-qmq7p" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.605607 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptjvn\" (UniqueName: \"kubernetes.io/projected/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-kube-api-access-ptjvn\") pod \"octavia-persistence-db-create-qmq7p\" (UID: \"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2\") " pod="openstack/octavia-persistence-db-create-qmq7p" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.606059 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-operator-scripts\") pod \"octavia-persistence-db-create-qmq7p\" (UID: \"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2\") " pod="openstack/octavia-persistence-db-create-qmq7p" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.624679 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptjvn\" (UniqueName: \"kubernetes.io/projected/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-kube-api-access-ptjvn\") pod \"octavia-persistence-db-create-qmq7p\" (UID: \"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2\") " pod="openstack/octavia-persistence-db-create-qmq7p" Feb 26 17:21:33 crc kubenswrapper[5033]: I0226 17:21:33.666185 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-qmq7p" Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.107551 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-qmq7p"] Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.576694 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-65e6-account-create-update-fd8v5"] Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.578041 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-65e6-account-create-update-fd8v5" Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.580052 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.589634 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-65e6-account-create-update-fd8v5"] Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.724997 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-operator-scripts\") pod \"octavia-65e6-account-create-update-fd8v5\" (UID: \"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b\") " pod="openstack/octavia-65e6-account-create-update-fd8v5" Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.725452 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjq7s\" (UniqueName: \"kubernetes.io/projected/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-kube-api-access-zjq7s\") pod \"octavia-65e6-account-create-update-fd8v5\" (UID: \"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b\") " pod="openstack/octavia-65e6-account-create-update-fd8v5" Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.827101 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zjq7s\" (UniqueName: \"kubernetes.io/projected/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-kube-api-access-zjq7s\") pod \"octavia-65e6-account-create-update-fd8v5\" (UID: \"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b\") " pod="openstack/octavia-65e6-account-create-update-fd8v5" Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.827184 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-operator-scripts\") pod \"octavia-65e6-account-create-update-fd8v5\" (UID: \"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b\") " pod="openstack/octavia-65e6-account-create-update-fd8v5" Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.828248 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-operator-scripts\") pod \"octavia-65e6-account-create-update-fd8v5\" (UID: \"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b\") " pod="openstack/octavia-65e6-account-create-update-fd8v5" Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.858006 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjq7s\" (UniqueName: \"kubernetes.io/projected/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-kube-api-access-zjq7s\") pod \"octavia-65e6-account-create-update-fd8v5\" (UID: \"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b\") " pod="openstack/octavia-65e6-account-create-update-fd8v5" Feb 26 17:21:34 crc kubenswrapper[5033]: I0226 17:21:34.894111 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-65e6-account-create-update-fd8v5" Feb 26 17:21:35 crc kubenswrapper[5033]: I0226 17:21:35.022009 5033 scope.go:117] "RemoveContainer" containerID="48acbde38630df76c413870a1a1f323e727866ffa6748253f75a81fe1a11634d" Feb 26 17:21:35 crc kubenswrapper[5033]: I0226 17:21:35.038699 5033 generic.go:334] "Generic (PLEG): container finished" podID="ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2" containerID="c15f8ca1196db9ebb1f324d1e18c4ae728bae44225bcca4c0f647687b8a973c7" exitCode=0 Feb 26 17:21:35 crc kubenswrapper[5033]: I0226 17:21:35.038752 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-qmq7p" event={"ID":"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2","Type":"ContainerDied","Data":"c15f8ca1196db9ebb1f324d1e18c4ae728bae44225bcca4c0f647687b8a973c7"} Feb 26 17:21:35 crc kubenswrapper[5033]: I0226 17:21:35.038782 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-qmq7p" event={"ID":"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2","Type":"ContainerStarted","Data":"9b7ba7608f4fe9cea3b404caf20496a16b51cb6eb165c4f31cf3d78208379672"} Feb 26 17:21:35 crc kubenswrapper[5033]: I0226 17:21:35.077401 5033 scope.go:117] "RemoveContainer" containerID="733b7580735c4191096036e3b8ee1b966877551975a71f346439b63072f4eeb9" Feb 26 17:21:35 crc kubenswrapper[5033]: I0226 17:21:35.135710 5033 scope.go:117] "RemoveContainer" containerID="73e44ea2f1b4db87278f792b5a38f0a66ba4bf65d2e04987773723f0d2854c39" Feb 26 17:21:35 crc kubenswrapper[5033]: I0226 17:21:35.156783 5033 scope.go:117] "RemoveContainer" containerID="948f88cdce6c89de4e1fcfdd26aabe5d8fe6cc4b39a68d2d1be7951286421496" Feb 26 17:21:35 crc kubenswrapper[5033]: I0226 17:21:35.334768 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-65e6-account-create-update-fd8v5"] Feb 26 17:21:35 crc kubenswrapper[5033]: W0226 17:21:35.339103 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaedf48b_90d4_4bb6_bf73_51c442ea1a9b.slice/crio-5193097b50588ac6b09b259be6428477c9d23d6103d162e166ddafaa2b951a9e WatchSource:0}: Error finding container 5193097b50588ac6b09b259be6428477c9d23d6103d162e166ddafaa2b951a9e: Status 404 returned error can't find the container with id 5193097b50588ac6b09b259be6428477c9d23d6103d162e166ddafaa2b951a9e Feb 26 17:21:36 crc kubenswrapper[5033]: I0226 17:21:36.054436 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-65e6-account-create-update-fd8v5" event={"ID":"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b","Type":"ContainerStarted","Data":"2cba4e78489b19417a4be2c8cde5f88743f7ab972c93d7958b2a1238dee33927"} Feb 26 17:21:36 crc kubenswrapper[5033]: I0226 17:21:36.055940 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-65e6-account-create-update-fd8v5" event={"ID":"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b","Type":"ContainerStarted","Data":"5193097b50588ac6b09b259be6428477c9d23d6103d162e166ddafaa2b951a9e"} Feb 26 17:21:36 crc kubenswrapper[5033]: I0226 17:21:36.075776 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-65e6-account-create-update-fd8v5" podStartSLOduration=2.075750893 podStartE2EDuration="2.075750893s" podCreationTimestamp="2026-02-26 17:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:21:36.069904611 +0000 UTC m=+6023.282503271" watchObservedRunningTime="2026-02-26 17:21:36.075750893 +0000 UTC m=+6023.288349543" Feb 26 17:21:36 crc kubenswrapper[5033]: I0226 17:21:36.426565 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-qmq7p" Feb 26 17:21:36 crc kubenswrapper[5033]: I0226 17:21:36.560111 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-operator-scripts\") pod \"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2\" (UID: \"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2\") " Feb 26 17:21:36 crc kubenswrapper[5033]: I0226 17:21:36.560178 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptjvn\" (UniqueName: \"kubernetes.io/projected/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-kube-api-access-ptjvn\") pod \"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2\" (UID: \"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2\") " Feb 26 17:21:36 crc kubenswrapper[5033]: I0226 17:21:36.560859 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2" (UID: "ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:21:36 crc kubenswrapper[5033]: I0226 17:21:36.566187 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-kube-api-access-ptjvn" (OuterVolumeSpecName: "kube-api-access-ptjvn") pod "ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2" (UID: "ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2"). InnerVolumeSpecName "kube-api-access-ptjvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:21:36 crc kubenswrapper[5033]: I0226 17:21:36.662839 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:36 crc kubenswrapper[5033]: I0226 17:21:36.662878 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptjvn\" (UniqueName: \"kubernetes.io/projected/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2-kube-api-access-ptjvn\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:37 crc kubenswrapper[5033]: I0226 17:21:37.065922 5033 generic.go:334] "Generic (PLEG): container finished" podID="aaedf48b-90d4-4bb6-bf73-51c442ea1a9b" containerID="2cba4e78489b19417a4be2c8cde5f88743f7ab972c93d7958b2a1238dee33927" exitCode=0 Feb 26 17:21:37 crc kubenswrapper[5033]: I0226 17:21:37.066004 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-65e6-account-create-update-fd8v5" event={"ID":"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b","Type":"ContainerDied","Data":"2cba4e78489b19417a4be2c8cde5f88743f7ab972c93d7958b2a1238dee33927"} Feb 26 17:21:37 crc kubenswrapper[5033]: I0226 17:21:37.068856 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-qmq7p" event={"ID":"ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2","Type":"ContainerDied","Data":"9b7ba7608f4fe9cea3b404caf20496a16b51cb6eb165c4f31cf3d78208379672"} Feb 26 17:21:37 crc kubenswrapper[5033]: I0226 17:21:37.068893 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b7ba7608f4fe9cea3b404caf20496a16b51cb6eb165c4f31cf3d78208379672" Feb 26 17:21:37 crc kubenswrapper[5033]: I0226 17:21:37.068955 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-qmq7p" Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.411287 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-65e6-account-create-update-fd8v5" Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.498291 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-operator-scripts\") pod \"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b\" (UID: \"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b\") " Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.498413 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjq7s\" (UniqueName: \"kubernetes.io/projected/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-kube-api-access-zjq7s\") pod \"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b\" (UID: \"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b\") " Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.499165 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aaedf48b-90d4-4bb6-bf73-51c442ea1a9b" (UID: "aaedf48b-90d4-4bb6-bf73-51c442ea1a9b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.506630 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-kube-api-access-zjq7s" (OuterVolumeSpecName: "kube-api-access-zjq7s") pod "aaedf48b-90d4-4bb6-bf73-51c442ea1a9b" (UID: "aaedf48b-90d4-4bb6-bf73-51c442ea1a9b"). InnerVolumeSpecName "kube-api-access-zjq7s". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.522380 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kgzlq" podUID="fe83e329-75c3-41af-a979-dd9aa57873b1" containerName="ovn-controller" probeResult="failure" output=<
Feb 26 17:21:38 crc kubenswrapper[5033]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 26 17:21:38 crc kubenswrapper[5033]: >
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.554447 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tvbth"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.571520 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tvbth"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.600697 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.600728 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjq7s\" (UniqueName: \"kubernetes.io/projected/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b-kube-api-access-zjq7s\") on node \"crc\" DevicePath \"\""
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.665510 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kgzlq-config-c6xm9"]
Feb 26 17:21:38 crc kubenswrapper[5033]: E0226 17:21:38.665976 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaedf48b-90d4-4bb6-bf73-51c442ea1a9b" containerName="mariadb-account-create-update"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.665990 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaedf48b-90d4-4bb6-bf73-51c442ea1a9b" containerName="mariadb-account-create-update"
Feb 26 17:21:38 crc kubenswrapper[5033]: E0226 17:21:38.666007 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2" containerName="mariadb-database-create"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.666013 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2" containerName="mariadb-database-create"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.666224 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2" containerName="mariadb-database-create"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.666300 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaedf48b-90d4-4bb6-bf73-51c442ea1a9b" containerName="mariadb-account-create-update"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.667050 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.675657 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.682806 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kgzlq-config-c6xm9"]
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.812803 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-additional-scripts\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.812853 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74kmz\" (UniqueName: \"kubernetes.io/projected/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-kube-api-access-74kmz\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.813009 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-scripts\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.813072 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run-ovn\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.813100 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-log-ovn\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.813169 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.914845 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run-ovn\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.914916 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-log-ovn\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.914959 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.914985 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-additional-scripts\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.915014 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74kmz\" (UniqueName: \"kubernetes.io/projected/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-kube-api-access-74kmz\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.915089 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-scripts\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.915222 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run-ovn\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.915278 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.915278 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-log-ovn\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.916136 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-additional-scripts\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.917653 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-scripts\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.936793 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74kmz\" (UniqueName: \"kubernetes.io/projected/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-kube-api-access-74kmz\") pod \"ovn-controller-kgzlq-config-c6xm9\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") " pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:38 crc kubenswrapper[5033]: I0226 17:21:38.987194 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:39 crc kubenswrapper[5033]: I0226 17:21:39.101118 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-65e6-account-create-update-fd8v5" event={"ID":"aaedf48b-90d4-4bb6-bf73-51c442ea1a9b","Type":"ContainerDied","Data":"5193097b50588ac6b09b259be6428477c9d23d6103d162e166ddafaa2b951a9e"}
Feb 26 17:21:39 crc kubenswrapper[5033]: I0226 17:21:39.101517 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5193097b50588ac6b09b259be6428477c9d23d6103d162e166ddafaa2b951a9e"
Feb 26 17:21:39 crc kubenswrapper[5033]: I0226 17:21:39.101133 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-65e6-account-create-update-fd8v5"
Feb 26 17:21:39 crc kubenswrapper[5033]: E0226 17:21:39.314521 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaedf48b_90d4_4bb6_bf73_51c442ea1a9b.slice\": RecentStats: unable to find data in memory cache]"
Feb 26 17:21:39 crc kubenswrapper[5033]: I0226 17:21:39.550022 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kgzlq-config-c6xm9"]
Feb 26 17:21:39 crc kubenswrapper[5033]: W0226 17:21:39.551630 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8bb2fb4_9978_4323_b29c_e8a099ad7b7a.slice/crio-a77b0439a83d6c9981f313e53ad903e73bc8cb04cc531b2f8ef82b95d8171282 WatchSource:0}: Error finding container a77b0439a83d6c9981f313e53ad903e73bc8cb04cc531b2f8ef82b95d8171282: Status 404 returned error can't find the container with id a77b0439a83d6c9981f313e53ad903e73bc8cb04cc531b2f8ef82b95d8171282
Feb 26 17:21:39 crc kubenswrapper[5033]: I0226 17:21:39.629522 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421"
Feb 26 17:21:39 crc kubenswrapper[5033]: E0226 17:21:39.629737 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:21:39 crc kubenswrapper[5033]: I0226 17:21:39.888205 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-b4445f8b9-64k79"]
Feb 26 17:21:39 crc kubenswrapper[5033]: I0226 17:21:39.890293 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:39 crc kubenswrapper[5033]: I0226 17:21:39.892536 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-9zp9v"
Feb 26 17:21:39 crc kubenswrapper[5033]: I0226 17:21:39.892574 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data"
Feb 26 17:21:39 crc kubenswrapper[5033]: I0226 17:21:39.892717 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts"
Feb 26 17:21:39 crc kubenswrapper[5033]: I0226 17:21:39.921922 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-b4445f8b9-64k79"]
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.052584 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-scripts\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.052643 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-combined-ca-bundle\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.052670 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-config-data\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.052724 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-octavia-run\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.052747 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-config-data-merged\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.112222 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kgzlq-config-c6xm9" event={"ID":"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a","Type":"ContainerStarted","Data":"af5cd57c976428c309db06e239a21d93478faf76a129279a9e6b1e8bb64e9495"}
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.112279 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kgzlq-config-c6xm9" event={"ID":"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a","Type":"ContainerStarted","Data":"a77b0439a83d6c9981f313e53ad903e73bc8cb04cc531b2f8ef82b95d8171282"}
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.131646 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-kgzlq-config-c6xm9" podStartSLOduration=2.131630822 podStartE2EDuration="2.131630822s" podCreationTimestamp="2026-02-26 17:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:21:40.124835736 +0000 UTC m=+6027.337434386" watchObservedRunningTime="2026-02-26 17:21:40.131630822 +0000 UTC m=+6027.344229472"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.154466 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-scripts\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.154515 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-combined-ca-bundle\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.154533 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-config-data\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.154579 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-octavia-run\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.154600 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-config-data-merged\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.155319 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-octavia-run\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.155517 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-config-data-merged\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.161966 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-scripts\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.162109 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-combined-ca-bundle\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.163702 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4-config-data\") pod \"octavia-api-b4445f8b9-64k79\" (UID: \"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4\") " pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.218194 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-b4445f8b9-64k79"
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.873200 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-b4445f8b9-64k79"]
Feb 26 17:21:40 crc kubenswrapper[5033]: I0226 17:21:40.881820 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 17:21:41 crc kubenswrapper[5033]: I0226 17:21:41.124053 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-b4445f8b9-64k79" event={"ID":"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4","Type":"ContainerStarted","Data":"22572cf1f55f268f7928375db85079b366ac5cfae10f48696fd5810615e22c96"}
Feb 26 17:21:41 crc kubenswrapper[5033]: I0226 17:21:41.125786 5033 generic.go:334] "Generic (PLEG): container finished" podID="d8bb2fb4-9978-4323-b29c-e8a099ad7b7a" containerID="af5cd57c976428c309db06e239a21d93478faf76a129279a9e6b1e8bb64e9495" exitCode=0
Feb 26 17:21:41 crc kubenswrapper[5033]: I0226 17:21:41.125832 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kgzlq-config-c6xm9" event={"ID":"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a","Type":"ContainerDied","Data":"af5cd57c976428c309db06e239a21d93478faf76a129279a9e6b1e8bb64e9495"}
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.575794 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.704874 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-additional-scripts\") pod \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") "
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.704965 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run\") pod \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") "
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.705013 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run-ovn\") pod \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") "
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.705092 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74kmz\" (UniqueName: \"kubernetes.io/projected/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-kube-api-access-74kmz\") pod \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") "
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.705131 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-scripts\") pod \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") "
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.705181 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-log-ovn\") pod \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\" (UID: \"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a\") "
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.705118 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run" (OuterVolumeSpecName: "var-run") pod "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a" (UID: "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.705149 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a" (UID: "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.705292 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a" (UID: "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.705726 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a" (UID: "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.706049 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-scripts" (OuterVolumeSpecName: "scripts") pod "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a" (UID: "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.707997 5033 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.708023 5033 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.708037 5033 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run\") on node \"crc\" DevicePath \"\""
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.708045 5033 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.708054 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.714676 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-kube-api-access-74kmz" (OuterVolumeSpecName: "kube-api-access-74kmz") pod "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a" (UID: "d8bb2fb4-9978-4323-b29c-e8a099ad7b7a"). InnerVolumeSpecName "kube-api-access-74kmz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:21:42 crc kubenswrapper[5033]: I0226 17:21:42.809828 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74kmz\" (UniqueName: \"kubernetes.io/projected/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a-kube-api-access-74kmz\") on node \"crc\" DevicePath \"\""
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.151943 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kgzlq-config-c6xm9" event={"ID":"d8bb2fb4-9978-4323-b29c-e8a099ad7b7a","Type":"ContainerDied","Data":"a77b0439a83d6c9981f313e53ad903e73bc8cb04cc531b2f8ef82b95d8171282"}
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.152243 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a77b0439a83d6c9981f313e53ad903e73bc8cb04cc531b2f8ef82b95d8171282"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.151996 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kgzlq-config-c6xm9"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.212480 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kgzlq-config-c6xm9"]
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.224890 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kgzlq-config-c6xm9"]
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.348427 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kgzlq-config-569gn"]
Feb 26 17:21:43 crc kubenswrapper[5033]: E0226 17:21:43.348907 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bb2fb4-9978-4323-b29c-e8a099ad7b7a" containerName="ovn-config"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.348934 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bb2fb4-9978-4323-b29c-e8a099ad7b7a" containerName="ovn-config"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.349193 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bb2fb4-9978-4323-b29c-e8a099ad7b7a" containerName="ovn-config"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.350088 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.352327 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.361227 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kgzlq-config-569gn"]
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.425637 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run-ovn\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.425733 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-scripts\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.425791 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.425829 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-log-ovn\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.425856 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcqsl\" (UniqueName: \"kubernetes.io/projected/1b54ed82-07f7-410e-85cc-5678a3529352-kube-api-access-gcqsl\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.425877 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-additional-scripts\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.527248 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run-ovn\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.527324 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-scripts\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.527372 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.527409 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-log-ovn\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.527440 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcqsl\" (UniqueName: \"kubernetes.io/projected/1b54ed82-07f7-410e-85cc-5678a3529352-kube-api-access-gcqsl\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.527461 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-additional-scripts\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.527903 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-log-ovn\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.527992 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.528178 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-additional-scripts\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.529431 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-scripts\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.529712 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run-ovn\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.534019 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-kgzlq"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.565314 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcqsl\" (UniqueName: \"kubernetes.io/projected/1b54ed82-07f7-410e-85cc-5678a3529352-kube-api-access-gcqsl\") pod \"ovn-controller-kgzlq-config-569gn\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.638907 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8bb2fb4-9978-4323-b29c-e8a099ad7b7a" path="/var/lib/kubelet/pods/d8bb2fb4-9978-4323-b29c-e8a099ad7b7a/volumes"
Feb 26 17:21:43 crc kubenswrapper[5033]: I0226 17:21:43.669671 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kgzlq-config-569gn"
Feb 26 17:21:44 crc kubenswrapper[5033]: I0226 17:21:44.318245 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kgzlq-config-569gn"]
Feb 26 17:21:44 crc kubenswrapper[5033]: W0226 17:21:44.327693 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b54ed82_07f7_410e_85cc_5678a3529352.slice/crio-952f8952a5b6acdb8470d4d16fc2cd9ef9a0846c2dd95fbbedac74c672dd6016 WatchSource:0}: Error finding container 952f8952a5b6acdb8470d4d16fc2cd9ef9a0846c2dd95fbbedac74c672dd6016: Status 404 returned error can't find the container with id 952f8952a5b6acdb8470d4d16fc2cd9ef9a0846c2dd95fbbedac74c672dd6016
Feb 26 17:21:45 crc kubenswrapper[5033]: I0226 17:21:45.176671 5033 generic.go:334] "Generic (PLEG): container finished" podID="1b54ed82-07f7-410e-85cc-5678a3529352" containerID="feca7ef87b1a921487bf90aab193db5a5347dbebdee98a36ece671abb9ffaee3" exitCode=0
Feb 26 17:21:45 crc kubenswrapper[5033]: I0226 17:21:45.176763 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kgzlq-config-569gn" event={"ID":"1b54ed82-07f7-410e-85cc-5678a3529352","Type":"ContainerDied","Data":"feca7ef87b1a921487bf90aab193db5a5347dbebdee98a36ece671abb9ffaee3"}
Feb 26 17:21:45 crc kubenswrapper[5033]: I0226 17:21:45.177447 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kgzlq-config-569gn" event={"ID":"1b54ed82-07f7-410e-85cc-5678a3529352","Type":"ContainerStarted","Data":"952f8952a5b6acdb8470d4d16fc2cd9ef9a0846c2dd95fbbedac74c672dd6016"}
Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.686807 5033 util.go:48] "No
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kgzlq-config-569gn" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.782974 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcqsl\" (UniqueName: \"kubernetes.io/projected/1b54ed82-07f7-410e-85cc-5678a3529352-kube-api-access-gcqsl\") pod \"1b54ed82-07f7-410e-85cc-5678a3529352\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.783332 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-log-ovn\") pod \"1b54ed82-07f7-410e-85cc-5678a3529352\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.783441 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run-ovn\") pod \"1b54ed82-07f7-410e-85cc-5678a3529352\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.783559 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-scripts\") pod \"1b54ed82-07f7-410e-85cc-5678a3529352\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.783657 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run\") pod \"1b54ed82-07f7-410e-85cc-5678a3529352\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.783461 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1b54ed82-07f7-410e-85cc-5678a3529352" (UID: "1b54ed82-07f7-410e-85cc-5678a3529352"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.783490 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1b54ed82-07f7-410e-85cc-5678a3529352" (UID: "1b54ed82-07f7-410e-85cc-5678a3529352"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.783950 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run" (OuterVolumeSpecName: "var-run") pod "1b54ed82-07f7-410e-85cc-5678a3529352" (UID: "1b54ed82-07f7-410e-85cc-5678a3529352"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.784082 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-additional-scripts\") pod \"1b54ed82-07f7-410e-85cc-5678a3529352\" (UID: \"1b54ed82-07f7-410e-85cc-5678a3529352\") " Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.784611 5033 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.784720 5033 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.784798 5033 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1b54ed82-07f7-410e-85cc-5678a3529352-var-run\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.784821 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-scripts" (OuterVolumeSpecName: "scripts") pod "1b54ed82-07f7-410e-85cc-5678a3529352" (UID: "1b54ed82-07f7-410e-85cc-5678a3529352"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.785292 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1b54ed82-07f7-410e-85cc-5678a3529352" (UID: "1b54ed82-07f7-410e-85cc-5678a3529352"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.788369 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b54ed82-07f7-410e-85cc-5678a3529352-kube-api-access-gcqsl" (OuterVolumeSpecName: "kube-api-access-gcqsl") pod "1b54ed82-07f7-410e-85cc-5678a3529352" (UID: "1b54ed82-07f7-410e-85cc-5678a3529352"). InnerVolumeSpecName "kube-api-access-gcqsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.886769 5033 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.886809 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcqsl\" (UniqueName: \"kubernetes.io/projected/1b54ed82-07f7-410e-85cc-5678a3529352-kube-api-access-gcqsl\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:49 crc kubenswrapper[5033]: I0226 17:21:49.886822 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b54ed82-07f7-410e-85cc-5678a3529352-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:21:50 crc kubenswrapper[5033]: I0226 17:21:50.225930 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kgzlq-config-569gn" event={"ID":"1b54ed82-07f7-410e-85cc-5678a3529352","Type":"ContainerDied","Data":"952f8952a5b6acdb8470d4d16fc2cd9ef9a0846c2dd95fbbedac74c672dd6016"} Feb 26 17:21:50 crc kubenswrapper[5033]: I0226 17:21:50.225972 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952f8952a5b6acdb8470d4d16fc2cd9ef9a0846c2dd95fbbedac74c672dd6016" Feb 26 17:21:50 crc kubenswrapper[5033]: I0226 17:21:50.226037 5033 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kgzlq-config-569gn" Feb 26 17:21:50 crc kubenswrapper[5033]: I0226 17:21:50.761047 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kgzlq-config-569gn"] Feb 26 17:21:50 crc kubenswrapper[5033]: I0226 17:21:50.775516 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kgzlq-config-569gn"] Feb 26 17:21:51 crc kubenswrapper[5033]: I0226 17:21:51.637647 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b54ed82-07f7-410e-85cc-5678a3529352" path="/var/lib/kubelet/pods/1b54ed82-07f7-410e-85cc-5678a3529352/volumes" Feb 26 17:21:53 crc kubenswrapper[5033]: I0226 17:21:53.258671 5033 generic.go:334] "Generic (PLEG): container finished" podID="2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4" containerID="2c524583c730e3bece1c15b82bd2754da61ad95ec58d0de3153e3c601a11a4bf" exitCode=0 Feb 26 17:21:53 crc kubenswrapper[5033]: I0226 17:21:53.258767 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-b4445f8b9-64k79" event={"ID":"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4","Type":"ContainerDied","Data":"2c524583c730e3bece1c15b82bd2754da61ad95ec58d0de3153e3c601a11a4bf"} Feb 26 17:21:53 crc kubenswrapper[5033]: I0226 17:21:53.634623 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:21:54 crc kubenswrapper[5033]: I0226 17:21:54.269697 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-b4445f8b9-64k79" event={"ID":"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4","Type":"ContainerStarted","Data":"9fe1749dfb9b35943e54fad2f37113fe98328a108273d84e30f378ea76cb33cc"} Feb 26 17:21:54 crc kubenswrapper[5033]: I0226 17:21:54.269975 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-b4445f8b9-64k79" 
event={"ID":"2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4","Type":"ContainerStarted","Data":"fa38eb5d9c9f2b4bc8a93f4a609db0275ce1c10df88da0cfb150eedae07fa156"} Feb 26 17:21:54 crc kubenswrapper[5033]: I0226 17:21:54.270117 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-b4445f8b9-64k79" Feb 26 17:21:54 crc kubenswrapper[5033]: I0226 17:21:54.270155 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-b4445f8b9-64k79" Feb 26 17:21:54 crc kubenswrapper[5033]: I0226 17:21:54.272203 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"33064e87c19129e8c72344901e1129ded177728fee58532d134814d33183f904"} Feb 26 17:21:54 crc kubenswrapper[5033]: I0226 17:21:54.295949 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-b4445f8b9-64k79" podStartSLOduration=4.083727197 podStartE2EDuration="15.295928035s" podCreationTimestamp="2026-02-26 17:21:39 +0000 UTC" firstStartedPulling="2026-02-26 17:21:40.881351386 +0000 UTC m=+6028.093950046" lastFinishedPulling="2026-02-26 17:21:52.093552244 +0000 UTC m=+6039.306150884" observedRunningTime="2026-02-26 17:21:54.293684849 +0000 UTC m=+6041.506283499" watchObservedRunningTime="2026-02-26 17:21:54.295928035 +0000 UTC m=+6041.508526685" Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.135597 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535442-2bk4l"] Feb 26 17:22:00 crc kubenswrapper[5033]: E0226 17:22:00.145306 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b54ed82-07f7-410e-85cc-5678a3529352" containerName="ovn-config" Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.145343 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b54ed82-07f7-410e-85cc-5678a3529352" 
containerName="ovn-config" Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.145845 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b54ed82-07f7-410e-85cc-5678a3529352" containerName="ovn-config" Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.147136 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535442-2bk4l" Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.149199 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.150471 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.151410 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.153885 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535442-2bk4l"] Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.308868 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf2qj\" (UniqueName: \"kubernetes.io/projected/aefb3e10-7132-4716-abc2-e0f3eeea1d72-kube-api-access-tf2qj\") pod \"auto-csr-approver-29535442-2bk4l\" (UID: \"aefb3e10-7132-4716-abc2-e0f3eeea1d72\") " pod="openshift-infra/auto-csr-approver-29535442-2bk4l" Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.411579 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf2qj\" (UniqueName: \"kubernetes.io/projected/aefb3e10-7132-4716-abc2-e0f3eeea1d72-kube-api-access-tf2qj\") pod \"auto-csr-approver-29535442-2bk4l\" (UID: \"aefb3e10-7132-4716-abc2-e0f3eeea1d72\") " pod="openshift-infra/auto-csr-approver-29535442-2bk4l" Feb 26 17:22:00 crc 
kubenswrapper[5033]: I0226 17:22:00.447526 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf2qj\" (UniqueName: \"kubernetes.io/projected/aefb3e10-7132-4716-abc2-e0f3eeea1d72-kube-api-access-tf2qj\") pod \"auto-csr-approver-29535442-2bk4l\" (UID: \"aefb3e10-7132-4716-abc2-e0f3eeea1d72\") " pod="openshift-infra/auto-csr-approver-29535442-2bk4l" Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.468905 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535442-2bk4l" Feb 26 17:22:00 crc kubenswrapper[5033]: I0226 17:22:00.975160 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535442-2bk4l"] Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.199487 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-x6rfk"] Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.207737 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.209628 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.209965 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.211179 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-x6rfk"] Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.211349 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.332991 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/15d82b70-9bd3-4ae4-a20c-248c1b50264b-hm-ports\") pod \"octavia-rsyslog-x6rfk\" (UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.333046 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/15d82b70-9bd3-4ae4-a20c-248c1b50264b-config-data-merged\") pod \"octavia-rsyslog-x6rfk\" (UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.333075 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d82b70-9bd3-4ae4-a20c-248c1b50264b-scripts\") pod \"octavia-rsyslog-x6rfk\" (UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.333142 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d82b70-9bd3-4ae4-a20c-248c1b50264b-config-data\") pod \"octavia-rsyslog-x6rfk\" (UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.359133 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535442-2bk4l" event={"ID":"aefb3e10-7132-4716-abc2-e0f3eeea1d72","Type":"ContainerStarted","Data":"24537ffa5809bea8c76dd25c1dd57cde4d013b7b7f63cd5333bc181e7377c9ed"} Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.434720 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/15d82b70-9bd3-4ae4-a20c-248c1b50264b-hm-ports\") pod \"octavia-rsyslog-x6rfk\" (UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.434827 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/15d82b70-9bd3-4ae4-a20c-248c1b50264b-config-data-merged\") pod \"octavia-rsyslog-x6rfk\" (UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.434879 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d82b70-9bd3-4ae4-a20c-248c1b50264b-scripts\") pod \"octavia-rsyslog-x6rfk\" (UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.435007 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d82b70-9bd3-4ae4-a20c-248c1b50264b-config-data\") pod \"octavia-rsyslog-x6rfk\" 
(UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.435519 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/15d82b70-9bd3-4ae4-a20c-248c1b50264b-config-data-merged\") pod \"octavia-rsyslog-x6rfk\" (UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.436122 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/15d82b70-9bd3-4ae4-a20c-248c1b50264b-hm-ports\") pod \"octavia-rsyslog-x6rfk\" (UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.443660 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d82b70-9bd3-4ae4-a20c-248c1b50264b-config-data\") pod \"octavia-rsyslog-x6rfk\" (UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.450337 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d82b70-9bd3-4ae4-a20c-248c1b50264b-scripts\") pod \"octavia-rsyslog-x6rfk\" (UID: \"15d82b70-9bd3-4ae4-a20c-248c1b50264b\") " pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.530418 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.794448 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-q95pd"] Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.804479 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-q95pd" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.820652 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.824402 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-q95pd"] Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.951490 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/068064de-3b70-46f6-b777-0faffe4c7ef5-amphora-image\") pod \"octavia-image-upload-59f8cff499-q95pd\" (UID: \"068064de-3b70-46f6-b777-0faffe4c7ef5\") " pod="openstack/octavia-image-upload-59f8cff499-q95pd" Feb 26 17:22:01 crc kubenswrapper[5033]: I0226 17:22:01.951556 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/068064de-3b70-46f6-b777-0faffe4c7ef5-httpd-config\") pod \"octavia-image-upload-59f8cff499-q95pd\" (UID: \"068064de-3b70-46f6-b777-0faffe4c7ef5\") " pod="openstack/octavia-image-upload-59f8cff499-q95pd" Feb 26 17:22:02 crc kubenswrapper[5033]: I0226 17:22:02.052994 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/068064de-3b70-46f6-b777-0faffe4c7ef5-amphora-image\") pod \"octavia-image-upload-59f8cff499-q95pd\" (UID: \"068064de-3b70-46f6-b777-0faffe4c7ef5\") " pod="openstack/octavia-image-upload-59f8cff499-q95pd" Feb 26 17:22:02 crc kubenswrapper[5033]: I0226 17:22:02.053052 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/068064de-3b70-46f6-b777-0faffe4c7ef5-httpd-config\") pod \"octavia-image-upload-59f8cff499-q95pd\" (UID: 
\"068064de-3b70-46f6-b777-0faffe4c7ef5\") " pod="openstack/octavia-image-upload-59f8cff499-q95pd" Feb 26 17:22:02 crc kubenswrapper[5033]: I0226 17:22:02.053701 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/068064de-3b70-46f6-b777-0faffe4c7ef5-amphora-image\") pod \"octavia-image-upload-59f8cff499-q95pd\" (UID: \"068064de-3b70-46f6-b777-0faffe4c7ef5\") " pod="openstack/octavia-image-upload-59f8cff499-q95pd" Feb 26 17:22:02 crc kubenswrapper[5033]: I0226 17:22:02.058836 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/068064de-3b70-46f6-b777-0faffe4c7ef5-httpd-config\") pod \"octavia-image-upload-59f8cff499-q95pd\" (UID: \"068064de-3b70-46f6-b777-0faffe4c7ef5\") " pod="openstack/octavia-image-upload-59f8cff499-q95pd" Feb 26 17:22:02 crc kubenswrapper[5033]: W0226 17:22:02.079691 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15d82b70_9bd3_4ae4_a20c_248c1b50264b.slice/crio-a9e3ee3adc053b74fe979bd651316e920d31b45941fe2ae0fd47c04a27d8f2cd WatchSource:0}: Error finding container a9e3ee3adc053b74fe979bd651316e920d31b45941fe2ae0fd47c04a27d8f2cd: Status 404 returned error can't find the container with id a9e3ee3adc053b74fe979bd651316e920d31b45941fe2ae0fd47c04a27d8f2cd Feb 26 17:22:02 crc kubenswrapper[5033]: I0226 17:22:02.080077 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-x6rfk"] Feb 26 17:22:02 crc kubenswrapper[5033]: I0226 17:22:02.144213 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-q95pd" Feb 26 17:22:02 crc kubenswrapper[5033]: I0226 17:22:02.333342 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-x6rfk"] Feb 26 17:22:02 crc kubenswrapper[5033]: I0226 17:22:02.374011 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535442-2bk4l" event={"ID":"aefb3e10-7132-4716-abc2-e0f3eeea1d72","Type":"ContainerStarted","Data":"4aa652ddaad40091a4a4b6543cdf3573656b96dae8216ddf5fcbf3345ad36c4a"} Feb 26 17:22:02 crc kubenswrapper[5033]: I0226 17:22:02.385306 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-x6rfk" event={"ID":"15d82b70-9bd3-4ae4-a20c-248c1b50264b","Type":"ContainerStarted","Data":"a9e3ee3adc053b74fe979bd651316e920d31b45941fe2ae0fd47c04a27d8f2cd"} Feb 26 17:22:02 crc kubenswrapper[5033]: I0226 17:22:02.393656 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535442-2bk4l" podStartSLOduration=1.391250987 podStartE2EDuration="2.393635737s" podCreationTimestamp="2026-02-26 17:22:00 +0000 UTC" firstStartedPulling="2026-02-26 17:22:00.979558702 +0000 UTC m=+6048.192157362" lastFinishedPulling="2026-02-26 17:22:01.981943462 +0000 UTC m=+6049.194542112" observedRunningTime="2026-02-26 17:22:02.391076035 +0000 UTC m=+6049.603674675" watchObservedRunningTime="2026-02-26 17:22:02.393635737 +0000 UTC m=+6049.606234387" Feb 26 17:22:02 crc kubenswrapper[5033]: I0226 17:22:02.648994 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-q95pd"] Feb 26 17:22:02 crc kubenswrapper[5033]: W0226 17:22:02.650641 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod068064de_3b70_46f6_b777_0faffe4c7ef5.slice/crio-0edfa8e3d4e2d99f9a5f188801fcebbda31130a60233d6bceda697780e89dd66 
WatchSource:0}: Error finding container 0edfa8e3d4e2d99f9a5f188801fcebbda31130a60233d6bceda697780e89dd66: Status 404 returned error can't find the container with id 0edfa8e3d4e2d99f9a5f188801fcebbda31130a60233d6bceda697780e89dd66 Feb 26 17:22:03 crc kubenswrapper[5033]: I0226 17:22:03.406843 5033 generic.go:334] "Generic (PLEG): container finished" podID="aefb3e10-7132-4716-abc2-e0f3eeea1d72" containerID="4aa652ddaad40091a4a4b6543cdf3573656b96dae8216ddf5fcbf3345ad36c4a" exitCode=0 Feb 26 17:22:03 crc kubenswrapper[5033]: I0226 17:22:03.406902 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535442-2bk4l" event={"ID":"aefb3e10-7132-4716-abc2-e0f3eeea1d72","Type":"ContainerDied","Data":"4aa652ddaad40091a4a4b6543cdf3573656b96dae8216ddf5fcbf3345ad36c4a"} Feb 26 17:22:03 crc kubenswrapper[5033]: I0226 17:22:03.408986 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-q95pd" event={"ID":"068064de-3b70-46f6-b777-0faffe4c7ef5","Type":"ContainerStarted","Data":"0edfa8e3d4e2d99f9a5f188801fcebbda31130a60233d6bceda697780e89dd66"} Feb 26 17:22:04 crc kubenswrapper[5033]: I0226 17:22:04.419389 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-x6rfk" event={"ID":"15d82b70-9bd3-4ae4-a20c-248c1b50264b","Type":"ContainerStarted","Data":"0e3076188f750ffd71fe657cbcb86a37039d2463750c2478c7bd487181ca698a"} Feb 26 17:22:05 crc kubenswrapper[5033]: I0226 17:22:05.123890 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535442-2bk4l" Feb 26 17:22:05 crc kubenswrapper[5033]: I0226 17:22:05.262719 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf2qj\" (UniqueName: \"kubernetes.io/projected/aefb3e10-7132-4716-abc2-e0f3eeea1d72-kube-api-access-tf2qj\") pod \"aefb3e10-7132-4716-abc2-e0f3eeea1d72\" (UID: \"aefb3e10-7132-4716-abc2-e0f3eeea1d72\") " Feb 26 17:22:05 crc kubenswrapper[5033]: I0226 17:22:05.274548 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aefb3e10-7132-4716-abc2-e0f3eeea1d72-kube-api-access-tf2qj" (OuterVolumeSpecName: "kube-api-access-tf2qj") pod "aefb3e10-7132-4716-abc2-e0f3eeea1d72" (UID: "aefb3e10-7132-4716-abc2-e0f3eeea1d72"). InnerVolumeSpecName "kube-api-access-tf2qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:22:05 crc kubenswrapper[5033]: I0226 17:22:05.365220 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf2qj\" (UniqueName: \"kubernetes.io/projected/aefb3e10-7132-4716-abc2-e0f3eeea1d72-kube-api-access-tf2qj\") on node \"crc\" DevicePath \"\"" Feb 26 17:22:05 crc kubenswrapper[5033]: I0226 17:22:05.435341 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535442-2bk4l" event={"ID":"aefb3e10-7132-4716-abc2-e0f3eeea1d72","Type":"ContainerDied","Data":"24537ffa5809bea8c76dd25c1dd57cde4d013b7b7f63cd5333bc181e7377c9ed"} Feb 26 17:22:05 crc kubenswrapper[5033]: I0226 17:22:05.435391 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24537ffa5809bea8c76dd25c1dd57cde4d013b7b7f63cd5333bc181e7377c9ed" Feb 26 17:22:05 crc kubenswrapper[5033]: I0226 17:22:05.435484 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535442-2bk4l" Feb 26 17:22:05 crc kubenswrapper[5033]: I0226 17:22:05.458395 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535436-r62zf"] Feb 26 17:22:05 crc kubenswrapper[5033]: I0226 17:22:05.493932 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535436-r62zf"] Feb 26 17:22:05 crc kubenswrapper[5033]: I0226 17:22:05.643080 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15c4daae-b1b6-48ff-be61-eb705aeb62f1" path="/var/lib/kubelet/pods/15c4daae-b1b6-48ff-be61-eb705aeb62f1/volumes" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.468820 5033 generic.go:334] "Generic (PLEG): container finished" podID="15d82b70-9bd3-4ae4-a20c-248c1b50264b" containerID="0e3076188f750ffd71fe657cbcb86a37039d2463750c2478c7bd487181ca698a" exitCode=0 Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.468904 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-x6rfk" event={"ID":"15d82b70-9bd3-4ae4-a20c-248c1b50264b","Type":"ContainerDied","Data":"0e3076188f750ffd71fe657cbcb86a37039d2463750c2478c7bd487181ca698a"} Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.762853 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-qhr4l"] Feb 26 17:22:08 crc kubenswrapper[5033]: E0226 17:22:08.763544 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aefb3e10-7132-4716-abc2-e0f3eeea1d72" containerName="oc" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.763562 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefb3e10-7132-4716-abc2-e0f3eeea1d72" containerName="oc" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.763735 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="aefb3e10-7132-4716-abc2-e0f3eeea1d72" containerName="oc" Feb 26 17:22:08 crc kubenswrapper[5033]: 
I0226 17:22:08.764689 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.768413 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.806863 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-qhr4l"] Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.857279 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data-merged\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.857418 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-scripts\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.857487 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-combined-ca-bundle\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.857537 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " 
pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.958802 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-scripts\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.958874 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-combined-ca-bundle\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.958937 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.959033 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data-merged\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.959767 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data-merged\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.967306 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.967994 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-combined-ca-bundle\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:08 crc kubenswrapper[5033]: I0226 17:22:08.969031 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-scripts\") pod \"octavia-db-sync-qhr4l\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:09 crc kubenswrapper[5033]: I0226 17:22:09.085207 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:14 crc kubenswrapper[5033]: I0226 17:22:14.403180 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-b4445f8b9-64k79" Feb 26 17:22:14 crc kubenswrapper[5033]: I0226 17:22:14.478563 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-b4445f8b9-64k79" Feb 26 17:22:15 crc kubenswrapper[5033]: I0226 17:22:15.945283 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-qhr4l"] Feb 26 17:22:16 crc kubenswrapper[5033]: I0226 17:22:16.575185 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-x6rfk" event={"ID":"15d82b70-9bd3-4ae4-a20c-248c1b50264b","Type":"ContainerStarted","Data":"8e93a51a2bd45692d76d141577dcebbb5d12891278a92a7f253b9d86dbd7e6ab"} Feb 26 17:22:16 crc kubenswrapper[5033]: I0226 17:22:16.575704 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:16 crc kubenswrapper[5033]: I0226 17:22:16.577579 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-q95pd" event={"ID":"068064de-3b70-46f6-b777-0faffe4c7ef5","Type":"ContainerStarted","Data":"4ea229e2f2e1328d783461467b17426b09bd8d4ebb4d9302ecc3ec0ae35a3cf6"} Feb 26 17:22:16 crc kubenswrapper[5033]: I0226 17:22:16.580241 5033 generic.go:334] "Generic (PLEG): container finished" podID="f54be881-caca-4fe5-ba60-88b0c159d47e" containerID="84c35f456f4001417f80fbd7e7b9e7cd21437e985ce94aee13b5762476da2342" exitCode=0 Feb 26 17:22:16 crc kubenswrapper[5033]: I0226 17:22:16.580306 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-qhr4l" event={"ID":"f54be881-caca-4fe5-ba60-88b0c159d47e","Type":"ContainerDied","Data":"84c35f456f4001417f80fbd7e7b9e7cd21437e985ce94aee13b5762476da2342"} Feb 26 17:22:16 crc kubenswrapper[5033]: I0226 
17:22:16.580356 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-qhr4l" event={"ID":"f54be881-caca-4fe5-ba60-88b0c159d47e","Type":"ContainerStarted","Data":"04614ebdb896acc09b8f2ddc96b2dc3f2b472c2ed7b75a7dc8e1e958c9d0acdd"} Feb 26 17:22:16 crc kubenswrapper[5033]: I0226 17:22:16.603947 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-x6rfk" podStartSLOduration=1.4625548990000001 podStartE2EDuration="15.603926364s" podCreationTimestamp="2026-02-26 17:22:01 +0000 UTC" firstStartedPulling="2026-02-26 17:22:02.083204502 +0000 UTC m=+6049.295803152" lastFinishedPulling="2026-02-26 17:22:16.224575967 +0000 UTC m=+6063.437174617" observedRunningTime="2026-02-26 17:22:16.597735723 +0000 UTC m=+6063.810334373" watchObservedRunningTime="2026-02-26 17:22:16.603926364 +0000 UTC m=+6063.816525014" Feb 26 17:22:17 crc kubenswrapper[5033]: I0226 17:22:17.594586 5033 generic.go:334] "Generic (PLEG): container finished" podID="068064de-3b70-46f6-b777-0faffe4c7ef5" containerID="4ea229e2f2e1328d783461467b17426b09bd8d4ebb4d9302ecc3ec0ae35a3cf6" exitCode=0 Feb 26 17:22:17 crc kubenswrapper[5033]: I0226 17:22:17.595081 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-q95pd" event={"ID":"068064de-3b70-46f6-b777-0faffe4c7ef5","Type":"ContainerDied","Data":"4ea229e2f2e1328d783461467b17426b09bd8d4ebb4d9302ecc3ec0ae35a3cf6"} Feb 26 17:22:17 crc kubenswrapper[5033]: I0226 17:22:17.605419 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-qhr4l" event={"ID":"f54be881-caca-4fe5-ba60-88b0c159d47e","Type":"ContainerStarted","Data":"10ca6f5a093084dcced6dcf596d7369be40ac7b32d8c944f3cbc327ca04bc3bd"} Feb 26 17:22:17 crc kubenswrapper[5033]: I0226 17:22:17.654040 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-qhr4l" podStartSLOduration=9.654013847 
podStartE2EDuration="9.654013847s" podCreationTimestamp="2026-02-26 17:22:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:22:17.646784701 +0000 UTC m=+6064.859383391" watchObservedRunningTime="2026-02-26 17:22:17.654013847 +0000 UTC m=+6064.866612527" Feb 26 17:22:21 crc kubenswrapper[5033]: I0226 17:22:21.648436 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-q95pd" event={"ID":"068064de-3b70-46f6-b777-0faffe4c7ef5","Type":"ContainerStarted","Data":"8e585dad773f9f351fc709530c7653677c9fd06bf30877f3deb5e09730ef677b"} Feb 26 17:22:21 crc kubenswrapper[5033]: I0226 17:22:21.671237 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-q95pd" podStartSLOduration=2.028003947 podStartE2EDuration="20.671215742s" podCreationTimestamp="2026-02-26 17:22:01 +0000 UTC" firstStartedPulling="2026-02-26 17:22:02.652953825 +0000 UTC m=+6049.865552475" lastFinishedPulling="2026-02-26 17:22:21.29616558 +0000 UTC m=+6068.508764270" observedRunningTime="2026-02-26 17:22:21.667964243 +0000 UTC m=+6068.880562893" watchObservedRunningTime="2026-02-26 17:22:21.671215742 +0000 UTC m=+6068.883814392" Feb 26 17:22:23 crc kubenswrapper[5033]: I0226 17:22:23.677644 5033 generic.go:334] "Generic (PLEG): container finished" podID="f54be881-caca-4fe5-ba60-88b0c159d47e" containerID="10ca6f5a093084dcced6dcf596d7369be40ac7b32d8c944f3cbc327ca04bc3bd" exitCode=0 Feb 26 17:22:23 crc kubenswrapper[5033]: I0226 17:22:23.677760 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-qhr4l" event={"ID":"f54be881-caca-4fe5-ba60-88b0c159d47e","Type":"ContainerDied","Data":"10ca6f5a093084dcced6dcf596d7369be40ac7b32d8c944f3cbc327ca04bc3bd"} Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.146648 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.319467 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-combined-ca-bundle\") pod \"f54be881-caca-4fe5-ba60-88b0c159d47e\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.319538 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data\") pod \"f54be881-caca-4fe5-ba60-88b0c159d47e\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.319643 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-scripts\") pod \"f54be881-caca-4fe5-ba60-88b0c159d47e\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.319755 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data-merged\") pod \"f54be881-caca-4fe5-ba60-88b0c159d47e\" (UID: \"f54be881-caca-4fe5-ba60-88b0c159d47e\") " Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.329446 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-scripts" (OuterVolumeSpecName: "scripts") pod "f54be881-caca-4fe5-ba60-88b0c159d47e" (UID: "f54be881-caca-4fe5-ba60-88b0c159d47e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.329498 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data" (OuterVolumeSpecName: "config-data") pod "f54be881-caca-4fe5-ba60-88b0c159d47e" (UID: "f54be881-caca-4fe5-ba60-88b0c159d47e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.367559 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "f54be881-caca-4fe5-ba60-88b0c159d47e" (UID: "f54be881-caca-4fe5-ba60-88b0c159d47e"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.376601 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f54be881-caca-4fe5-ba60-88b0c159d47e" (UID: "f54be881-caca-4fe5-ba60-88b0c159d47e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.422830 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.422881 5033 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data-merged\") on node \"crc\" DevicePath \"\"" Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.422906 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.423357 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54be881-caca-4fe5-ba60-88b0c159d47e-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.705795 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-qhr4l" event={"ID":"f54be881-caca-4fe5-ba60-88b0c159d47e","Type":"ContainerDied","Data":"04614ebdb896acc09b8f2ddc96b2dc3f2b472c2ed7b75a7dc8e1e958c9d0acdd"} Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.705855 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04614ebdb896acc09b8f2ddc96b2dc3f2b472c2ed7b75a7dc8e1e958c9d0acdd" Feb 26 17:22:25 crc kubenswrapper[5033]: I0226 17:22:25.705869 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-qhr4l" Feb 26 17:22:31 crc kubenswrapper[5033]: I0226 17:22:31.577461 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-x6rfk" Feb 26 17:22:35 crc kubenswrapper[5033]: I0226 17:22:35.293593 5033 scope.go:117] "RemoveContainer" containerID="4364bfb19d0b1387b2a6164f83a73a66153cfcaeab44a1769fd0a2e04bd1e79f" Feb 26 17:22:35 crc kubenswrapper[5033]: I0226 17:22:35.615229 5033 scope.go:117] "RemoveContainer" containerID="9eb978d195a8efe9c8bc667da43718d291d885177c9aab013f49e4fbc1a4c8e5" Feb 26 17:22:35 crc kubenswrapper[5033]: I0226 17:22:35.701714 5033 scope.go:117] "RemoveContainer" containerID="507f536d72f6cbddd2b0f79605b953393afc6bd3104284c25d7b6a13bb616ea4" Feb 26 17:22:52 crc kubenswrapper[5033]: I0226 17:22:52.626226 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-q95pd"] Feb 26 17:22:52 crc kubenswrapper[5033]: I0226 17:22:52.627089 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-q95pd" podUID="068064de-3b70-46f6-b777-0faffe4c7ef5" containerName="octavia-amphora-httpd" containerID="cri-o://8e585dad773f9f351fc709530c7653677c9fd06bf30877f3deb5e09730ef677b" gracePeriod=30 Feb 26 17:22:52 crc kubenswrapper[5033]: I0226 17:22:52.994616 5033 generic.go:334] "Generic (PLEG): container finished" podID="068064de-3b70-46f6-b777-0faffe4c7ef5" containerID="8e585dad773f9f351fc709530c7653677c9fd06bf30877f3deb5e09730ef677b" exitCode=0 Feb 26 17:22:52 crc kubenswrapper[5033]: I0226 17:22:52.994658 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-q95pd" event={"ID":"068064de-3b70-46f6-b777-0faffe4c7ef5","Type":"ContainerDied","Data":"8e585dad773f9f351fc709530c7653677c9fd06bf30877f3deb5e09730ef677b"} Feb 26 17:22:53 crc kubenswrapper[5033]: I0226 17:22:53.147134 5033 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-q95pd" Feb 26 17:22:53 crc kubenswrapper[5033]: I0226 17:22:53.169529 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/068064de-3b70-46f6-b777-0faffe4c7ef5-httpd-config\") pod \"068064de-3b70-46f6-b777-0faffe4c7ef5\" (UID: \"068064de-3b70-46f6-b777-0faffe4c7ef5\") " Feb 26 17:22:53 crc kubenswrapper[5033]: I0226 17:22:53.169674 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/068064de-3b70-46f6-b777-0faffe4c7ef5-amphora-image\") pod \"068064de-3b70-46f6-b777-0faffe4c7ef5\" (UID: \"068064de-3b70-46f6-b777-0faffe4c7ef5\") " Feb 26 17:22:53 crc kubenswrapper[5033]: I0226 17:22:53.210379 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068064de-3b70-46f6-b777-0faffe4c7ef5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "068064de-3b70-46f6-b777-0faffe4c7ef5" (UID: "068064de-3b70-46f6-b777-0faffe4c7ef5"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:22:53 crc kubenswrapper[5033]: I0226 17:22:53.272205 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/068064de-3b70-46f6-b777-0faffe4c7ef5-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:22:53 crc kubenswrapper[5033]: I0226 17:22:53.272513 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/068064de-3b70-46f6-b777-0faffe4c7ef5-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "068064de-3b70-46f6-b777-0faffe4c7ef5" (UID: "068064de-3b70-46f6-b777-0faffe4c7ef5"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:22:53 crc kubenswrapper[5033]: I0226 17:22:53.374479 5033 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/068064de-3b70-46f6-b777-0faffe4c7ef5-amphora-image\") on node \"crc\" DevicePath \"\"" Feb 26 17:22:54 crc kubenswrapper[5033]: I0226 17:22:54.016253 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-q95pd" event={"ID":"068064de-3b70-46f6-b777-0faffe4c7ef5","Type":"ContainerDied","Data":"0edfa8e3d4e2d99f9a5f188801fcebbda31130a60233d6bceda697780e89dd66"} Feb 26 17:22:54 crc kubenswrapper[5033]: I0226 17:22:54.016763 5033 scope.go:117] "RemoveContainer" containerID="8e585dad773f9f351fc709530c7653677c9fd06bf30877f3deb5e09730ef677b" Feb 26 17:22:54 crc kubenswrapper[5033]: I0226 17:22:54.016442 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-q95pd" Feb 26 17:22:54 crc kubenswrapper[5033]: I0226 17:22:54.061148 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-q95pd"] Feb 26 17:22:54 crc kubenswrapper[5033]: I0226 17:22:54.066475 5033 scope.go:117] "RemoveContainer" containerID="4ea229e2f2e1328d783461467b17426b09bd8d4ebb4d9302ecc3ec0ae35a3cf6" Feb 26 17:22:54 crc kubenswrapper[5033]: I0226 17:22:54.072363 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-q95pd"] Feb 26 17:22:55 crc kubenswrapper[5033]: I0226 17:22:55.639499 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="068064de-3b70-46f6-b777-0faffe4c7ef5" path="/var/lib/kubelet/pods/068064de-3b70-46f6-b777-0faffe4c7ef5/volumes" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.176622 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-678t2"] Feb 26 17:22:57 crc 
kubenswrapper[5033]: E0226 17:22:57.177476 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54be881-caca-4fe5-ba60-88b0c159d47e" containerName="octavia-db-sync" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.177495 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54be881-caca-4fe5-ba60-88b0c159d47e" containerName="octavia-db-sync" Feb 26 17:22:57 crc kubenswrapper[5033]: E0226 17:22:57.177504 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068064de-3b70-46f6-b777-0faffe4c7ef5" containerName="octavia-amphora-httpd" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.177513 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="068064de-3b70-46f6-b777-0faffe4c7ef5" containerName="octavia-amphora-httpd" Feb 26 17:22:57 crc kubenswrapper[5033]: E0226 17:22:57.177550 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068064de-3b70-46f6-b777-0faffe4c7ef5" containerName="init" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.177561 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="068064de-3b70-46f6-b777-0faffe4c7ef5" containerName="init" Feb 26 17:22:57 crc kubenswrapper[5033]: E0226 17:22:57.177590 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54be881-caca-4fe5-ba60-88b0c159d47e" containerName="init" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.177598 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54be881-caca-4fe5-ba60-88b0c159d47e" containerName="init" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.177799 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="068064de-3b70-46f6-b777-0faffe4c7ef5" containerName="octavia-amphora-httpd" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.177819 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f54be881-caca-4fe5-ba60-88b0c159d47e" containerName="octavia-db-sync" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 
17:22:57.179752 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-678t2" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.189070 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.193153 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-678t2"] Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.249201 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/86a0b82d-eb74-4b64-984e-2eb565b14a00-amphora-image\") pod \"octavia-image-upload-59f8cff499-678t2\" (UID: \"86a0b82d-eb74-4b64-984e-2eb565b14a00\") " pod="openstack/octavia-image-upload-59f8cff499-678t2" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.249415 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/86a0b82d-eb74-4b64-984e-2eb565b14a00-httpd-config\") pod \"octavia-image-upload-59f8cff499-678t2\" (UID: \"86a0b82d-eb74-4b64-984e-2eb565b14a00\") " pod="openstack/octavia-image-upload-59f8cff499-678t2" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.351480 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/86a0b82d-eb74-4b64-984e-2eb565b14a00-amphora-image\") pod \"octavia-image-upload-59f8cff499-678t2\" (UID: \"86a0b82d-eb74-4b64-984e-2eb565b14a00\") " pod="openstack/octavia-image-upload-59f8cff499-678t2" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.351581 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/86a0b82d-eb74-4b64-984e-2eb565b14a00-httpd-config\") pod 
\"octavia-image-upload-59f8cff499-678t2\" (UID: \"86a0b82d-eb74-4b64-984e-2eb565b14a00\") " pod="openstack/octavia-image-upload-59f8cff499-678t2" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.351939 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/86a0b82d-eb74-4b64-984e-2eb565b14a00-amphora-image\") pod \"octavia-image-upload-59f8cff499-678t2\" (UID: \"86a0b82d-eb74-4b64-984e-2eb565b14a00\") " pod="openstack/octavia-image-upload-59f8cff499-678t2" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.359319 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/86a0b82d-eb74-4b64-984e-2eb565b14a00-httpd-config\") pod \"octavia-image-upload-59f8cff499-678t2\" (UID: \"86a0b82d-eb74-4b64-984e-2eb565b14a00\") " pod="openstack/octavia-image-upload-59f8cff499-678t2" Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.500169 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-678t2" Feb 26 17:22:57 crc kubenswrapper[5033]: W0226 17:22:57.980294 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86a0b82d_eb74_4b64_984e_2eb565b14a00.slice/crio-430750206c71b202cb84496dfca20c824a291a0141784567047b5a86cb5dc60d WatchSource:0}: Error finding container 430750206c71b202cb84496dfca20c824a291a0141784567047b5a86cb5dc60d: Status 404 returned error can't find the container with id 430750206c71b202cb84496dfca20c824a291a0141784567047b5a86cb5dc60d Feb 26 17:22:57 crc kubenswrapper[5033]: I0226 17:22:57.981861 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-678t2"] Feb 26 17:22:58 crc kubenswrapper[5033]: I0226 17:22:58.066888 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-678t2" event={"ID":"86a0b82d-eb74-4b64-984e-2eb565b14a00","Type":"ContainerStarted","Data":"430750206c71b202cb84496dfca20c824a291a0141784567047b5a86cb5dc60d"} Feb 26 17:22:59 crc kubenswrapper[5033]: I0226 17:22:59.085367 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-678t2" event={"ID":"86a0b82d-eb74-4b64-984e-2eb565b14a00","Type":"ContainerStarted","Data":"a4485642b347911d5be1ffd47f1191c0090334b1dd2256d5853252c4498bdba0"} Feb 26 17:23:01 crc kubenswrapper[5033]: I0226 17:23:01.114772 5033 generic.go:334] "Generic (PLEG): container finished" podID="86a0b82d-eb74-4b64-984e-2eb565b14a00" containerID="a4485642b347911d5be1ffd47f1191c0090334b1dd2256d5853252c4498bdba0" exitCode=0 Feb 26 17:23:01 crc kubenswrapper[5033]: I0226 17:23:01.115040 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-678t2" 
event={"ID":"86a0b82d-eb74-4b64-984e-2eb565b14a00","Type":"ContainerDied","Data":"a4485642b347911d5be1ffd47f1191c0090334b1dd2256d5853252c4498bdba0"} Feb 26 17:23:04 crc kubenswrapper[5033]: I0226 17:23:04.146366 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-678t2" event={"ID":"86a0b82d-eb74-4b64-984e-2eb565b14a00","Type":"ContainerStarted","Data":"59cc31d0fc279bf7952f372d30aa92a758a3b5ec8e5e24edac2b263b006cf24e"} Feb 26 17:23:04 crc kubenswrapper[5033]: I0226 17:23:04.183319 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-678t2" podStartSLOduration=1.7714062529999999 podStartE2EDuration="7.18329986s" podCreationTimestamp="2026-02-26 17:22:57 +0000 UTC" firstStartedPulling="2026-02-26 17:22:57.982672167 +0000 UTC m=+6105.195270837" lastFinishedPulling="2026-02-26 17:23:03.394565784 +0000 UTC m=+6110.607164444" observedRunningTime="2026-02-26 17:23:04.162736908 +0000 UTC m=+6111.375335568" watchObservedRunningTime="2026-02-26 17:23:04.18329986 +0000 UTC m=+6111.395898510" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.463657 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-zzzv9"] Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.471320 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.476641 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.478907 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.487496 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.520199 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-zzzv9"] Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.652904 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-combined-ca-bundle\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.653859 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-scripts\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.653997 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-amphora-certs\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 
17:23:06.654132 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-config-data-merged\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.654409 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-config-data\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.654542 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-hm-ports\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.755923 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-amphora-certs\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.755989 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-config-data-merged\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.756035 
5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-config-data\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.756070 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-hm-ports\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.756090 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-combined-ca-bundle\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.756145 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-scripts\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.757757 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-config-data-merged\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.758482 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" 
(UniqueName: \"kubernetes.io/configmap/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-hm-ports\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.763429 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-scripts\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.765783 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-config-data\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.766135 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-amphora-certs\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.766921 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ac0f34-8690-4df6-8960-60cd6a1abf7a-combined-ca-bundle\") pod \"octavia-healthmanager-zzzv9\" (UID: \"d7ac0f34-8690-4df6-8960-60cd6a1abf7a\") " pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:06 crc kubenswrapper[5033]: I0226 17:23:06.803098 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.339899 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-zzzv9"] Feb 26 17:23:07 crc kubenswrapper[5033]: W0226 17:23:07.340725 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7ac0f34_8690_4df6_8960_60cd6a1abf7a.slice/crio-17b9c081098a646ef22313b36a3827abb081f61766e8f98ef849919142c13f58 WatchSource:0}: Error finding container 17b9c081098a646ef22313b36a3827abb081f61766e8f98ef849919142c13f58: Status 404 returned error can't find the container with id 17b9c081098a646ef22313b36a3827abb081f61766e8f98ef849919142c13f58 Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.902934 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-77sps"] Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.905074 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.909714 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.909792 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.921154 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-77sps"] Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.982330 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45026070-8e05-46ff-8dcd-39009ddf82cd-config-data-merged\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.982466 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-config-data\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.982492 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-amphora-certs\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.982569 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: 
\"kubernetes.io/configmap/45026070-8e05-46ff-8dcd-39009ddf82cd-hm-ports\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.982649 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-scripts\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:07 crc kubenswrapper[5033]: I0226 17:23:07.982694 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-combined-ca-bundle\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.083938 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45026070-8e05-46ff-8dcd-39009ddf82cd-config-data-merged\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.084469 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45026070-8e05-46ff-8dcd-39009ddf82cd-config-data-merged\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.084506 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-config-data\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.084552 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-amphora-certs\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.084662 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/45026070-8e05-46ff-8dcd-39009ddf82cd-hm-ports\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.084713 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-scripts\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.084756 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-combined-ca-bundle\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.087059 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/45026070-8e05-46ff-8dcd-39009ddf82cd-hm-ports\") pod \"octavia-housekeeping-77sps\" (UID: 
\"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.092639 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-scripts\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.094477 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-combined-ca-bundle\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.095807 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-config-data\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.102538 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/45026070-8e05-46ff-8dcd-39009ddf82cd-amphora-certs\") pod \"octavia-housekeeping-77sps\" (UID: \"45026070-8e05-46ff-8dcd-39009ddf82cd\") " pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.213725 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-zzzv9" event={"ID":"d7ac0f34-8690-4df6-8960-60cd6a1abf7a","Type":"ContainerStarted","Data":"8442cafd2ca4b51b69b1ec19fbf9f57059ed8ce064ee44dc45db0683058e8ef2"} Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.213777 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/octavia-healthmanager-zzzv9" event={"ID":"d7ac0f34-8690-4df6-8960-60cd6a1abf7a","Type":"ContainerStarted","Data":"17b9c081098a646ef22313b36a3827abb081f61766e8f98ef849919142c13f58"} Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.224422 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:08 crc kubenswrapper[5033]: I0226 17:23:08.837662 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-77sps"] Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.224723 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-77sps" event={"ID":"45026070-8e05-46ff-8dcd-39009ddf82cd","Type":"ContainerStarted","Data":"208213e8052d6f17b06c18adea8f7ae9b9759e4caa09ad788fa62b9c37c5e5cc"} Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.595446 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-dpv6r"] Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.597331 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.601645 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.602567 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.606251 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-dpv6r"] Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.646810 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8064f2ba-5595-44a5-9287-c1fd5d02da0f-hm-ports\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.646977 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-amphora-certs\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.647072 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-combined-ca-bundle\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.647098 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/8064f2ba-5595-44a5-9287-c1fd5d02da0f-config-data-merged\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.647133 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-config-data\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.647340 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-scripts\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.749467 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-amphora-certs\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.749978 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-combined-ca-bundle\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.750011 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8064f2ba-5595-44a5-9287-c1fd5d02da0f-config-data-merged\") pod \"octavia-worker-dpv6r\" 
(UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.750111 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-config-data\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.750206 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-scripts\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.750281 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8064f2ba-5595-44a5-9287-c1fd5d02da0f-hm-ports\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.750607 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8064f2ba-5595-44a5-9287-c1fd5d02da0f-config-data-merged\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.751720 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/8064f2ba-5595-44a5-9287-c1fd5d02da0f-hm-ports\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.756205 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-amphora-certs\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.757044 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-config-data\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.758118 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-scripts\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.772372 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8064f2ba-5595-44a5-9287-c1fd5d02da0f-combined-ca-bundle\") pod \"octavia-worker-dpv6r\" (UID: \"8064f2ba-5595-44a5-9287-c1fd5d02da0f\") " pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:09 crc kubenswrapper[5033]: I0226 17:23:09.930353 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:10 crc kubenswrapper[5033]: I0226 17:23:10.484522 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-dpv6r"] Feb 26 17:23:10 crc kubenswrapper[5033]: W0226 17:23:10.491956 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8064f2ba_5595_44a5_9287_c1fd5d02da0f.slice/crio-7dd683d5adcabfa622deb23f08f9259c1af797fec8d23352711e42625641ba13 WatchSource:0}: Error finding container 7dd683d5adcabfa622deb23f08f9259c1af797fec8d23352711e42625641ba13: Status 404 returned error can't find the container with id 7dd683d5adcabfa622deb23f08f9259c1af797fec8d23352711e42625641ba13 Feb 26 17:23:11 crc kubenswrapper[5033]: I0226 17:23:11.030108 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-zzzv9"] Feb 26 17:23:11 crc kubenswrapper[5033]: I0226 17:23:11.248092 5033 generic.go:334] "Generic (PLEG): container finished" podID="d7ac0f34-8690-4df6-8960-60cd6a1abf7a" containerID="8442cafd2ca4b51b69b1ec19fbf9f57059ed8ce064ee44dc45db0683058e8ef2" exitCode=0 Feb 26 17:23:11 crc kubenswrapper[5033]: I0226 17:23:11.248169 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-zzzv9" event={"ID":"d7ac0f34-8690-4df6-8960-60cd6a1abf7a","Type":"ContainerDied","Data":"8442cafd2ca4b51b69b1ec19fbf9f57059ed8ce064ee44dc45db0683058e8ef2"} Feb 26 17:23:11 crc kubenswrapper[5033]: I0226 17:23:11.261362 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-dpv6r" event={"ID":"8064f2ba-5595-44a5-9287-c1fd5d02da0f","Type":"ContainerStarted","Data":"7dd683d5adcabfa622deb23f08f9259c1af797fec8d23352711e42625641ba13"} Feb 26 17:23:13 crc kubenswrapper[5033]: I0226 17:23:13.282318 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-77sps" 
event={"ID":"45026070-8e05-46ff-8dcd-39009ddf82cd","Type":"ContainerStarted","Data":"be6252c5d023e6608d2e5a486a5fe526253efd198920b00eebc547b941a97d28"} Feb 26 17:23:13 crc kubenswrapper[5033]: I0226 17:23:13.286436 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-zzzv9" event={"ID":"d7ac0f34-8690-4df6-8960-60cd6a1abf7a","Type":"ContainerStarted","Data":"80281bb6fec67c585899665bf54f81c60c9177ce5460397a8e4e92c8ecce1075"} Feb 26 17:23:13 crc kubenswrapper[5033]: I0226 17:23:13.286871 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:13 crc kubenswrapper[5033]: I0226 17:23:13.329716 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-zzzv9" podStartSLOduration=7.329692973 podStartE2EDuration="7.329692973s" podCreationTimestamp="2026-02-26 17:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:23:13.321044051 +0000 UTC m=+6120.533642721" watchObservedRunningTime="2026-02-26 17:23:13.329692973 +0000 UTC m=+6120.542291633" Feb 26 17:23:14 crc kubenswrapper[5033]: I0226 17:23:14.306466 5033 generic.go:334] "Generic (PLEG): container finished" podID="45026070-8e05-46ff-8dcd-39009ddf82cd" containerID="be6252c5d023e6608d2e5a486a5fe526253efd198920b00eebc547b941a97d28" exitCode=0 Feb 26 17:23:14 crc kubenswrapper[5033]: I0226 17:23:14.306747 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-77sps" event={"ID":"45026070-8e05-46ff-8dcd-39009ddf82cd","Type":"ContainerDied","Data":"be6252c5d023e6608d2e5a486a5fe526253efd198920b00eebc547b941a97d28"} Feb 26 17:23:14 crc kubenswrapper[5033]: I0226 17:23:14.313617 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-dpv6r" 
event={"ID":"8064f2ba-5595-44a5-9287-c1fd5d02da0f","Type":"ContainerStarted","Data":"fac3b943a9525f7bed9ad85095a8984640338fba9576e9987341e8097e58e019"} Feb 26 17:23:15 crc kubenswrapper[5033]: I0226 17:23:15.336645 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-77sps" event={"ID":"45026070-8e05-46ff-8dcd-39009ddf82cd","Type":"ContainerStarted","Data":"f3321b9d361622f2ccf77a0a72dd7958b07a9e736ad5441213cffd82cd484b61"} Feb 26 17:23:15 crc kubenswrapper[5033]: I0226 17:23:15.337225 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:15 crc kubenswrapper[5033]: I0226 17:23:15.343237 5033 generic.go:334] "Generic (PLEG): container finished" podID="8064f2ba-5595-44a5-9287-c1fd5d02da0f" containerID="fac3b943a9525f7bed9ad85095a8984640338fba9576e9987341e8097e58e019" exitCode=0 Feb 26 17:23:15 crc kubenswrapper[5033]: I0226 17:23:15.343310 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-dpv6r" event={"ID":"8064f2ba-5595-44a5-9287-c1fd5d02da0f","Type":"ContainerDied","Data":"fac3b943a9525f7bed9ad85095a8984640338fba9576e9987341e8097e58e019"} Feb 26 17:23:15 crc kubenswrapper[5033]: I0226 17:23:15.343371 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-dpv6r" event={"ID":"8064f2ba-5595-44a5-9287-c1fd5d02da0f","Type":"ContainerStarted","Data":"59a9d71bf8c675a50fecdf91314a3e148b63be2fd0d00f6b92defafa03121164"} Feb 26 17:23:15 crc kubenswrapper[5033]: I0226 17:23:15.343534 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:15 crc kubenswrapper[5033]: I0226 17:23:15.366170 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-77sps" podStartSLOduration=5.022020434 podStartE2EDuration="8.366153684s" podCreationTimestamp="2026-02-26 17:23:07 +0000 UTC" 
firstStartedPulling="2026-02-26 17:23:08.832994828 +0000 UTC m=+6116.045593518" lastFinishedPulling="2026-02-26 17:23:12.177128108 +0000 UTC m=+6119.389726768" observedRunningTime="2026-02-26 17:23:15.365471437 +0000 UTC m=+6122.578070087" watchObservedRunningTime="2026-02-26 17:23:15.366153684 +0000 UTC m=+6122.578752334" Feb 26 17:23:15 crc kubenswrapper[5033]: I0226 17:23:15.394192 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-dpv6r" podStartSLOduration=3.889935161 podStartE2EDuration="6.394176488s" podCreationTimestamp="2026-02-26 17:23:09 +0000 UTC" firstStartedPulling="2026-02-26 17:23:10.494357506 +0000 UTC m=+6117.706956156" lastFinishedPulling="2026-02-26 17:23:12.998598813 +0000 UTC m=+6120.211197483" observedRunningTime="2026-02-26 17:23:15.381107379 +0000 UTC m=+6122.593706029" watchObservedRunningTime="2026-02-26 17:23:15.394176488 +0000 UTC m=+6122.606775138" Feb 26 17:23:21 crc kubenswrapper[5033]: I0226 17:23:21.842032 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-zzzv9" Feb 26 17:23:23 crc kubenswrapper[5033]: I0226 17:23:23.266982 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-77sps" Feb 26 17:23:24 crc kubenswrapper[5033]: I0226 17:23:24.989551 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-dpv6r" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.294743 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8c66d76ff-j2q6z"] Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.296875 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.301955 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.301964 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.302164 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-86ksh" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.302431 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.318647 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c66d76ff-j2q6z"] Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.386235 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.386511 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="015270d2-adee-49f0-b82e-694fe6f99a02" containerName="glance-log" containerID="cri-o://48829ba27463616e92bfb0c2f4536c0bf955bd61a3b84796c2b4335d01023841" gracePeriod=30 Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.387015 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="015270d2-adee-49f0-b82e-694fe6f99a02" containerName="glance-httpd" containerID="cri-o://4b8827271304642b8b65add542278af58cfa41d160b1f880cf6058c58ad6ab2a" gracePeriod=30 Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.449753 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frmhv\" (UniqueName: 
\"kubernetes.io/projected/25148f5d-0145-4162-980d-d90483f145ca-kube-api-access-frmhv\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.449828 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25148f5d-0145-4162-980d-d90483f145ca-logs\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.449899 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-scripts\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.449963 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-config-data\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.450133 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/25148f5d-0145-4162-980d-d90483f145ca-horizon-secret-key\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.513313 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.513523 5033 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ef6f86d8-6fcf-41cf-b941-975937a1107f" containerName="glance-log" containerID="cri-o://3394bc8b51e86c92d9183a7166dc1a9e222a4f0e3d8239a1428859f05de2973d" gracePeriod=30 Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.513715 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ef6f86d8-6fcf-41cf-b941-975937a1107f" containerName="glance-httpd" containerID="cri-o://ede5c075a69ef5eafd8a11e1d92ef436d277ecd12d2dec6c885417a9024961db" gracePeriod=30 Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.551454 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frmhv\" (UniqueName: \"kubernetes.io/projected/25148f5d-0145-4162-980d-d90483f145ca-kube-api-access-frmhv\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.551803 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25148f5d-0145-4162-980d-d90483f145ca-logs\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.551858 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-scripts\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.551884 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-config-data\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.552004 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/25148f5d-0145-4162-980d-d90483f145ca-horizon-secret-key\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.553799 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25148f5d-0145-4162-980d-d90483f145ca-logs\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.559218 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-scripts\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.564914 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/25148f5d-0145-4162-980d-d90483f145ca-horizon-secret-key\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.569813 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d8ff98ff9-bffm4"] Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.575389 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-config-data\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.577577 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.587839 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frmhv\" (UniqueName: \"kubernetes.io/projected/25148f5d-0145-4162-980d-d90483f145ca-kube-api-access-frmhv\") pod \"horizon-8c66d76ff-j2q6z\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") " pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.620506 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d8ff98ff9-bffm4"] Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.644161 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.653955 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-config-data\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.654018 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85ztp\" (UniqueName: \"kubernetes.io/projected/9ba2eba4-ba35-4045-9f1b-2315542c42f5-kube-api-access-85ztp\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.654059 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-scripts\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.654077 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ba2eba4-ba35-4045-9f1b-2315542c42f5-horizon-secret-key\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.654211 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba2eba4-ba35-4045-9f1b-2315542c42f5-logs\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: 
\"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.759272 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85ztp\" (UniqueName: \"kubernetes.io/projected/9ba2eba4-ba35-4045-9f1b-2315542c42f5-kube-api-access-85ztp\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.759593 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-scripts\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.759693 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ba2eba4-ba35-4045-9f1b-2315542c42f5-horizon-secret-key\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.760016 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba2eba4-ba35-4045-9f1b-2315542c42f5-logs\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.760169 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-config-data\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc 
kubenswrapper[5033]: I0226 17:23:33.762143 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-scripts\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.762807 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba2eba4-ba35-4045-9f1b-2315542c42f5-logs\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.763487 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-config-data\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.781202 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85ztp\" (UniqueName: \"kubernetes.io/projected/9ba2eba4-ba35-4045-9f1b-2315542c42f5-kube-api-access-85ztp\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.794024 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ba2eba4-ba35-4045-9f1b-2315542c42f5-horizon-secret-key\") pod \"horizon-7d8ff98ff9-bffm4\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.967470 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d8ff98ff9-bffm4"] Feb 26 17:23:33 
crc kubenswrapper[5033]: I0226 17:23:33.968289 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:33 crc kubenswrapper[5033]: I0226 17:23:33.999890 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-96bfb8b67-v6fd5"] Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.002037 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.034042 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-96bfb8b67-v6fd5"] Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.078404 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfft7\" (UniqueName: \"kubernetes.io/projected/54334857-01f6-40cc-afbe-6435181a45da-kube-api-access-kfft7\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.078546 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-scripts\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.078950 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54334857-01f6-40cc-afbe-6435181a45da-logs\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.079043 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54334857-01f6-40cc-afbe-6435181a45da-horizon-secret-key\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.079389 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-config-data\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.181394 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-config-data\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.181486 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfft7\" (UniqueName: \"kubernetes.io/projected/54334857-01f6-40cc-afbe-6435181a45da-kube-api-access-kfft7\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.181548 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-scripts\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.181591 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54334857-01f6-40cc-afbe-6435181a45da-logs\") 
pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.181650 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54334857-01f6-40cc-afbe-6435181a45da-horizon-secret-key\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.182277 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54334857-01f6-40cc-afbe-6435181a45da-logs\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.182847 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-config-data\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.184251 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-scripts\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.204248 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54334857-01f6-40cc-afbe-6435181a45da-horizon-secret-key\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: 
I0226 17:23:34.205385 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfft7\" (UniqueName: \"kubernetes.io/projected/54334857-01f6-40cc-afbe-6435181a45da-kube-api-access-kfft7\") pod \"horizon-96bfb8b67-v6fd5\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.249955 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c66d76ff-j2q6z"] Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.335237 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.489062 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d8ff98ff9-bffm4"] Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.577893 5033 generic.go:334] "Generic (PLEG): container finished" podID="015270d2-adee-49f0-b82e-694fe6f99a02" containerID="48829ba27463616e92bfb0c2f4536c0bf955bd61a3b84796c2b4335d01023841" exitCode=143 Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.577970 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"015270d2-adee-49f0-b82e-694fe6f99a02","Type":"ContainerDied","Data":"48829ba27463616e92bfb0c2f4536c0bf955bd61a3b84796c2b4335d01023841"} Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.579353 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c66d76ff-j2q6z" event={"ID":"25148f5d-0145-4162-980d-d90483f145ca","Type":"ContainerStarted","Data":"ea8ca6baa99c8efc37ce40060d356e3d8bb082107bb26995d32e2a8e79751179"} Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.580605 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d8ff98ff9-bffm4" 
event={"ID":"9ba2eba4-ba35-4045-9f1b-2315542c42f5","Type":"ContainerStarted","Data":"0018c4ac26f02479a7241d2a27525cc606427a97d5cb85ebb4cf6452a7f7d85a"} Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.583963 5033 generic.go:334] "Generic (PLEG): container finished" podID="ef6f86d8-6fcf-41cf-b941-975937a1107f" containerID="3394bc8b51e86c92d9183a7166dc1a9e222a4f0e3d8239a1428859f05de2973d" exitCode=143 Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.583999 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6f86d8-6fcf-41cf-b941-975937a1107f","Type":"ContainerDied","Data":"3394bc8b51e86c92d9183a7166dc1a9e222a4f0e3d8239a1428859f05de2973d"} Feb 26 17:23:34 crc kubenswrapper[5033]: I0226 17:23:34.814652 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-96bfb8b67-v6fd5"] Feb 26 17:23:34 crc kubenswrapper[5033]: W0226 17:23:34.819433 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54334857_01f6_40cc_afbe_6435181a45da.slice/crio-d4f9d47b58bc3a1e0bc94dd54bdacda20ac115019afe2fae888d457830dabcc5 WatchSource:0}: Error finding container d4f9d47b58bc3a1e0bc94dd54bdacda20ac115019afe2fae888d457830dabcc5: Status 404 returned error can't find the container with id d4f9d47b58bc3a1e0bc94dd54bdacda20ac115019afe2fae888d457830dabcc5 Feb 26 17:23:35 crc kubenswrapper[5033]: I0226 17:23:35.601330 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-96bfb8b67-v6fd5" event={"ID":"54334857-01f6-40cc-afbe-6435181a45da","Type":"ContainerStarted","Data":"d4f9d47b58bc3a1e0bc94dd54bdacda20ac115019afe2fae888d457830dabcc5"} Feb 26 17:23:37 crc kubenswrapper[5033]: I0226 17:23:37.657147 5033 generic.go:334] "Generic (PLEG): container finished" podID="015270d2-adee-49f0-b82e-694fe6f99a02" containerID="4b8827271304642b8b65add542278af58cfa41d160b1f880cf6058c58ad6ab2a" exitCode=0 Feb 26 
17:23:37 crc kubenswrapper[5033]: I0226 17:23:37.662961 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"015270d2-adee-49f0-b82e-694fe6f99a02","Type":"ContainerDied","Data":"4b8827271304642b8b65add542278af58cfa41d160b1f880cf6058c58ad6ab2a"} Feb 26 17:23:37 crc kubenswrapper[5033]: I0226 17:23:37.665357 5033 generic.go:334] "Generic (PLEG): container finished" podID="ef6f86d8-6fcf-41cf-b941-975937a1107f" containerID="ede5c075a69ef5eafd8a11e1d92ef436d277ecd12d2dec6c885417a9024961db" exitCode=0 Feb 26 17:23:37 crc kubenswrapper[5033]: I0226 17:23:37.665413 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6f86d8-6fcf-41cf-b941-975937a1107f","Type":"ContainerDied","Data":"ede5c075a69ef5eafd8a11e1d92ef436d277ecd12d2dec6c885417a9024961db"} Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.632841 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.637756 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.722522 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"015270d2-adee-49f0-b82e-694fe6f99a02","Type":"ContainerDied","Data":"8cc1f22b90e8d2e242afcfd15cf162ce110524b432857cab65abd17da37ba8a4"} Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.722785 5033 scope.go:117] "RemoveContainer" containerID="4b8827271304642b8b65add542278af58cfa41d160b1f880cf6058c58ad6ab2a" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.722870 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.732013 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6f86d8-6fcf-41cf-b941-975937a1107f","Type":"ContainerDied","Data":"a7eb0a7208ae0c73ece38b5da2adb7cd97e3e026aa4a3f9e5cff6b75fcd46656"} Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.732128 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.742920 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-config-data\") pod \"015270d2-adee-49f0-b82e-694fe6f99a02\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743027 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-logs\") pod \"015270d2-adee-49f0-b82e-694fe6f99a02\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743071 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-httpd-run\") pod \"ef6f86d8-6fcf-41cf-b941-975937a1107f\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743108 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-scripts\") pod \"ef6f86d8-6fcf-41cf-b941-975937a1107f\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743150 
5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s6d8\" (UniqueName: \"kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-kube-api-access-7s6d8\") pod \"015270d2-adee-49f0-b82e-694fe6f99a02\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743190 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-ceph\") pod \"ef6f86d8-6fcf-41cf-b941-975937a1107f\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743229 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-scripts\") pod \"015270d2-adee-49f0-b82e-694fe6f99a02\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743300 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-combined-ca-bundle\") pod \"ef6f86d8-6fcf-41cf-b941-975937a1107f\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743335 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv88l\" (UniqueName: \"kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-kube-api-access-fv88l\") pod \"ef6f86d8-6fcf-41cf-b941-975937a1107f\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743354 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-logs\") pod \"ef6f86d8-6fcf-41cf-b941-975937a1107f\" (UID: 
\"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743385 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-ceph\") pod \"015270d2-adee-49f0-b82e-694fe6f99a02\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743403 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-config-data\") pod \"ef6f86d8-6fcf-41cf-b941-975937a1107f\" (UID: \"ef6f86d8-6fcf-41cf-b941-975937a1107f\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743430 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-combined-ca-bundle\") pod \"015270d2-adee-49f0-b82e-694fe6f99a02\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.743456 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-httpd-run\") pod \"015270d2-adee-49f0-b82e-694fe6f99a02\" (UID: \"015270d2-adee-49f0-b82e-694fe6f99a02\") " Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.745583 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-logs" (OuterVolumeSpecName: "logs") pod "ef6f86d8-6fcf-41cf-b941-975937a1107f" (UID: "ef6f86d8-6fcf-41cf-b941-975937a1107f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.746006 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-logs" (OuterVolumeSpecName: "logs") pod "015270d2-adee-49f0-b82e-694fe6f99a02" (UID: "015270d2-adee-49f0-b82e-694fe6f99a02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.746319 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ef6f86d8-6fcf-41cf-b941-975937a1107f" (UID: "ef6f86d8-6fcf-41cf-b941-975937a1107f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.750617 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "015270d2-adee-49f0-b82e-694fe6f99a02" (UID: "015270d2-adee-49f0-b82e-694fe6f99a02"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.751135 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-scripts" (OuterVolumeSpecName: "scripts") pod "ef6f86d8-6fcf-41cf-b941-975937a1107f" (UID: "ef6f86d8-6fcf-41cf-b941-975937a1107f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.753719 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-scripts" (OuterVolumeSpecName: "scripts") pod "015270d2-adee-49f0-b82e-694fe6f99a02" (UID: "015270d2-adee-49f0-b82e-694fe6f99a02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.766861 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-kube-api-access-7s6d8" (OuterVolumeSpecName: "kube-api-access-7s6d8") pod "015270d2-adee-49f0-b82e-694fe6f99a02" (UID: "015270d2-adee-49f0-b82e-694fe6f99a02"). InnerVolumeSpecName "kube-api-access-7s6d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.786784 5033 scope.go:117] "RemoveContainer" containerID="48829ba27463616e92bfb0c2f4536c0bf955bd61a3b84796c2b4335d01023841" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.786858 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-ceph" (OuterVolumeSpecName: "ceph") pod "ef6f86d8-6fcf-41cf-b941-975937a1107f" (UID: "ef6f86d8-6fcf-41cf-b941-975937a1107f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.787004 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-ceph" (OuterVolumeSpecName: "ceph") pod "015270d2-adee-49f0-b82e-694fe6f99a02" (UID: "015270d2-adee-49f0-b82e-694fe6f99a02"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.787109 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-kube-api-access-fv88l" (OuterVolumeSpecName: "kube-api-access-fv88l") pod "ef6f86d8-6fcf-41cf-b941-975937a1107f" (UID: "ef6f86d8-6fcf-41cf-b941-975937a1107f"). InnerVolumeSpecName "kube-api-access-fv88l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.796046 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "015270d2-adee-49f0-b82e-694fe6f99a02" (UID: "015270d2-adee-49f0-b82e-694fe6f99a02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.848794 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef6f86d8-6fcf-41cf-b941-975937a1107f" (UID: "ef6f86d8-6fcf-41cf-b941-975937a1107f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.852845 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.852897 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv88l\" (UniqueName: \"kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-kube-api-access-fv88l\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.852913 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-logs\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.852927 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.852941 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.852950 5033 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.852960 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/015270d2-adee-49f0-b82e-694fe6f99a02-logs\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.852970 5033 reconciler_common.go:293] "Volume detached for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef6f86d8-6fcf-41cf-b941-975937a1107f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.852980 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.852991 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s6d8\" (UniqueName: \"kubernetes.io/projected/015270d2-adee-49f0-b82e-694fe6f99a02-kube-api-access-7s6d8\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.853002 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ef6f86d8-6fcf-41cf-b941-975937a1107f-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.853012 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.866622 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-config-data" (OuterVolumeSpecName: "config-data") pod "ef6f86d8-6fcf-41cf-b941-975937a1107f" (UID: "ef6f86d8-6fcf-41cf-b941-975937a1107f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.889365 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-config-data" (OuterVolumeSpecName: "config-data") pod "015270d2-adee-49f0-b82e-694fe6f99a02" (UID: "015270d2-adee-49f0-b82e-694fe6f99a02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.909532 5033 scope.go:117] "RemoveContainer" containerID="ede5c075a69ef5eafd8a11e1d92ef436d277ecd12d2dec6c885417a9024961db" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.946031 5033 scope.go:117] "RemoveContainer" containerID="3394bc8b51e86c92d9183a7166dc1a9e222a4f0e3d8239a1428859f05de2973d" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.955061 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6f86d8-6fcf-41cf-b941-975937a1107f-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:42 crc kubenswrapper[5033]: I0226 17:23:42.955089 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/015270d2-adee-49f0-b82e-694fe6f99a02-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.247638 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.268550 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.313376 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.323180 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.333468 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 17:23:43 crc kubenswrapper[5033]: E0226 17:23:43.333997 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6f86d8-6fcf-41cf-b941-975937a1107f" containerName="glance-httpd" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 
17:23:43.334012 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6f86d8-6fcf-41cf-b941-975937a1107f" containerName="glance-httpd" Feb 26 17:23:43 crc kubenswrapper[5033]: E0226 17:23:43.334024 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015270d2-adee-49f0-b82e-694fe6f99a02" containerName="glance-log" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.334029 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="015270d2-adee-49f0-b82e-694fe6f99a02" containerName="glance-log" Feb 26 17:23:43 crc kubenswrapper[5033]: E0226 17:23:43.334044 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015270d2-adee-49f0-b82e-694fe6f99a02" containerName="glance-httpd" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.334050 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="015270d2-adee-49f0-b82e-694fe6f99a02" containerName="glance-httpd" Feb 26 17:23:43 crc kubenswrapper[5033]: E0226 17:23:43.334061 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6f86d8-6fcf-41cf-b941-975937a1107f" containerName="glance-log" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.334068 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6f86d8-6fcf-41cf-b941-975937a1107f" containerName="glance-log" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.334266 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef6f86d8-6fcf-41cf-b941-975937a1107f" containerName="glance-httpd" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.334279 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="015270d2-adee-49f0-b82e-694fe6f99a02" containerName="glance-httpd" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.334291 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="015270d2-adee-49f0-b82e-694fe6f99a02" containerName="glance-log" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.334301 5033 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ef6f86d8-6fcf-41cf-b941-975937a1107f" containerName="glance-log" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.335820 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.338171 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xlkhb" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.338535 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.338864 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.364994 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.367672 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.367755 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.367787 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.367802 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.368029 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d4jf\" (UniqueName: \"kubernetes.io/projected/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-kube-api-access-6d4jf\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.368093 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.368378 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-logs\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.369365 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.371704 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.378675 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.393906 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.470728 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.470821 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.470851 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.470891 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-ceph\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.470937 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.470957 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.471006 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d4jf\" (UniqueName: \"kubernetes.io/projected/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-kube-api-access-6d4jf\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.471031 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.471079 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7g7q\" (UniqueName: 
\"kubernetes.io/projected/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-kube-api-access-b7g7q\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.471113 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.471157 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-logs\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.471194 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.471233 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-logs\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.471248 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.471712 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.473233 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-logs\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.476649 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.476947 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.480329 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-ceph\") pod \"glance-default-internal-api-0\" (UID: 
\"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.484185 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.493444 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d4jf\" (UniqueName: \"kubernetes.io/projected/48fd6e88-5852-4cf0-b4f7-41a8bc0d214b-kube-api-access-6d4jf\") pod \"glance-default-internal-api-0\" (UID: \"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b\") " pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.574539 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.575672 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-logs\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.575692 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 
17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.575780 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.575811 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-ceph\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.575911 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7g7q\" (UniqueName: \"kubernetes.io/projected/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-kube-api-access-b7g7q\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.575939 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.576869 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.576892 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-logs\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.583306 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.583765 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.583488 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-ceph\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.583972 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.595311 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7g7q\" (UniqueName: 
\"kubernetes.io/projected/cdaea64e-c02c-4fdc-9b53-08cccb2085a9-kube-api-access-b7g7q\") pod \"glance-default-external-api-0\" (UID: \"cdaea64e-c02c-4fdc-9b53-08cccb2085a9\") " pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.644402 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015270d2-adee-49f0-b82e-694fe6f99a02" path="/var/lib/kubelet/pods/015270d2-adee-49f0-b82e-694fe6f99a02/volumes" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.647093 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef6f86d8-6fcf-41cf-b941-975937a1107f" path="/var/lib/kubelet/pods/ef6f86d8-6fcf-41cf-b941-975937a1107f/volumes" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.660992 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.702147 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.755885 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-96bfb8b67-v6fd5" event={"ID":"54334857-01f6-40cc-afbe-6435181a45da","Type":"ContainerStarted","Data":"70b5d377e96abb246e88d094ba7e2179c9850909132c7593a971931bc748e0f8"} Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.755932 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-96bfb8b67-v6fd5" event={"ID":"54334857-01f6-40cc-afbe-6435181a45da","Type":"ContainerStarted","Data":"427bb00d5a1c68d2f6364752dbd87b71fb5801982cf7c04a2b7821826789f982"} Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.761526 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c66d76ff-j2q6z" event={"ID":"25148f5d-0145-4162-980d-d90483f145ca","Type":"ContainerStarted","Data":"201291f19582d35b9f20a0f2a6b4c8634500cebb79ce7675e9847080193d71df"} Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.761560 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c66d76ff-j2q6z" event={"ID":"25148f5d-0145-4162-980d-d90483f145ca","Type":"ContainerStarted","Data":"7f6b8fd54de62fc692e8e5a76cbc980ab8118d33240cbb744ddcb9987feddbc0"} Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.763495 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d8ff98ff9-bffm4" event={"ID":"9ba2eba4-ba35-4045-9f1b-2315542c42f5","Type":"ContainerStarted","Data":"c4aca482a6bc34f2b25057a6cfec07ac14eca925ce127a3edee3867bfc8df8c4"} Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.763542 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d8ff98ff9-bffm4" event={"ID":"9ba2eba4-ba35-4045-9f1b-2315542c42f5","Type":"ContainerStarted","Data":"a5b7150ad49f3e596a4be1b2bb88c7aac0ee0a3faa39cd5deded96c81693dd10"} Feb 26 17:23:43 crc kubenswrapper[5033]: I0226 17:23:43.781628 
5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-96bfb8b67-v6fd5" podStartSLOduration=2.961632593 podStartE2EDuration="10.78160999s" podCreationTimestamp="2026-02-26 17:23:33 +0000 UTC" firstStartedPulling="2026-02-26 17:23:34.821160206 +0000 UTC m=+6142.033758856" lastFinishedPulling="2026-02-26 17:23:42.641137603 +0000 UTC m=+6149.853736253" observedRunningTime="2026-02-26 17:23:43.781333522 +0000 UTC m=+6150.993932172" watchObservedRunningTime="2026-02-26 17:23:43.78160999 +0000 UTC m=+6150.994208640" Feb 26 17:23:44 crc kubenswrapper[5033]: I0226 17:23:44.324285 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 17:23:44 crc kubenswrapper[5033]: I0226 17:23:44.335984 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:44 crc kubenswrapper[5033]: I0226 17:23:44.337007 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:23:44 crc kubenswrapper[5033]: I0226 17:23:44.396485 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 17:23:44 crc kubenswrapper[5033]: I0226 17:23:44.776468 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdaea64e-c02c-4fdc-9b53-08cccb2085a9","Type":"ContainerStarted","Data":"b61d8fc4e1ef73030b55a0b8a2394deaee58c533296d737b7d5e6abf7d23deda"} Feb 26 17:23:44 crc kubenswrapper[5033]: I0226 17:23:44.778741 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b","Type":"ContainerStarted","Data":"858a4345992e14c64e0795400c4660c4cfc16e4fdaad4c2d03acce91dccf72f9"} Feb 26 17:23:44 crc kubenswrapper[5033]: I0226 17:23:44.779243 5033 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/horizon-7d8ff98ff9-bffm4" podUID="9ba2eba4-ba35-4045-9f1b-2315542c42f5" containerName="horizon" containerID="cri-o://c4aca482a6bc34f2b25057a6cfec07ac14eca925ce127a3edee3867bfc8df8c4" gracePeriod=30 Feb 26 17:23:44 crc kubenswrapper[5033]: I0226 17:23:44.779224 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d8ff98ff9-bffm4" podUID="9ba2eba4-ba35-4045-9f1b-2315542c42f5" containerName="horizon-log" containerID="cri-o://a5b7150ad49f3e596a4be1b2bb88c7aac0ee0a3faa39cd5deded96c81693dd10" gracePeriod=30 Feb 26 17:23:44 crc kubenswrapper[5033]: I0226 17:23:44.805635 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8c66d76ff-j2q6z" podStartSLOduration=3.488828949 podStartE2EDuration="11.805613671s" podCreationTimestamp="2026-02-26 17:23:33 +0000 UTC" firstStartedPulling="2026-02-26 17:23:34.257224831 +0000 UTC m=+6141.469823481" lastFinishedPulling="2026-02-26 17:23:42.574009553 +0000 UTC m=+6149.786608203" observedRunningTime="2026-02-26 17:23:44.801069511 +0000 UTC m=+6152.013668161" watchObservedRunningTime="2026-02-26 17:23:44.805613671 +0000 UTC m=+6152.018212321" Feb 26 17:23:44 crc kubenswrapper[5033]: I0226 17:23:44.826830 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d8ff98ff9-bffm4" podStartSLOduration=3.6882023090000002 podStartE2EDuration="11.826811829s" podCreationTimestamp="2026-02-26 17:23:33 +0000 UTC" firstStartedPulling="2026-02-26 17:23:34.502283037 +0000 UTC m=+6141.714881687" lastFinishedPulling="2026-02-26 17:23:42.640892557 +0000 UTC m=+6149.853491207" observedRunningTime="2026-02-26 17:23:44.817634925 +0000 UTC m=+6152.030233575" watchObservedRunningTime="2026-02-26 17:23:44.826811829 +0000 UTC m=+6152.039410479" Feb 26 17:23:45 crc kubenswrapper[5033]: I0226 17:23:45.788474 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b","Type":"ContainerStarted","Data":"4f4c896d28d6d9488369116ff30ba7e69a4a5382441a72b723c56d46fb62b818"} Feb 26 17:23:45 crc kubenswrapper[5033]: I0226 17:23:45.788944 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"48fd6e88-5852-4cf0-b4f7-41a8bc0d214b","Type":"ContainerStarted","Data":"c79bc1caa7d437e324454e8fc30372ff5ff863c1d81e7aba6658117d8f0051c8"} Feb 26 17:23:45 crc kubenswrapper[5033]: I0226 17:23:45.796612 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdaea64e-c02c-4fdc-9b53-08cccb2085a9","Type":"ContainerStarted","Data":"d34189e0daab1e244e6cfd3ed6d78babd343acac1f54eebdcc3d50434e3cfa53"} Feb 26 17:23:45 crc kubenswrapper[5033]: I0226 17:23:45.796637 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdaea64e-c02c-4fdc-9b53-08cccb2085a9","Type":"ContainerStarted","Data":"e1687b8604bcff491cc90dd07adde61c3a707983decb482c4981434bb6db3af4"} Feb 26 17:23:45 crc kubenswrapper[5033]: I0226 17:23:45.804575 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.804565442 podStartE2EDuration="2.804565442s" podCreationTimestamp="2026-02-26 17:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:23:45.802589273 +0000 UTC m=+6153.015187923" watchObservedRunningTime="2026-02-26 17:23:45.804565442 +0000 UTC m=+6153.017164092" Feb 26 17:23:45 crc kubenswrapper[5033]: I0226 17:23:45.834396 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.83437818 podStartE2EDuration="2.83437818s" podCreationTimestamp="2026-02-26 17:23:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:23:45.828978128 +0000 UTC m=+6153.041576798" watchObservedRunningTime="2026-02-26 17:23:45.83437818 +0000 UTC m=+6153.046976830" Feb 26 17:23:48 crc kubenswrapper[5033]: I0226 17:23:48.037780 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-g4txj"] Feb 26 17:23:48 crc kubenswrapper[5033]: I0226 17:23:48.049269 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-eeae-account-create-update-z7z4m"] Feb 26 17:23:48 crc kubenswrapper[5033]: I0226 17:23:48.058758 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-g4txj"] Feb 26 17:23:48 crc kubenswrapper[5033]: I0226 17:23:48.070428 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-eeae-account-create-update-z7z4m"] Feb 26 17:23:49 crc kubenswrapper[5033]: I0226 17:23:49.639740 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39bf9167-60ba-4040-aa99-a636b88e6a0b" path="/var/lib/kubelet/pods/39bf9167-60ba-4040-aa99-a636b88e6a0b/volumes" Feb 26 17:23:49 crc kubenswrapper[5033]: I0226 17:23:49.721029 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bac90d32-0947-4b73-9a7d-c1e6c16bdb85" path="/var/lib/kubelet/pods/bac90d32-0947-4b73-9a7d-c1e6c16bdb85/volumes" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.027785 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-8q9fh"] Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.040697 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-8q9fh"] Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.641936 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe38b338-9cef-49ae-a3f5-356667495c68" 
path="/var/lib/kubelet/pods/fe38b338-9cef-49ae-a3f5-356667495c68/volumes" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.645168 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.645373 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.647794 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8c66d76ff-j2q6z" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.148:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.148:8080: connect: connection refused" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.662906 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.662962 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.697895 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.702937 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.702999 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.714111 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.754107 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.757288 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.888464 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.888504 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.888518 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.888529 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:53 crc kubenswrapper[5033]: I0226 17:23:53.969617 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:23:54 crc kubenswrapper[5033]: I0226 17:23:54.338174 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-96bfb8b67-v6fd5" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.150:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.150:8080: connect: connection refused" Feb 26 17:23:55 crc kubenswrapper[5033]: I0226 17:23:55.901171 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:55 crc kubenswrapper[5033]: I0226 17:23:55.907146 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 17:23:55 crc kubenswrapper[5033]: I0226 17:23:55.907174 5033 prober_manager.go:312] "Failed to 
trigger a manual run" probe="Readiness" Feb 26 17:23:55 crc kubenswrapper[5033]: I0226 17:23:55.907215 5033 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 17:23:56 crc kubenswrapper[5033]: I0226 17:23:56.001337 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 17:23:56 crc kubenswrapper[5033]: I0226 17:23:56.110203 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 17:23:56 crc kubenswrapper[5033]: I0226 17:23:56.408007 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 17:24:00 crc kubenswrapper[5033]: I0226 17:24:00.182010 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535444-bvrf5"] Feb 26 17:24:00 crc kubenswrapper[5033]: I0226 17:24:00.197864 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535444-bvrf5" Feb 26 17:24:00 crc kubenswrapper[5033]: I0226 17:24:00.204602 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:24:00 crc kubenswrapper[5033]: I0226 17:24:00.204812 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:24:00 crc kubenswrapper[5033]: I0226 17:24:00.204869 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:24:00 crc kubenswrapper[5033]: I0226 17:24:00.225570 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535444-bvrf5"] Feb 26 17:24:00 crc kubenswrapper[5033]: I0226 17:24:00.344126 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48llt\" (UniqueName: \"kubernetes.io/projected/975decb9-6350-490b-96d1-f239d388f7f2-kube-api-access-48llt\") pod \"auto-csr-approver-29535444-bvrf5\" (UID: \"975decb9-6350-490b-96d1-f239d388f7f2\") " pod="openshift-infra/auto-csr-approver-29535444-bvrf5" Feb 26 17:24:00 crc kubenswrapper[5033]: I0226 17:24:00.446681 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48llt\" (UniqueName: \"kubernetes.io/projected/975decb9-6350-490b-96d1-f239d388f7f2-kube-api-access-48llt\") pod \"auto-csr-approver-29535444-bvrf5\" (UID: \"975decb9-6350-490b-96d1-f239d388f7f2\") " pod="openshift-infra/auto-csr-approver-29535444-bvrf5" Feb 26 17:24:00 crc kubenswrapper[5033]: I0226 17:24:00.467230 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48llt\" (UniqueName: \"kubernetes.io/projected/975decb9-6350-490b-96d1-f239d388f7f2-kube-api-access-48llt\") pod \"auto-csr-approver-29535444-bvrf5\" (UID: \"975decb9-6350-490b-96d1-f239d388f7f2\") " 
pod="openshift-infra/auto-csr-approver-29535444-bvrf5" Feb 26 17:24:00 crc kubenswrapper[5033]: I0226 17:24:00.527852 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535444-bvrf5" Feb 26 17:24:01 crc kubenswrapper[5033]: I0226 17:24:01.035608 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535444-bvrf5"] Feb 26 17:24:01 crc kubenswrapper[5033]: W0226 17:24:01.041757 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod975decb9_6350_490b_96d1_f239d388f7f2.slice/crio-1bddc9e4bb2dc1336a2728a3f558542f87baa8ddb92357d3d326199bcabaf000 WatchSource:0}: Error finding container 1bddc9e4bb2dc1336a2728a3f558542f87baa8ddb92357d3d326199bcabaf000: Status 404 returned error can't find the container with id 1bddc9e4bb2dc1336a2728a3f558542f87baa8ddb92357d3d326199bcabaf000 Feb 26 17:24:01 crc kubenswrapper[5033]: I0226 17:24:01.965393 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535444-bvrf5" event={"ID":"975decb9-6350-490b-96d1-f239d388f7f2","Type":"ContainerStarted","Data":"1bddc9e4bb2dc1336a2728a3f558542f87baa8ddb92357d3d326199bcabaf000"} Feb 26 17:24:02 crc kubenswrapper[5033]: I0226 17:24:02.976642 5033 generic.go:334] "Generic (PLEG): container finished" podID="975decb9-6350-490b-96d1-f239d388f7f2" containerID="d8e3eb717cc3b82058101de525cb5fcee9d71f91285b8d7a5b4582e414ff105f" exitCode=0 Feb 26 17:24:02 crc kubenswrapper[5033]: I0226 17:24:02.977100 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535444-bvrf5" event={"ID":"975decb9-6350-490b-96d1-f239d388f7f2","Type":"ContainerDied","Data":"d8e3eb717cc3b82058101de525cb5fcee9d71f91285b8d7a5b4582e414ff105f"} Feb 26 17:24:04 crc kubenswrapper[5033]: I0226 17:24:04.362039 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535444-bvrf5" Feb 26 17:24:04 crc kubenswrapper[5033]: I0226 17:24:04.432289 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48llt\" (UniqueName: \"kubernetes.io/projected/975decb9-6350-490b-96d1-f239d388f7f2-kube-api-access-48llt\") pod \"975decb9-6350-490b-96d1-f239d388f7f2\" (UID: \"975decb9-6350-490b-96d1-f239d388f7f2\") " Feb 26 17:24:04 crc kubenswrapper[5033]: I0226 17:24:04.454559 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975decb9-6350-490b-96d1-f239d388f7f2-kube-api-access-48llt" (OuterVolumeSpecName: "kube-api-access-48llt") pod "975decb9-6350-490b-96d1-f239d388f7f2" (UID: "975decb9-6350-490b-96d1-f239d388f7f2"). InnerVolumeSpecName "kube-api-access-48llt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:24:04 crc kubenswrapper[5033]: I0226 17:24:04.534724 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48llt\" (UniqueName: \"kubernetes.io/projected/975decb9-6350-490b-96d1-f239d388f7f2-kube-api-access-48llt\") on node \"crc\" DevicePath \"\"" Feb 26 17:24:04 crc kubenswrapper[5033]: I0226 17:24:04.996199 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535444-bvrf5" event={"ID":"975decb9-6350-490b-96d1-f239d388f7f2","Type":"ContainerDied","Data":"1bddc9e4bb2dc1336a2728a3f558542f87baa8ddb92357d3d326199bcabaf000"} Feb 26 17:24:04 crc kubenswrapper[5033]: I0226 17:24:04.996244 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bddc9e4bb2dc1336a2728a3f558542f87baa8ddb92357d3d326199bcabaf000" Feb 26 17:24:04 crc kubenswrapper[5033]: I0226 17:24:04.996306 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535444-bvrf5" Feb 26 17:24:05 crc kubenswrapper[5033]: I0226 17:24:05.440237 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535438-2j87p"] Feb 26 17:24:05 crc kubenswrapper[5033]: I0226 17:24:05.449577 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535438-2j87p"] Feb 26 17:24:05 crc kubenswrapper[5033]: I0226 17:24:05.560224 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:24:05 crc kubenswrapper[5033]: I0226 17:24:05.637337 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a64692f6-539c-4f00-8aeb-9987edf929b4" path="/var/lib/kubelet/pods/a64692f6-539c-4f00-8aeb-9987edf929b4/volumes" Feb 26 17:24:06 crc kubenswrapper[5033]: I0226 17:24:06.144996 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:24:07 crc kubenswrapper[5033]: I0226 17:24:07.126969 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8c66d76ff-j2q6z" Feb 26 17:24:07 crc kubenswrapper[5033]: I0226 17:24:07.830306 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:24:07 crc kubenswrapper[5033]: I0226 17:24:07.909192 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8c66d76ff-j2q6z"] Feb 26 17:24:08 crc kubenswrapper[5033]: I0226 17:24:08.023184 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8c66d76ff-j2q6z" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon-log" containerID="cri-o://7f6b8fd54de62fc692e8e5a76cbc980ab8118d33240cbb744ddcb9987feddbc0" gracePeriod=30 Feb 26 17:24:08 crc kubenswrapper[5033]: I0226 17:24:08.023292 5033 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/horizon-8c66d76ff-j2q6z" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon" containerID="cri-o://201291f19582d35b9f20a0f2a6b4c8634500cebb79ce7675e9847080193d71df" gracePeriod=30 Feb 26 17:24:12 crc kubenswrapper[5033]: I0226 17:24:12.083147 5033 generic.go:334] "Generic (PLEG): container finished" podID="25148f5d-0145-4162-980d-d90483f145ca" containerID="201291f19582d35b9f20a0f2a6b4c8634500cebb79ce7675e9847080193d71df" exitCode=0 Feb 26 17:24:12 crc kubenswrapper[5033]: I0226 17:24:12.083273 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c66d76ff-j2q6z" event={"ID":"25148f5d-0145-4162-980d-d90483f145ca","Type":"ContainerDied","Data":"201291f19582d35b9f20a0f2a6b4c8634500cebb79ce7675e9847080193d71df"} Feb 26 17:24:13 crc kubenswrapper[5033]: I0226 17:24:13.645785 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8c66d76ff-j2q6z" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.148:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.148:8080: connect: connection refused" Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.114079 5033 generic.go:334] "Generic (PLEG): container finished" podID="9ba2eba4-ba35-4045-9f1b-2315542c42f5" containerID="c4aca482a6bc34f2b25057a6cfec07ac14eca925ce127a3edee3867bfc8df8c4" exitCode=137 Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.114961 5033 generic.go:334] "Generic (PLEG): container finished" podID="9ba2eba4-ba35-4045-9f1b-2315542c42f5" containerID="a5b7150ad49f3e596a4be1b2bb88c7aac0ee0a3faa39cd5deded96c81693dd10" exitCode=137 Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.114322 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d8ff98ff9-bffm4" 
event={"ID":"9ba2eba4-ba35-4045-9f1b-2315542c42f5","Type":"ContainerDied","Data":"c4aca482a6bc34f2b25057a6cfec07ac14eca925ce127a3edee3867bfc8df8c4"} Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.115014 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d8ff98ff9-bffm4" event={"ID":"9ba2eba4-ba35-4045-9f1b-2315542c42f5","Type":"ContainerDied","Data":"a5b7150ad49f3e596a4be1b2bb88c7aac0ee0a3faa39cd5deded96c81693dd10"} Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.278666 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d8ff98ff9-bffm4" Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.364461 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba2eba4-ba35-4045-9f1b-2315542c42f5-logs\") pod \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.364712 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-config-data\") pod \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.364805 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-scripts\") pod \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.364869 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85ztp\" (UniqueName: \"kubernetes.io/projected/9ba2eba4-ba35-4045-9f1b-2315542c42f5-kube-api-access-85ztp\") pod \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\" (UID: 
\"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.364946 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ba2eba4-ba35-4045-9f1b-2315542c42f5-horizon-secret-key\") pod \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\" (UID: \"9ba2eba4-ba35-4045-9f1b-2315542c42f5\") " Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.366222 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ba2eba4-ba35-4045-9f1b-2315542c42f5-logs" (OuterVolumeSpecName: "logs") pod "9ba2eba4-ba35-4045-9f1b-2315542c42f5" (UID: "9ba2eba4-ba35-4045-9f1b-2315542c42f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.371448 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba2eba4-ba35-4045-9f1b-2315542c42f5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9ba2eba4-ba35-4045-9f1b-2315542c42f5" (UID: "9ba2eba4-ba35-4045-9f1b-2315542c42f5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.371456 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba2eba4-ba35-4045-9f1b-2315542c42f5-kube-api-access-85ztp" (OuterVolumeSpecName: "kube-api-access-85ztp") pod "9ba2eba4-ba35-4045-9f1b-2315542c42f5" (UID: "9ba2eba4-ba35-4045-9f1b-2315542c42f5"). InnerVolumeSpecName "kube-api-access-85ztp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.394701 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-scripts" (OuterVolumeSpecName: "scripts") pod "9ba2eba4-ba35-4045-9f1b-2315542c42f5" (UID: "9ba2eba4-ba35-4045-9f1b-2315542c42f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.405586 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-config-data" (OuterVolumeSpecName: "config-data") pod "9ba2eba4-ba35-4045-9f1b-2315542c42f5" (UID: "9ba2eba4-ba35-4045-9f1b-2315542c42f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.467770 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85ztp\" (UniqueName: \"kubernetes.io/projected/9ba2eba4-ba35-4045-9f1b-2315542c42f5-kube-api-access-85ztp\") on node \"crc\" DevicePath \"\"" Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.467812 5033 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ba2eba4-ba35-4045-9f1b-2315542c42f5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.467825 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba2eba4-ba35-4045-9f1b-2315542c42f5-logs\") on node \"crc\" DevicePath \"\"" Feb 26 17:24:15 crc kubenswrapper[5033]: I0226 17:24:15.467841 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:24:15 crc 
kubenswrapper[5033]: I0226 17:24:15.467853 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ba2eba4-ba35-4045-9f1b-2315542c42f5-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:24:16 crc kubenswrapper[5033]: I0226 17:24:16.129442 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d8ff98ff9-bffm4" event={"ID":"9ba2eba4-ba35-4045-9f1b-2315542c42f5","Type":"ContainerDied","Data":"0018c4ac26f02479a7241d2a27525cc606427a97d5cb85ebb4cf6452a7f7d85a"}
Feb 26 17:24:16 crc kubenswrapper[5033]: I0226 17:24:16.129499 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d8ff98ff9-bffm4"
Feb 26 17:24:16 crc kubenswrapper[5033]: I0226 17:24:16.129505 5033 scope.go:117] "RemoveContainer" containerID="c4aca482a6bc34f2b25057a6cfec07ac14eca925ce127a3edee3867bfc8df8c4"
Feb 26 17:24:16 crc kubenswrapper[5033]: I0226 17:24:16.166600 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d8ff98ff9-bffm4"]
Feb 26 17:24:16 crc kubenswrapper[5033]: I0226 17:24:16.176825 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7d8ff98ff9-bffm4"]
Feb 26 17:24:16 crc kubenswrapper[5033]: I0226 17:24:16.357693 5033 scope.go:117] "RemoveContainer" containerID="a5b7150ad49f3e596a4be1b2bb88c7aac0ee0a3faa39cd5deded96c81693dd10"
Feb 26 17:24:17 crc kubenswrapper[5033]: I0226 17:24:17.653558 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba2eba4-ba35-4045-9f1b-2315542c42f5" path="/var/lib/kubelet/pods/9ba2eba4-ba35-4045-9f1b-2315542c42f5/volumes"
Feb 26 17:24:20 crc kubenswrapper[5033]: I0226 17:24:20.189337 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 17:24:20 crc kubenswrapper[5033]: I0226 17:24:20.189737 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 17:24:22 crc kubenswrapper[5033]: I0226 17:24:22.048105 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-90ef-account-create-update-qqnl6"]
Feb 26 17:24:22 crc kubenswrapper[5033]: I0226 17:24:22.061426 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2cdrs"]
Feb 26 17:24:22 crc kubenswrapper[5033]: I0226 17:24:22.072526 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2cdrs"]
Feb 26 17:24:22 crc kubenswrapper[5033]: I0226 17:24:22.082870 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-90ef-account-create-update-qqnl6"]
Feb 26 17:24:23 crc kubenswrapper[5033]: I0226 17:24:23.640839 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56674e29-05ce-4200-94bf-26173801be40" path="/var/lib/kubelet/pods/56674e29-05ce-4200-94bf-26173801be40/volumes"
Feb 26 17:24:23 crc kubenswrapper[5033]: I0226 17:24:23.642411 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88dbb18e-b5df-435b-899c-36247cb9b20c" path="/var/lib/kubelet/pods/88dbb18e-b5df-435b-899c-36247cb9b20c/volumes"
Feb 26 17:24:23 crc kubenswrapper[5033]: I0226 17:24:23.645240 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8c66d76ff-j2q6z" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.148:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.148:8080: connect: connection refused"
Feb 26 17:24:31 crc kubenswrapper[5033]: I0226 17:24:31.044383 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-s7bnh"]
Feb 26 17:24:31 crc kubenswrapper[5033]: I0226 17:24:31.055370 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-s7bnh"]
Feb 26 17:24:31 crc kubenswrapper[5033]: I0226 17:24:31.641350 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2496e558-4fc2-4065-8065-f08ee04e4ee2" path="/var/lib/kubelet/pods/2496e558-4fc2-4065-8065-f08ee04e4ee2/volumes"
Feb 26 17:24:33 crc kubenswrapper[5033]: I0226 17:24:33.646247 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8c66d76ff-j2q6z" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.148:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.148:8080: connect: connection refused"
Feb 26 17:24:33 crc kubenswrapper[5033]: I0226 17:24:33.646722 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8c66d76ff-j2q6z"
Feb 26 17:24:35 crc kubenswrapper[5033]: I0226 17:24:35.835074 5033 scope.go:117] "RemoveContainer" containerID="dd0ff1c953a23b10a8da8fa12693876185e8f46928148f2114271d977c075db0"
Feb 26 17:24:35 crc kubenswrapper[5033]: I0226 17:24:35.866454 5033 scope.go:117] "RemoveContainer" containerID="0b6fe038fb10d3f42b68369112bd2a627f278d8d71427b0ec76eb5a5b917e43c"
Feb 26 17:24:35 crc kubenswrapper[5033]: I0226 17:24:35.898202 5033 scope.go:117] "RemoveContainer" containerID="f9bf7194a0d7d166dd88cbf6a26439db02c4bf0a3587748b84b35f4ab5d18827"
Feb 26 17:24:35 crc kubenswrapper[5033]: I0226 17:24:35.953636 5033 scope.go:117] "RemoveContainer" containerID="7b3e9a078a6553640a455680af4da5f83837f942261c5dd428bd423b4afa1ece"
Feb 26 17:24:36 crc kubenswrapper[5033]: I0226 17:24:36.043785 5033 scope.go:117] "RemoveContainer" containerID="3a43d91c6aa1b8b4a760a1a3bd2b908d27c8715640bf845a07165f8a2cdae545"
Feb 26 17:24:36 crc kubenswrapper[5033]: I0226 17:24:36.065675 5033 scope.go:117] "RemoveContainer" containerID="16d2b353304092e7f233aaa326da63f3100bfd4b1eb8e11fa1712531b4ba4c0b"
Feb 26 17:24:36 crc kubenswrapper[5033]: I0226 17:24:36.147472 5033 scope.go:117] "RemoveContainer" containerID="70c8726ee065ee26976877ae88ccb44e6a3d523959c097924c8c9d7b80f3f227"
Feb 26 17:24:36 crc kubenswrapper[5033]: I0226 17:24:36.175653 5033 scope.go:117] "RemoveContainer" containerID="1f2d786f168748fdbe0f8ee2b4862967a699848d1e2e43790573b71ab7a3cefd"
Feb 26 17:24:36 crc kubenswrapper[5033]: I0226 17:24:36.201311 5033 scope.go:117] "RemoveContainer" containerID="a4ad016ba9768939669d0306245875217ac01494167e2c66d6fce09e0484935e"
Feb 26 17:24:36 crc kubenswrapper[5033]: I0226 17:24:36.221207 5033 scope.go:117] "RemoveContainer" containerID="c6eea805f59d3a0375029d5c1708601585eb15dbef44dd30539d72b7435b3dd0"
Feb 26 17:24:36 crc kubenswrapper[5033]: I0226 17:24:36.247290 5033 scope.go:117] "RemoveContainer" containerID="0e210b28565fc74cc85b71847bee6ca2ef327c7140062ebfc96129304458ae61"
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.388056 5033 generic.go:334] "Generic (PLEG): container finished" podID="25148f5d-0145-4162-980d-d90483f145ca" containerID="7f6b8fd54de62fc692e8e5a76cbc980ab8118d33240cbb744ddcb9987feddbc0" exitCode=137
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.388186 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c66d76ff-j2q6z" event={"ID":"25148f5d-0145-4162-980d-d90483f145ca","Type":"ContainerDied","Data":"7f6b8fd54de62fc692e8e5a76cbc980ab8118d33240cbb744ddcb9987feddbc0"}
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.709384 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c66d76ff-j2q6z"
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.882218 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25148f5d-0145-4162-980d-d90483f145ca-logs\") pod \"25148f5d-0145-4162-980d-d90483f145ca\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") "
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.882285 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frmhv\" (UniqueName: \"kubernetes.io/projected/25148f5d-0145-4162-980d-d90483f145ca-kube-api-access-frmhv\") pod \"25148f5d-0145-4162-980d-d90483f145ca\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") "
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.882331 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/25148f5d-0145-4162-980d-d90483f145ca-horizon-secret-key\") pod \"25148f5d-0145-4162-980d-d90483f145ca\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") "
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.882357 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-config-data\") pod \"25148f5d-0145-4162-980d-d90483f145ca\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") "
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.882491 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-scripts\") pod \"25148f5d-0145-4162-980d-d90483f145ca\" (UID: \"25148f5d-0145-4162-980d-d90483f145ca\") "
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.882758 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25148f5d-0145-4162-980d-d90483f145ca-logs" (OuterVolumeSpecName: "logs") pod "25148f5d-0145-4162-980d-d90483f145ca" (UID: "25148f5d-0145-4162-980d-d90483f145ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.883212 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25148f5d-0145-4162-980d-d90483f145ca-logs\") on node \"crc\" DevicePath \"\""
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.892415 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25148f5d-0145-4162-980d-d90483f145ca-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "25148f5d-0145-4162-980d-d90483f145ca" (UID: "25148f5d-0145-4162-980d-d90483f145ca"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.896848 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25148f5d-0145-4162-980d-d90483f145ca-kube-api-access-frmhv" (OuterVolumeSpecName: "kube-api-access-frmhv") pod "25148f5d-0145-4162-980d-d90483f145ca" (UID: "25148f5d-0145-4162-980d-d90483f145ca"). InnerVolumeSpecName "kube-api-access-frmhv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.914405 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-scripts" (OuterVolumeSpecName: "scripts") pod "25148f5d-0145-4162-980d-d90483f145ca" (UID: "25148f5d-0145-4162-980d-d90483f145ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.914825 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-config-data" (OuterVolumeSpecName: "config-data") pod "25148f5d-0145-4162-980d-d90483f145ca" (UID: "25148f5d-0145-4162-980d-d90483f145ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.985273 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.985311 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frmhv\" (UniqueName: \"kubernetes.io/projected/25148f5d-0145-4162-980d-d90483f145ca-kube-api-access-frmhv\") on node \"crc\" DevicePath \"\""
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.985323 5033 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/25148f5d-0145-4162-980d-d90483f145ca-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 26 17:24:38 crc kubenswrapper[5033]: I0226 17:24:38.985334 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25148f5d-0145-4162-980d-d90483f145ca-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:24:39 crc kubenswrapper[5033]: I0226 17:24:39.399425 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c66d76ff-j2q6z" event={"ID":"25148f5d-0145-4162-980d-d90483f145ca","Type":"ContainerDied","Data":"ea8ca6baa99c8efc37ce40060d356e3d8bb082107bb26995d32e2a8e79751179"}
Feb 26 17:24:39 crc kubenswrapper[5033]: I0226 17:24:39.399500 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c66d76ff-j2q6z"
Feb 26 17:24:39 crc kubenswrapper[5033]: I0226 17:24:39.399772 5033 scope.go:117] "RemoveContainer" containerID="201291f19582d35b9f20a0f2a6b4c8634500cebb79ce7675e9847080193d71df"
Feb 26 17:24:39 crc kubenswrapper[5033]: I0226 17:24:39.442802 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8c66d76ff-j2q6z"]
Feb 26 17:24:39 crc kubenswrapper[5033]: I0226 17:24:39.459905 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8c66d76ff-j2q6z"]
Feb 26 17:24:39 crc kubenswrapper[5033]: I0226 17:24:39.592038 5033 scope.go:117] "RemoveContainer" containerID="7f6b8fd54de62fc692e8e5a76cbc980ab8118d33240cbb744ddcb9987feddbc0"
Feb 26 17:24:39 crc kubenswrapper[5033]: I0226 17:24:39.643491 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25148f5d-0145-4162-980d-d90483f145ca" path="/var/lib/kubelet/pods/25148f5d-0145-4162-980d-d90483f145ca/volumes"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.181478 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7879589cbf-mbc9r"]
Feb 26 17:24:50 crc kubenswrapper[5033]: E0226 17:24:50.182616 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon-log"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.182637 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon-log"
Feb 26 17:24:50 crc kubenswrapper[5033]: E0226 17:24:50.182659 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.182667 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon"
Feb 26 17:24:50 crc kubenswrapper[5033]: E0226 17:24:50.182690 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975decb9-6350-490b-96d1-f239d388f7f2" containerName="oc"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.182699 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="975decb9-6350-490b-96d1-f239d388f7f2" containerName="oc"
Feb 26 17:24:50 crc kubenswrapper[5033]: E0226 17:24:50.182717 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba2eba4-ba35-4045-9f1b-2315542c42f5" containerName="horizon"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.182725 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba2eba4-ba35-4045-9f1b-2315542c42f5" containerName="horizon"
Feb 26 17:24:50 crc kubenswrapper[5033]: E0226 17:24:50.182751 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba2eba4-ba35-4045-9f1b-2315542c42f5" containerName="horizon-log"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.182758 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba2eba4-ba35-4045-9f1b-2315542c42f5" containerName="horizon-log"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.182986 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.183003 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba2eba4-ba35-4045-9f1b-2315542c42f5" containerName="horizon"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.183012 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="25148f5d-0145-4162-980d-d90483f145ca" containerName="horizon-log"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.183031 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba2eba4-ba35-4045-9f1b-2315542c42f5" containerName="horizon-log"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.183043 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="975decb9-6350-490b-96d1-f239d388f7f2" containerName="oc"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.184377 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.188710 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.188897 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.196883 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7879589cbf-mbc9r"]
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.260130 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05be8330-3b07-4f80-978e-11db638c80ac-scripts\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.260207 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/05be8330-3b07-4f80-978e-11db638c80ac-horizon-secret-key\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.260240 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05be8330-3b07-4f80-978e-11db638c80ac-logs\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.260286 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05be8330-3b07-4f80-978e-11db638c80ac-config-data\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.260418 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp94c\" (UniqueName: \"kubernetes.io/projected/05be8330-3b07-4f80-978e-11db638c80ac-kube-api-access-kp94c\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.362349 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/05be8330-3b07-4f80-978e-11db638c80ac-horizon-secret-key\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.362418 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05be8330-3b07-4f80-978e-11db638c80ac-logs\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.362460 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05be8330-3b07-4f80-978e-11db638c80ac-config-data\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.362569 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp94c\" (UniqueName: \"kubernetes.io/projected/05be8330-3b07-4f80-978e-11db638c80ac-kube-api-access-kp94c\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.362651 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05be8330-3b07-4f80-978e-11db638c80ac-scripts\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.363178 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05be8330-3b07-4f80-978e-11db638c80ac-logs\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.363728 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05be8330-3b07-4f80-978e-11db638c80ac-scripts\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.364406 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05be8330-3b07-4f80-978e-11db638c80ac-config-data\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.368731 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/05be8330-3b07-4f80-978e-11db638c80ac-horizon-secret-key\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.381175 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp94c\" (UniqueName: \"kubernetes.io/projected/05be8330-3b07-4f80-978e-11db638c80ac-kube-api-access-kp94c\") pod \"horizon-7879589cbf-mbc9r\" (UID: \"05be8330-3b07-4f80-978e-11db638c80ac\") " pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:50 crc kubenswrapper[5033]: I0226 17:24:50.516945 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7879589cbf-mbc9r"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.034804 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7879589cbf-mbc9r"]
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.510013 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7879589cbf-mbc9r" event={"ID":"05be8330-3b07-4f80-978e-11db638c80ac","Type":"ContainerStarted","Data":"9a2689edec67782956836751c27328809a815fb069b0a29f0a37e58eb1057a14"}
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.510391 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7879589cbf-mbc9r" event={"ID":"05be8330-3b07-4f80-978e-11db638c80ac","Type":"ContainerStarted","Data":"ab26db9e911e3d812812a7efb9f583009c45dcfc3ca99b33970dc3ea8f359c21"}
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.510407 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7879589cbf-mbc9r" event={"ID":"05be8330-3b07-4f80-978e-11db638c80ac","Type":"ContainerStarted","Data":"26d79c9ab90790ced97933c01efa573101d6e3ca61f0e5691a9dd28329b1f8d1"}
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.533330 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7879589cbf-mbc9r" podStartSLOduration=1.533300091 podStartE2EDuration="1.533300091s" podCreationTimestamp="2026-02-26 17:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:24:51.527639993 +0000 UTC m=+6218.740238653" watchObservedRunningTime="2026-02-26 17:24:51.533300091 +0000 UTC m=+6218.745898751"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.660866 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-drxqc"]
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.663006 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-drxqc"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.671862 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-drxqc"]
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.769682 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-dc7a-account-create-update-4c7gz"]
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.771554 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-dc7a-account-create-update-4c7gz"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.774454 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.789247 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f351bf98-a88f-4ff7-a97b-b86356235a46-operator-scripts\") pod \"heat-db-create-drxqc\" (UID: \"f351bf98-a88f-4ff7-a97b-b86356235a46\") " pod="openstack/heat-db-create-drxqc"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.789372 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2qw\" (UniqueName: \"kubernetes.io/projected/f351bf98-a88f-4ff7-a97b-b86356235a46-kube-api-access-tl2qw\") pod \"heat-db-create-drxqc\" (UID: \"f351bf98-a88f-4ff7-a97b-b86356235a46\") " pod="openstack/heat-db-create-drxqc"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.791637 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-dc7a-account-create-update-4c7gz"]
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.891601 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98c9c0b9-1c57-4874-ba89-996a3658c80c-operator-scripts\") pod \"heat-dc7a-account-create-update-4c7gz\" (UID: \"98c9c0b9-1c57-4874-ba89-996a3658c80c\") " pod="openstack/heat-dc7a-account-create-update-4c7gz"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.891668 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f351bf98-a88f-4ff7-a97b-b86356235a46-operator-scripts\") pod \"heat-db-create-drxqc\" (UID: \"f351bf98-a88f-4ff7-a97b-b86356235a46\") " pod="openstack/heat-db-create-drxqc"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.891696 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nqb8\" (UniqueName: \"kubernetes.io/projected/98c9c0b9-1c57-4874-ba89-996a3658c80c-kube-api-access-8nqb8\") pod \"heat-dc7a-account-create-update-4c7gz\" (UID: \"98c9c0b9-1c57-4874-ba89-996a3658c80c\") " pod="openstack/heat-dc7a-account-create-update-4c7gz"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.891773 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2qw\" (UniqueName: \"kubernetes.io/projected/f351bf98-a88f-4ff7-a97b-b86356235a46-kube-api-access-tl2qw\") pod \"heat-db-create-drxqc\" (UID: \"f351bf98-a88f-4ff7-a97b-b86356235a46\") " pod="openstack/heat-db-create-drxqc"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.892721 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f351bf98-a88f-4ff7-a97b-b86356235a46-operator-scripts\") pod \"heat-db-create-drxqc\" (UID: \"f351bf98-a88f-4ff7-a97b-b86356235a46\") " pod="openstack/heat-db-create-drxqc"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.913049 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2qw\" (UniqueName: \"kubernetes.io/projected/f351bf98-a88f-4ff7-a97b-b86356235a46-kube-api-access-tl2qw\") pod \"heat-db-create-drxqc\" (UID: \"f351bf98-a88f-4ff7-a97b-b86356235a46\") " pod="openstack/heat-db-create-drxqc"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.993754 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98c9c0b9-1c57-4874-ba89-996a3658c80c-operator-scripts\") pod \"heat-dc7a-account-create-update-4c7gz\" (UID: \"98c9c0b9-1c57-4874-ba89-996a3658c80c\") " pod="openstack/heat-dc7a-account-create-update-4c7gz"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.993812 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nqb8\" (UniqueName: \"kubernetes.io/projected/98c9c0b9-1c57-4874-ba89-996a3658c80c-kube-api-access-8nqb8\") pod \"heat-dc7a-account-create-update-4c7gz\" (UID: \"98c9c0b9-1c57-4874-ba89-996a3658c80c\") " pod="openstack/heat-dc7a-account-create-update-4c7gz"
Feb 26 17:24:51 crc kubenswrapper[5033]: I0226 17:24:51.994622 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98c9c0b9-1c57-4874-ba89-996a3658c80c-operator-scripts\") pod \"heat-dc7a-account-create-update-4c7gz\" (UID: \"98c9c0b9-1c57-4874-ba89-996a3658c80c\") " pod="openstack/heat-dc7a-account-create-update-4c7gz"
Feb 26 17:24:52 crc kubenswrapper[5033]: I0226 17:24:52.010451 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nqb8\" (UniqueName: \"kubernetes.io/projected/98c9c0b9-1c57-4874-ba89-996a3658c80c-kube-api-access-8nqb8\") pod \"heat-dc7a-account-create-update-4c7gz\" (UID: \"98c9c0b9-1c57-4874-ba89-996a3658c80c\") " pod="openstack/heat-dc7a-account-create-update-4c7gz"
Feb 26 17:24:52 crc kubenswrapper[5033]: I0226 17:24:52.028129 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-drxqc"
Feb 26 17:24:52 crc kubenswrapper[5033]: I0226 17:24:52.088640 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-dc7a-account-create-update-4c7gz"
Feb 26 17:24:52 crc kubenswrapper[5033]: I0226 17:24:52.512117 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-drxqc"]
Feb 26 17:24:52 crc kubenswrapper[5033]: I0226 17:24:52.651953 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-dc7a-account-create-update-4c7gz"]
Feb 26 17:24:53 crc kubenswrapper[5033]: I0226 17:24:53.530548 5033 generic.go:334] "Generic (PLEG): container finished" podID="f351bf98-a88f-4ff7-a97b-b86356235a46" containerID="bfdb44787466711c09ce1c8137e64ccf7c6c391fb7a693e6157ba80b43dbd2f0" exitCode=0
Feb 26 17:24:53 crc kubenswrapper[5033]: I0226 17:24:53.530997 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-drxqc" event={"ID":"f351bf98-a88f-4ff7-a97b-b86356235a46","Type":"ContainerDied","Data":"bfdb44787466711c09ce1c8137e64ccf7c6c391fb7a693e6157ba80b43dbd2f0"}
Feb 26 17:24:53 crc kubenswrapper[5033]: I0226 17:24:53.531080 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-drxqc" event={"ID":"f351bf98-a88f-4ff7-a97b-b86356235a46","Type":"ContainerStarted","Data":"cd1304903397d78812f3a68c8e74429316416374efc7f391600f7399e0018e64"}
Feb 26 17:24:53 crc kubenswrapper[5033]: I0226 17:24:53.532577 5033 generic.go:334] "Generic (PLEG): container finished" podID="98c9c0b9-1c57-4874-ba89-996a3658c80c" containerID="c8598d83ee14863f97512851fb3659ad563715e39421b2d9750209b7c042d408" exitCode=0
Feb 26 17:24:53 crc kubenswrapper[5033]: I0226 17:24:53.532610 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-dc7a-account-create-update-4c7gz" event={"ID":"98c9c0b9-1c57-4874-ba89-996a3658c80c","Type":"ContainerDied","Data":"c8598d83ee14863f97512851fb3659ad563715e39421b2d9750209b7c042d408"}
Feb 26 17:24:53 crc kubenswrapper[5033]: I0226 17:24:53.532631 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-dc7a-account-create-update-4c7gz" event={"ID":"98c9c0b9-1c57-4874-ba89-996a3658c80c","Type":"ContainerStarted","Data":"730d58c4e293e6a02dcf3dd9812a1f99aa5173350b7a18665b90ef984e4795df"}
Feb 26 17:24:54 crc kubenswrapper[5033]: I0226 17:24:54.987222 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-dc7a-account-create-update-4c7gz"
Feb 26 17:24:54 crc kubenswrapper[5033]: I0226 17:24:54.992849 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-drxqc"
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.100514 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f351bf98-a88f-4ff7-a97b-b86356235a46-operator-scripts\") pod \"f351bf98-a88f-4ff7-a97b-b86356235a46\" (UID: \"f351bf98-a88f-4ff7-a97b-b86356235a46\") "
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.100564 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98c9c0b9-1c57-4874-ba89-996a3658c80c-operator-scripts\") pod \"98c9c0b9-1c57-4874-ba89-996a3658c80c\" (UID: \"98c9c0b9-1c57-4874-ba89-996a3658c80c\") "
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.100798 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nqb8\" (UniqueName: \"kubernetes.io/projected/98c9c0b9-1c57-4874-ba89-996a3658c80c-kube-api-access-8nqb8\") pod \"98c9c0b9-1c57-4874-ba89-996a3658c80c\" (UID: \"98c9c0b9-1c57-4874-ba89-996a3658c80c\") "
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.100822 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl2qw\" (UniqueName: \"kubernetes.io/projected/f351bf98-a88f-4ff7-a97b-b86356235a46-kube-api-access-tl2qw\") pod \"f351bf98-a88f-4ff7-a97b-b86356235a46\" (UID: \"f351bf98-a88f-4ff7-a97b-b86356235a46\") "
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.102579 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f351bf98-a88f-4ff7-a97b-b86356235a46-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f351bf98-a88f-4ff7-a97b-b86356235a46" (UID: "f351bf98-a88f-4ff7-a97b-b86356235a46"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.103411 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c9c0b9-1c57-4874-ba89-996a3658c80c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "98c9c0b9-1c57-4874-ba89-996a3658c80c" (UID: "98c9c0b9-1c57-4874-ba89-996a3658c80c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.107337 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c9c0b9-1c57-4874-ba89-996a3658c80c-kube-api-access-8nqb8" (OuterVolumeSpecName: "kube-api-access-8nqb8") pod "98c9c0b9-1c57-4874-ba89-996a3658c80c" (UID: "98c9c0b9-1c57-4874-ba89-996a3658c80c"). InnerVolumeSpecName "kube-api-access-8nqb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.108455 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f351bf98-a88f-4ff7-a97b-b86356235a46-kube-api-access-tl2qw" (OuterVolumeSpecName: "kube-api-access-tl2qw") pod "f351bf98-a88f-4ff7-a97b-b86356235a46" (UID: "f351bf98-a88f-4ff7-a97b-b86356235a46"). InnerVolumeSpecName "kube-api-access-tl2qw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.203147 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f351bf98-a88f-4ff7-a97b-b86356235a46-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.203184 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98c9c0b9-1c57-4874-ba89-996a3658c80c-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.203193 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nqb8\" (UniqueName: \"kubernetes.io/projected/98c9c0b9-1c57-4874-ba89-996a3658c80c-kube-api-access-8nqb8\") on node \"crc\" DevicePath \"\""
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.203207 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl2qw\" (UniqueName: \"kubernetes.io/projected/f351bf98-a88f-4ff7-a97b-b86356235a46-kube-api-access-tl2qw\") on node \"crc\" DevicePath \"\""
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.554958 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-drxqc" event={"ID":"f351bf98-a88f-4ff7-a97b-b86356235a46","Type":"ContainerDied","Data":"cd1304903397d78812f3a68c8e74429316416374efc7f391600f7399e0018e64"}
Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.554987 5033 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-drxqc" Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.555009 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd1304903397d78812f3a68c8e74429316416374efc7f391600f7399e0018e64" Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.558393 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-dc7a-account-create-update-4c7gz" event={"ID":"98c9c0b9-1c57-4874-ba89-996a3658c80c","Type":"ContainerDied","Data":"730d58c4e293e6a02dcf3dd9812a1f99aa5173350b7a18665b90ef984e4795df"} Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.558425 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="730d58c4e293e6a02dcf3dd9812a1f99aa5173350b7a18665b90ef984e4795df" Feb 26 17:24:55 crc kubenswrapper[5033]: I0226 17:24:55.558473 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-dc7a-account-create-update-4c7gz" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.160565 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-9htgj"] Feb 26 17:24:57 crc kubenswrapper[5033]: E0226 17:24:57.161511 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c9c0b9-1c57-4874-ba89-996a3658c80c" containerName="mariadb-account-create-update" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.161530 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c9c0b9-1c57-4874-ba89-996a3658c80c" containerName="mariadb-account-create-update" Feb 26 17:24:57 crc kubenswrapper[5033]: E0226 17:24:57.161561 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f351bf98-a88f-4ff7-a97b-b86356235a46" containerName="mariadb-database-create" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.161572 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f351bf98-a88f-4ff7-a97b-b86356235a46" containerName="mariadb-database-create" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.161798 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f351bf98-a88f-4ff7-a97b-b86356235a46" containerName="mariadb-database-create" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.161819 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c9c0b9-1c57-4874-ba89-996a3658c80c" containerName="mariadb-account-create-update" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.162677 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9htgj" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.164538 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-7sbkv" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.165860 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.173495 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-9htgj"] Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.248140 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-config-data\") pod \"heat-db-sync-9htgj\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " pod="openstack/heat-db-sync-9htgj" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.248509 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k847\" (UniqueName: \"kubernetes.io/projected/8417b29d-8e5f-4f31-888c-668d457abeb3-kube-api-access-4k847\") pod \"heat-db-sync-9htgj\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " pod="openstack/heat-db-sync-9htgj" Feb 26 17:24:57 crc kubenswrapper[5033]: 
I0226 17:24:57.248833 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-combined-ca-bundle\") pod \"heat-db-sync-9htgj\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " pod="openstack/heat-db-sync-9htgj" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.350576 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-combined-ca-bundle\") pod \"heat-db-sync-9htgj\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " pod="openstack/heat-db-sync-9htgj" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.350714 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-config-data\") pod \"heat-db-sync-9htgj\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " pod="openstack/heat-db-sync-9htgj" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.350813 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k847\" (UniqueName: \"kubernetes.io/projected/8417b29d-8e5f-4f31-888c-668d457abeb3-kube-api-access-4k847\") pod \"heat-db-sync-9htgj\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " pod="openstack/heat-db-sync-9htgj" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.357811 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-config-data\") pod \"heat-db-sync-9htgj\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " pod="openstack/heat-db-sync-9htgj" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.364027 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-combined-ca-bundle\") pod \"heat-db-sync-9htgj\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " pod="openstack/heat-db-sync-9htgj" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.372925 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k847\" (UniqueName: \"kubernetes.io/projected/8417b29d-8e5f-4f31-888c-668d457abeb3-kube-api-access-4k847\") pod \"heat-db-sync-9htgj\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " pod="openstack/heat-db-sync-9htgj" Feb 26 17:24:57 crc kubenswrapper[5033]: I0226 17:24:57.495106 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9htgj" Feb 26 17:24:58 crc kubenswrapper[5033]: I0226 17:24:58.019993 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-9htgj"] Feb 26 17:24:58 crc kubenswrapper[5033]: I0226 17:24:58.597863 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9htgj" event={"ID":"8417b29d-8e5f-4f31-888c-668d457abeb3","Type":"ContainerStarted","Data":"2b0be47bcb80e4e65e056377c4eb73b638a500efab0d28e6b048f5ebffd3f1af"} Feb 26 17:25:00 crc kubenswrapper[5033]: I0226 17:25:00.517370 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7879589cbf-mbc9r" Feb 26 17:25:00 crc kubenswrapper[5033]: I0226 17:25:00.517707 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7879589cbf-mbc9r" Feb 26 17:25:08 crc kubenswrapper[5033]: I0226 17:25:08.698224 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9htgj" event={"ID":"8417b29d-8e5f-4f31-888c-668d457abeb3","Type":"ContainerStarted","Data":"ee51e15235be0713b31ed96af90b25441df3ccde8551dc6b5d364bc445733518"} Feb 26 17:25:08 crc kubenswrapper[5033]: I0226 17:25:08.733996 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-db-sync-9htgj" podStartSLOduration=1.367294525 podStartE2EDuration="11.733974117s" podCreationTimestamp="2026-02-26 17:24:57 +0000 UTC" firstStartedPulling="2026-02-26 17:24:58.020596857 +0000 UTC m=+6225.233195507" lastFinishedPulling="2026-02-26 17:25:08.387276449 +0000 UTC m=+6235.599875099" observedRunningTime="2026-02-26 17:25:08.716608073 +0000 UTC m=+6235.929206763" watchObservedRunningTime="2026-02-26 17:25:08.733974117 +0000 UTC m=+6235.946572787" Feb 26 17:25:10 crc kubenswrapper[5033]: I0226 17:25:10.722049 5033 generic.go:334] "Generic (PLEG): container finished" podID="8417b29d-8e5f-4f31-888c-668d457abeb3" containerID="ee51e15235be0713b31ed96af90b25441df3ccde8551dc6b5d364bc445733518" exitCode=0 Feb 26 17:25:10 crc kubenswrapper[5033]: I0226 17:25:10.722142 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9htgj" event={"ID":"8417b29d-8e5f-4f31-888c-668d457abeb3","Type":"ContainerDied","Data":"ee51e15235be0713b31ed96af90b25441df3ccde8551dc6b5d364bc445733518"} Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.090879 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-9htgj" Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.119966 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k847\" (UniqueName: \"kubernetes.io/projected/8417b29d-8e5f-4f31-888c-668d457abeb3-kube-api-access-4k847\") pod \"8417b29d-8e5f-4f31-888c-668d457abeb3\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.120306 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-combined-ca-bundle\") pod \"8417b29d-8e5f-4f31-888c-668d457abeb3\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.120422 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-config-data\") pod \"8417b29d-8e5f-4f31-888c-668d457abeb3\" (UID: \"8417b29d-8e5f-4f31-888c-668d457abeb3\") " Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.128818 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8417b29d-8e5f-4f31-888c-668d457abeb3-kube-api-access-4k847" (OuterVolumeSpecName: "kube-api-access-4k847") pod "8417b29d-8e5f-4f31-888c-668d457abeb3" (UID: "8417b29d-8e5f-4f31-888c-668d457abeb3"). InnerVolumeSpecName "kube-api-access-4k847". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.152513 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8417b29d-8e5f-4f31-888c-668d457abeb3" (UID: "8417b29d-8e5f-4f31-888c-668d457abeb3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.191530 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-config-data" (OuterVolumeSpecName: "config-data") pod "8417b29d-8e5f-4f31-888c-668d457abeb3" (UID: "8417b29d-8e5f-4f31-888c-668d457abeb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.222763 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.222790 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8417b29d-8e5f-4f31-888c-668d457abeb3-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.222799 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k847\" (UniqueName: \"kubernetes.io/projected/8417b29d-8e5f-4f31-888c-668d457abeb3-kube-api-access-4k847\") on node \"crc\" DevicePath \"\"" Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.383613 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7879589cbf-mbc9r" Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.745946 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9htgj" event={"ID":"8417b29d-8e5f-4f31-888c-668d457abeb3","Type":"ContainerDied","Data":"2b0be47bcb80e4e65e056377c4eb73b638a500efab0d28e6b048f5ebffd3f1af"} Feb 26 17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.745988 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b0be47bcb80e4e65e056377c4eb73b638a500efab0d28e6b048f5ebffd3f1af" Feb 26 
17:25:12 crc kubenswrapper[5033]: I0226 17:25:12.746031 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9htgj" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.072882 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2kp7z"] Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.087281 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6b14-account-create-update-h6ct9"] Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.097331 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2kp7z"] Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.102999 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6b14-account-create-update-h6ct9"] Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.646662 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44db2df6-733f-49c7-84d0-6183262c9cc6" path="/var/lib/kubelet/pods/44db2df6-733f-49c7-84d0-6183262c9cc6/volumes" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.647853 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd8d70ca-5b67-45fa-a174-e47c57b6cfb2" path="/var/lib/kubelet/pods/fd8d70ca-5b67-45fa-a174-e47c57b6cfb2/volumes" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.943482 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-586dbbcf89-xjwbx"] Feb 26 17:25:13 crc kubenswrapper[5033]: E0226 17:25:13.943888 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8417b29d-8e5f-4f31-888c-668d457abeb3" containerName="heat-db-sync" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.943904 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8417b29d-8e5f-4f31-888c-668d457abeb3" containerName="heat-db-sync" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.944104 5033 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="8417b29d-8e5f-4f31-888c-668d457abeb3" containerName="heat-db-sync" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.944765 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.946959 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.947823 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.948283 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-7sbkv" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.958092 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-config-data-custom\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.958430 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-config-data\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.958498 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdxfh\" (UniqueName: \"kubernetes.io/projected/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-kube-api-access-vdxfh\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " 
pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.958304 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-586dbbcf89-xjwbx"] Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.958827 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-combined-ca-bundle\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:13 crc kubenswrapper[5033]: I0226 17:25:13.981670 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7879589cbf-mbc9r" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.061230 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-combined-ca-bundle\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.061384 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-config-data-custom\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.061412 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-config-data\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:14 crc 
kubenswrapper[5033]: I0226 17:25:14.061472 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdxfh\" (UniqueName: \"kubernetes.io/projected/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-kube-api-access-vdxfh\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.066214 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-66c4795cc9-fn99t"] Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.067843 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-config-data-custom\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.068250 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-config-data\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.070667 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.076102 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-combined-ca-bundle\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.078793 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.099371 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdxfh\" (UniqueName: \"kubernetes.io/projected/b36ee056-5db4-4b73-91e8-1bbf4b1f36cd-kube-api-access-vdxfh\") pod \"heat-engine-586dbbcf89-xjwbx\" (UID: \"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd\") " pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.127666 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-66c4795cc9-fn99t"] Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.156414 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-96bfb8b67-v6fd5"] Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.156674 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-96bfb8b67-v6fd5" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon-log" containerID="cri-o://427bb00d5a1c68d2f6364752dbd87b71fb5801982cf7c04a2b7821826789f982" gracePeriod=30 Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.157045 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-96bfb8b67-v6fd5" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon" 
containerID="cri-o://70b5d377e96abb246e88d094ba7e2179c9850909132c7593a971931bc748e0f8" gracePeriod=30 Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.164029 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-combined-ca-bundle\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.164192 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc7jt\" (UniqueName: \"kubernetes.io/projected/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-kube-api-access-nc7jt\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.164289 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-config-data\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.164339 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-config-data-custom\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.186906 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-86f57c6d85-7fbtj"] Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.188387 5033 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.193595 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.199396 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-86f57c6d85-7fbtj"] Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.266503 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-combined-ca-bundle\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.266549 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-config-data\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.266584 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-config-data-custom\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.266610 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmgjz\" (UniqueName: \"kubernetes.io/projected/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-kube-api-access-vmgjz\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " 
pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.266631 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-config-data\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.266655 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-config-data-custom\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.266717 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-combined-ca-bundle\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.266770 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc7jt\" (UniqueName: \"kubernetes.io/projected/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-kube-api-access-nc7jt\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.268664 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.271240 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-config-data-custom\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.285234 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-combined-ca-bundle\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.284442 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-config-data\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.290222 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc7jt\" (UniqueName: \"kubernetes.io/projected/a07c60cb-13ba-4570-a36b-ffa95c55e0f0-kube-api-access-nc7jt\") pod \"heat-cfnapi-66c4795cc9-fn99t\" (UID: \"a07c60cb-13ba-4570-a36b-ffa95c55e0f0\") " pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.368335 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-combined-ca-bundle\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " pod="openstack/heat-api-86f57c6d85-7fbtj" 
Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.368406 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmgjz\" (UniqueName: \"kubernetes.io/projected/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-kube-api-access-vmgjz\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.368438 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-config-data\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.368498 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-config-data-custom\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.373982 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-combined-ca-bundle\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.381169 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-config-data\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.387515 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmgjz\" (UniqueName: \"kubernetes.io/projected/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-kube-api-access-vmgjz\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.387648 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3f0b850-f4c8-4fec-a439-dca70fa0c18e-config-data-custom\") pod \"heat-api-86f57c6d85-7fbtj\" (UID: \"e3f0b850-f4c8-4fec-a439-dca70fa0c18e\") " pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.526169 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.531983 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:14 crc kubenswrapper[5033]: I0226 17:25:14.762387 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-586dbbcf89-xjwbx"] Feb 26 17:25:14 crc kubenswrapper[5033]: W0226 17:25:14.764400 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb36ee056_5db4_4b73_91e8_1bbf4b1f36cd.slice/crio-fb8f412bf7afa9dc640b5959a8e801ab3b9f8e0c0c50cf14e7a8397f296dcc01 WatchSource:0}: Error finding container fb8f412bf7afa9dc640b5959a8e801ab3b9f8e0c0c50cf14e7a8397f296dcc01: Status 404 returned error can't find the container with id fb8f412bf7afa9dc640b5959a8e801ab3b9f8e0c0c50cf14e7a8397f296dcc01 Feb 26 17:25:15 crc kubenswrapper[5033]: I0226 17:25:15.018581 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-86f57c6d85-7fbtj"] Feb 26 17:25:15 crc kubenswrapper[5033]: W0226 17:25:15.027665 5033 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3f0b850_f4c8_4fec_a439_dca70fa0c18e.slice/crio-f9c1f9d6b891f0015f0a05fb7f930997550ddd3464cf89b896ca3aa58fa2eba7 WatchSource:0}: Error finding container f9c1f9d6b891f0015f0a05fb7f930997550ddd3464cf89b896ca3aa58fa2eba7: Status 404 returned error can't find the container with id f9c1f9d6b891f0015f0a05fb7f930997550ddd3464cf89b896ca3aa58fa2eba7 Feb 26 17:25:15 crc kubenswrapper[5033]: I0226 17:25:15.071881 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-66c4795cc9-fn99t"] Feb 26 17:25:15 crc kubenswrapper[5033]: W0226 17:25:15.082452 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda07c60cb_13ba_4570_a36b_ffa95c55e0f0.slice/crio-9ab4500c83d2483e0ffd26610610263651ef20db6444fcab6b13b51f1731033c WatchSource:0}: Error finding container 9ab4500c83d2483e0ffd26610610263651ef20db6444fcab6b13b51f1731033c: Status 404 returned error can't find the container with id 9ab4500c83d2483e0ffd26610610263651ef20db6444fcab6b13b51f1731033c Feb 26 17:25:15 crc kubenswrapper[5033]: I0226 17:25:15.778232 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-66c4795cc9-fn99t" event={"ID":"a07c60cb-13ba-4570-a36b-ffa95c55e0f0","Type":"ContainerStarted","Data":"9ab4500c83d2483e0ffd26610610263651ef20db6444fcab6b13b51f1731033c"} Feb 26 17:25:15 crc kubenswrapper[5033]: I0226 17:25:15.780150 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-86f57c6d85-7fbtj" event={"ID":"e3f0b850-f4c8-4fec-a439-dca70fa0c18e","Type":"ContainerStarted","Data":"f9c1f9d6b891f0015f0a05fb7f930997550ddd3464cf89b896ca3aa58fa2eba7"} Feb 26 17:25:15 crc kubenswrapper[5033]: I0226 17:25:15.782999 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-586dbbcf89-xjwbx" 
event={"ID":"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd","Type":"ContainerStarted","Data":"df5d2ea04f3df2126fdc95679a20730f383fa5c2378778cbc9a3974d44c61a19"} Feb 26 17:25:15 crc kubenswrapper[5033]: I0226 17:25:15.783035 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-586dbbcf89-xjwbx" event={"ID":"b36ee056-5db4-4b73-91e8-1bbf4b1f36cd","Type":"ContainerStarted","Data":"fb8f412bf7afa9dc640b5959a8e801ab3b9f8e0c0c50cf14e7a8397f296dcc01"} Feb 26 17:25:15 crc kubenswrapper[5033]: I0226 17:25:15.784246 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:15 crc kubenswrapper[5033]: I0226 17:25:15.811750 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-586dbbcf89-xjwbx" podStartSLOduration=2.811726456 podStartE2EDuration="2.811726456s" podCreationTimestamp="2026-02-26 17:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:25:15.804188001 +0000 UTC m=+6243.016786651" watchObservedRunningTime="2026-02-26 17:25:15.811726456 +0000 UTC m=+6243.024325106" Feb 26 17:25:17 crc kubenswrapper[5033]: I0226 17:25:17.289213 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-96bfb8b67-v6fd5" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.150:8080/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:39110->10.217.1.150:8080: read: connection reset by peer" Feb 26 17:25:17 crc kubenswrapper[5033]: I0226 17:25:17.806281 5033 generic.go:334] "Generic (PLEG): container finished" podID="54334857-01f6-40cc-afbe-6435181a45da" containerID="70b5d377e96abb246e88d094ba7e2179c9850909132c7593a971931bc748e0f8" exitCode=0 Feb 26 17:25:17 crc kubenswrapper[5033]: I0226 17:25:17.806360 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-96bfb8b67-v6fd5" event={"ID":"54334857-01f6-40cc-afbe-6435181a45da","Type":"ContainerDied","Data":"70b5d377e96abb246e88d094ba7e2179c9850909132c7593a971931bc748e0f8"} Feb 26 17:25:18 crc kubenswrapper[5033]: I0226 17:25:18.817589 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-66c4795cc9-fn99t" event={"ID":"a07c60cb-13ba-4570-a36b-ffa95c55e0f0","Type":"ContainerStarted","Data":"309912783de8502889d40d0c0c5e2b68ca2f0e1a6d42d2daad6a828510cadc00"} Feb 26 17:25:18 crc kubenswrapper[5033]: I0226 17:25:18.817988 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:18 crc kubenswrapper[5033]: I0226 17:25:18.819760 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-86f57c6d85-7fbtj" event={"ID":"e3f0b850-f4c8-4fec-a439-dca70fa0c18e","Type":"ContainerStarted","Data":"e8b8625ab7e3c207b350ee3f3142d2526666c30a00ec37adf6e1262e5f535b24"} Feb 26 17:25:18 crc kubenswrapper[5033]: I0226 17:25:18.819949 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:18 crc kubenswrapper[5033]: I0226 17:25:18.840214 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-66c4795cc9-fn99t" podStartSLOduration=1.6879943229999999 podStartE2EDuration="4.840191157s" podCreationTimestamp="2026-02-26 17:25:14 +0000 UTC" firstStartedPulling="2026-02-26 17:25:15.084603705 +0000 UTC m=+6242.297202355" lastFinishedPulling="2026-02-26 17:25:18.236800529 +0000 UTC m=+6245.449399189" observedRunningTime="2026-02-26 17:25:18.834009896 +0000 UTC m=+6246.046608536" watchObservedRunningTime="2026-02-26 17:25:18.840191157 +0000 UTC m=+6246.052789807" Feb 26 17:25:18 crc kubenswrapper[5033]: I0226 17:25:18.873907 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-86f57c6d85-7fbtj" 
podStartSLOduration=1.671265395 podStartE2EDuration="4.87388296s" podCreationTimestamp="2026-02-26 17:25:14 +0000 UTC" firstStartedPulling="2026-02-26 17:25:15.030416062 +0000 UTC m=+6242.243014712" lastFinishedPulling="2026-02-26 17:25:18.233033617 +0000 UTC m=+6245.445632277" observedRunningTime="2026-02-26 17:25:18.864965673 +0000 UTC m=+6246.077564353" watchObservedRunningTime="2026-02-26 17:25:18.87388296 +0000 UTC m=+6246.086481610" Feb 26 17:25:20 crc kubenswrapper[5033]: I0226 17:25:20.189349 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:25:20 crc kubenswrapper[5033]: I0226 17:25:20.189660 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:25:20 crc kubenswrapper[5033]: I0226 17:25:20.189715 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 17:25:20 crc kubenswrapper[5033]: I0226 17:25:20.190630 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33064e87c19129e8c72344901e1129ded177728fee58532d134814d33183f904"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 17:25:20 crc kubenswrapper[5033]: I0226 17:25:20.190700 5033 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://33064e87c19129e8c72344901e1129ded177728fee58532d134814d33183f904" gracePeriod=600 Feb 26 17:25:20 crc kubenswrapper[5033]: I0226 17:25:20.857659 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="33064e87c19129e8c72344901e1129ded177728fee58532d134814d33183f904" exitCode=0 Feb 26 17:25:20 crc kubenswrapper[5033]: I0226 17:25:20.857787 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"33064e87c19129e8c72344901e1129ded177728fee58532d134814d33183f904"} Feb 26 17:25:20 crc kubenswrapper[5033]: I0226 17:25:20.857973 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb"} Feb 26 17:25:20 crc kubenswrapper[5033]: I0226 17:25:20.858011 5033 scope.go:117] "RemoveContainer" containerID="a2097b77a00997efb922236307210b1dc37dd75d91b9c12777a1dae5764a9421" Feb 26 17:25:21 crc kubenswrapper[5033]: I0226 17:25:21.028974 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-qhct7"] Feb 26 17:25:21 crc kubenswrapper[5033]: I0226 17:25:21.036736 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-qhct7"] Feb 26 17:25:21 crc kubenswrapper[5033]: I0226 17:25:21.638875 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef21f49e-cefa-4d4d-9675-1c9d30a78aaa" path="/var/lib/kubelet/pods/ef21f49e-cefa-4d4d-9675-1c9d30a78aaa/volumes" Feb 26 17:25:24 crc kubenswrapper[5033]: I0226 17:25:24.336287 5033 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-96bfb8b67-v6fd5" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.150:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.150:8080: connect: connection refused" Feb 26 17:25:25 crc kubenswrapper[5033]: I0226 17:25:25.854540 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-86f57c6d85-7fbtj" Feb 26 17:25:25 crc kubenswrapper[5033]: I0226 17:25:25.978594 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-66c4795cc9-fn99t" Feb 26 17:25:34 crc kubenswrapper[5033]: I0226 17:25:34.306879 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-586dbbcf89-xjwbx" Feb 26 17:25:34 crc kubenswrapper[5033]: I0226 17:25:34.336321 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-96bfb8b67-v6fd5" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.150:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.150:8080: connect: connection refused" Feb 26 17:25:34 crc kubenswrapper[5033]: I0226 17:25:34.336450 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:25:36 crc kubenswrapper[5033]: I0226 17:25:36.474211 5033 scope.go:117] "RemoveContainer" containerID="eb7a13789208c19e4a36e6654620cc293b75bf65afdfd02bd5b8d8baa9f1b6ce" Feb 26 17:25:36 crc kubenswrapper[5033]: I0226 17:25:36.522142 5033 scope.go:117] "RemoveContainer" containerID="3f52b35551bda5b447877dc604a33e259f63fe02950a689c11f30c680f7a15cd" Feb 26 17:25:36 crc kubenswrapper[5033]: I0226 17:25:36.542716 5033 scope.go:117] "RemoveContainer" containerID="39feeb0fd932078974be5d4f740c0c0c77216f5cb656b7291e4874dd5f696365" Feb 26 17:25:44 crc 
kubenswrapper[5033]: I0226 17:25:44.336719 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-96bfb8b67-v6fd5" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.150:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.150:8080: connect: connection refused" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.085862 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6"] Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.088017 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.096746 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.098991 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6"] Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.118315 5033 generic.go:334] "Generic (PLEG): container finished" podID="54334857-01f6-40cc-afbe-6435181a45da" containerID="427bb00d5a1c68d2f6364752dbd87b71fb5801982cf7c04a2b7821826789f982" exitCode=137 Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.118367 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-96bfb8b67-v6fd5" event={"ID":"54334857-01f6-40cc-afbe-6435181a45da","Type":"ContainerDied","Data":"427bb00d5a1c68d2f6364752dbd87b71fb5801982cf7c04a2b7821826789f982"} Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.129112 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26pz4\" (UniqueName: 
\"kubernetes.io/projected/18742a04-1627-4e97-b9ae-81b4f5457d05-kube-api-access-26pz4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.129175 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.129292 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.230941 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26pz4\" (UniqueName: \"kubernetes.io/projected/18742a04-1627-4e97-b9ae-81b4f5457d05-kube-api-access-26pz4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.231032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-bundle\") pod 
\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.231089 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.231807 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.231923 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.252824 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26pz4\" (UniqueName: \"kubernetes.io/projected/18742a04-1627-4e97-b9ae-81b4f5457d05-kube-api-access-26pz4\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.336882 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-96bfb8b67-v6fd5" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.416798 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.433571 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-scripts\") pod \"54334857-01f6-40cc-afbe-6435181a45da\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.433646 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfft7\" (UniqueName: \"kubernetes.io/projected/54334857-01f6-40cc-afbe-6435181a45da-kube-api-access-kfft7\") pod \"54334857-01f6-40cc-afbe-6435181a45da\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.433715 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54334857-01f6-40cc-afbe-6435181a45da-horizon-secret-key\") pod \"54334857-01f6-40cc-afbe-6435181a45da\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.433841 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-config-data\") pod \"54334857-01f6-40cc-afbe-6435181a45da\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.435211 5033 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54334857-01f6-40cc-afbe-6435181a45da-logs\") pod \"54334857-01f6-40cc-afbe-6435181a45da\" (UID: \"54334857-01f6-40cc-afbe-6435181a45da\") " Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.436438 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54334857-01f6-40cc-afbe-6435181a45da-logs" (OuterVolumeSpecName: "logs") pod "54334857-01f6-40cc-afbe-6435181a45da" (UID: "54334857-01f6-40cc-afbe-6435181a45da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.438568 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54334857-01f6-40cc-afbe-6435181a45da-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "54334857-01f6-40cc-afbe-6435181a45da" (UID: "54334857-01f6-40cc-afbe-6435181a45da"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.440112 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54334857-01f6-40cc-afbe-6435181a45da-kube-api-access-kfft7" (OuterVolumeSpecName: "kube-api-access-kfft7") pod "54334857-01f6-40cc-afbe-6435181a45da" (UID: "54334857-01f6-40cc-afbe-6435181a45da"). InnerVolumeSpecName "kube-api-access-kfft7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.462000 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-config-data" (OuterVolumeSpecName: "config-data") pod "54334857-01f6-40cc-afbe-6435181a45da" (UID: "54334857-01f6-40cc-afbe-6435181a45da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.472052 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-scripts" (OuterVolumeSpecName: "scripts") pod "54334857-01f6-40cc-afbe-6435181a45da" (UID: "54334857-01f6-40cc-afbe-6435181a45da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.537630 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54334857-01f6-40cc-afbe-6435181a45da-logs\") on node \"crc\" DevicePath \"\"" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.537669 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.537681 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfft7\" (UniqueName: \"kubernetes.io/projected/54334857-01f6-40cc-afbe-6435181a45da-kube-api-access-kfft7\") on node \"crc\" DevicePath \"\"" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.537695 5033 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54334857-01f6-40cc-afbe-6435181a45da-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.537706 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54334857-01f6-40cc-afbe-6435181a45da-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:25:45 crc kubenswrapper[5033]: I0226 17:25:45.912473 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6"] Feb 26 17:25:45 crc kubenswrapper[5033]: W0226 17:25:45.914859 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18742a04_1627_4e97_b9ae_81b4f5457d05.slice/crio-548c6644d7234b55ab26d176bd8a704556ccdd48893958238d81088db8bc938e WatchSource:0}: Error finding container 548c6644d7234b55ab26d176bd8a704556ccdd48893958238d81088db8bc938e: Status 404 returned error can't find the container with id 548c6644d7234b55ab26d176bd8a704556ccdd48893958238d81088db8bc938e Feb 26 17:25:46 crc kubenswrapper[5033]: I0226 17:25:46.134880 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-96bfb8b67-v6fd5" event={"ID":"54334857-01f6-40cc-afbe-6435181a45da","Type":"ContainerDied","Data":"d4f9d47b58bc3a1e0bc94dd54bdacda20ac115019afe2fae888d457830dabcc5"} Feb 26 17:25:46 crc kubenswrapper[5033]: I0226 17:25:46.134930 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-96bfb8b67-v6fd5"
Feb 26 17:25:46 crc kubenswrapper[5033]: I0226 17:25:46.134964 5033 scope.go:117] "RemoveContainer" containerID="70b5d377e96abb246e88d094ba7e2179c9850909132c7593a971931bc748e0f8"
Feb 26 17:25:46 crc kubenswrapper[5033]: I0226 17:25:46.140039 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" event={"ID":"18742a04-1627-4e97-b9ae-81b4f5457d05","Type":"ContainerStarted","Data":"482f3935e9ba7be7cac0c84ee207f8f69bd1ebf305baaafe3dbb975755c19aa1"}
Feb 26 17:25:46 crc kubenswrapper[5033]: I0226 17:25:46.140078 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" event={"ID":"18742a04-1627-4e97-b9ae-81b4f5457d05","Type":"ContainerStarted","Data":"548c6644d7234b55ab26d176bd8a704556ccdd48893958238d81088db8bc938e"}
Feb 26 17:25:46 crc kubenswrapper[5033]: I0226 17:25:46.164135 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-96bfb8b67-v6fd5"]
Feb 26 17:25:46 crc kubenswrapper[5033]: I0226 17:25:46.177051 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-96bfb8b67-v6fd5"]
Feb 26 17:25:46 crc kubenswrapper[5033]: I0226 17:25:46.327870 5033 scope.go:117] "RemoveContainer" containerID="427bb00d5a1c68d2f6364752dbd87b71fb5801982cf7c04a2b7821826789f982"
Feb 26 17:25:47 crc kubenswrapper[5033]: I0226 17:25:47.150744 5033 generic.go:334] "Generic (PLEG): container finished" podID="18742a04-1627-4e97-b9ae-81b4f5457d05" containerID="482f3935e9ba7be7cac0c84ee207f8f69bd1ebf305baaafe3dbb975755c19aa1" exitCode=0
Feb 26 17:25:47 crc kubenswrapper[5033]: I0226 17:25:47.150816 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" event={"ID":"18742a04-1627-4e97-b9ae-81b4f5457d05","Type":"ContainerDied","Data":"482f3935e9ba7be7cac0c84ee207f8f69bd1ebf305baaafe3dbb975755c19aa1"}
Feb 26 17:25:47 crc kubenswrapper[5033]: I0226 17:25:47.834801 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54334857-01f6-40cc-afbe-6435181a45da" path="/var/lib/kubelet/pods/54334857-01f6-40cc-afbe-6435181a45da/volumes"
Feb 26 17:25:49 crc kubenswrapper[5033]: I0226 17:25:49.173749 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" event={"ID":"18742a04-1627-4e97-b9ae-81b4f5457d05","Type":"ContainerStarted","Data":"5d4a6e39384d7d4e4b9ef2f0e91eea603e15debf108588e36709fdcd1eb3b0ba"}
Feb 26 17:25:50 crc kubenswrapper[5033]: I0226 17:25:50.184844 5033 generic.go:334] "Generic (PLEG): container finished" podID="18742a04-1627-4e97-b9ae-81b4f5457d05" containerID="5d4a6e39384d7d4e4b9ef2f0e91eea603e15debf108588e36709fdcd1eb3b0ba" exitCode=0
Feb 26 17:25:50 crc kubenswrapper[5033]: I0226 17:25:50.184919 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" event={"ID":"18742a04-1627-4e97-b9ae-81b4f5457d05","Type":"ContainerDied","Data":"5d4a6e39384d7d4e4b9ef2f0e91eea603e15debf108588e36709fdcd1eb3b0ba"}
Feb 26 17:25:51 crc kubenswrapper[5033]: I0226 17:25:51.073741 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-7vnpp"]
Feb 26 17:25:51 crc kubenswrapper[5033]: I0226 17:25:51.085672 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-150c-account-create-update-zp46w"]
Feb 26 17:25:51 crc kubenswrapper[5033]: I0226 17:25:51.094838 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-7vnpp"]
Feb 26 17:25:51 crc kubenswrapper[5033]: I0226 17:25:51.104231 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-150c-account-create-update-zp46w"]
Feb 26 17:25:51 crc kubenswrapper[5033]: I0226 17:25:51.205513 5033 generic.go:334] "Generic (PLEG): container finished" podID="18742a04-1627-4e97-b9ae-81b4f5457d05" containerID="75b0af8636945266a27ba77544db398f433b9c0b351f9f799f24a9e01452ee70" exitCode=0
Feb 26 17:25:51 crc kubenswrapper[5033]: I0226 17:25:51.205581 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" event={"ID":"18742a04-1627-4e97-b9ae-81b4f5457d05","Type":"ContainerDied","Data":"75b0af8636945266a27ba77544db398f433b9c0b351f9f799f24a9e01452ee70"}
Feb 26 17:25:51 crc kubenswrapper[5033]: I0226 17:25:51.639233 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3a52682-a5b9-41c1-a122-c251a1a69f5a" path="/var/lib/kubelet/pods/c3a52682-a5b9-41c1-a122-c251a1a69f5a/volumes"
Feb 26 17:25:51 crc kubenswrapper[5033]: I0226 17:25:51.640462 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a47002-d1e8-4687-adc2-9a34643775e3" path="/var/lib/kubelet/pods/f7a47002-d1e8-4687-adc2-9a34643775e3/volumes"
Feb 26 17:25:52 crc kubenswrapper[5033]: I0226 17:25:52.678462 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6"
Feb 26 17:25:52 crc kubenswrapper[5033]: I0226 17:25:52.802119 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26pz4\" (UniqueName: \"kubernetes.io/projected/18742a04-1627-4e97-b9ae-81b4f5457d05-kube-api-access-26pz4\") pod \"18742a04-1627-4e97-b9ae-81b4f5457d05\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") "
Feb 26 17:25:52 crc kubenswrapper[5033]: I0226 17:25:52.802225 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-bundle\") pod \"18742a04-1627-4e97-b9ae-81b4f5457d05\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") "
Feb 26 17:25:52 crc kubenswrapper[5033]: I0226 17:25:52.802507 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-util\") pod \"18742a04-1627-4e97-b9ae-81b4f5457d05\" (UID: \"18742a04-1627-4e97-b9ae-81b4f5457d05\") "
Feb 26 17:25:52 crc kubenswrapper[5033]: I0226 17:25:52.804190 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-bundle" (OuterVolumeSpecName: "bundle") pod "18742a04-1627-4e97-b9ae-81b4f5457d05" (UID: "18742a04-1627-4e97-b9ae-81b4f5457d05"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:25:52 crc kubenswrapper[5033]: I0226 17:25:52.808144 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18742a04-1627-4e97-b9ae-81b4f5457d05-kube-api-access-26pz4" (OuterVolumeSpecName: "kube-api-access-26pz4") pod "18742a04-1627-4e97-b9ae-81b4f5457d05" (UID: "18742a04-1627-4e97-b9ae-81b4f5457d05"). InnerVolumeSpecName "kube-api-access-26pz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:25:52 crc kubenswrapper[5033]: I0226 17:25:52.809660 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-util" (OuterVolumeSpecName: "util") pod "18742a04-1627-4e97-b9ae-81b4f5457d05" (UID: "18742a04-1627-4e97-b9ae-81b4f5457d05"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:25:52 crc kubenswrapper[5033]: I0226 17:25:52.905207 5033 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-util\") on node \"crc\" DevicePath \"\""
Feb 26 17:25:52 crc kubenswrapper[5033]: I0226 17:25:52.905241 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26pz4\" (UniqueName: \"kubernetes.io/projected/18742a04-1627-4e97-b9ae-81b4f5457d05-kube-api-access-26pz4\") on node \"crc\" DevicePath \"\""
Feb 26 17:25:52 crc kubenswrapper[5033]: I0226 17:25:52.905282 5033 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18742a04-1627-4e97-b9ae-81b4f5457d05-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:25:53 crc kubenswrapper[5033]: I0226 17:25:53.228858 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6" event={"ID":"18742a04-1627-4e97-b9ae-81b4f5457d05","Type":"ContainerDied","Data":"548c6644d7234b55ab26d176bd8a704556ccdd48893958238d81088db8bc938e"}
Feb 26 17:25:53 crc kubenswrapper[5033]: I0226 17:25:53.228910 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="548c6644d7234b55ab26d176bd8a704556ccdd48893958238d81088db8bc938e"
Feb 26 17:25:53 crc kubenswrapper[5033]: I0226 17:25:53.228986 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6"
Feb 26 17:25:56 crc kubenswrapper[5033]: I0226 17:25:56.059472 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-rm685"]
Feb 26 17:25:56 crc kubenswrapper[5033]: I0226 17:25:56.071334 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-rm685"]
Feb 26 17:25:57 crc kubenswrapper[5033]: I0226 17:25:57.644893 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ede1906-e96c-43de-9108-5ab8febc23d6" path="/var/lib/kubelet/pods/2ede1906-e96c-43de-9108-5ab8febc23d6/volumes"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.177111 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535446-ldhxw"]
Feb 26 17:26:00 crc kubenswrapper[5033]: E0226 17:26:00.178403 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18742a04-1627-4e97-b9ae-81b4f5457d05" containerName="util"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.178425 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="18742a04-1627-4e97-b9ae-81b4f5457d05" containerName="util"
Feb 26 17:26:00 crc kubenswrapper[5033]: E0226 17:26:00.178469 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.178481 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon"
Feb 26 17:26:00 crc kubenswrapper[5033]: E0226 17:26:00.178500 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18742a04-1627-4e97-b9ae-81b4f5457d05" containerName="extract"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.178512 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="18742a04-1627-4e97-b9ae-81b4f5457d05" containerName="extract"
Feb 26 17:26:00 crc kubenswrapper[5033]: E0226 17:26:00.178529 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18742a04-1627-4e97-b9ae-81b4f5457d05" containerName="pull"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.178540 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="18742a04-1627-4e97-b9ae-81b4f5457d05" containerName="pull"
Feb 26 17:26:00 crc kubenswrapper[5033]: E0226 17:26:00.178564 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon-log"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.178577 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon-log"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.178878 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.178916 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="18742a04-1627-4e97-b9ae-81b4f5457d05" containerName="extract"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.178948 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="54334857-01f6-40cc-afbe-6435181a45da" containerName="horizon-log"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.180055 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535446-ldhxw"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.184227 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.187889 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.188132 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.195048 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535446-ldhxw"]
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.292433 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4747s\" (UniqueName: \"kubernetes.io/projected/595a35be-ffc3-4329-a937-c9f2c09e4612-kube-api-access-4747s\") pod \"auto-csr-approver-29535446-ldhxw\" (UID: \"595a35be-ffc3-4329-a937-c9f2c09e4612\") " pod="openshift-infra/auto-csr-approver-29535446-ldhxw"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.394370 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4747s\" (UniqueName: \"kubernetes.io/projected/595a35be-ffc3-4329-a937-c9f2c09e4612-kube-api-access-4747s\") pod \"auto-csr-approver-29535446-ldhxw\" (UID: \"595a35be-ffc3-4329-a937-c9f2c09e4612\") " pod="openshift-infra/auto-csr-approver-29535446-ldhxw"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.413670 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4747s\" (UniqueName: \"kubernetes.io/projected/595a35be-ffc3-4329-a937-c9f2c09e4612-kube-api-access-4747s\") pod \"auto-csr-approver-29535446-ldhxw\" (UID: \"595a35be-ffc3-4329-a937-c9f2c09e4612\") " pod="openshift-infra/auto-csr-approver-29535446-ldhxw"
Feb 26 17:26:00 crc kubenswrapper[5033]: I0226 17:26:00.506707 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535446-ldhxw"
Feb 26 17:26:01 crc kubenswrapper[5033]: I0226 17:26:01.346605 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535446-ldhxw"]
Feb 26 17:26:02 crc kubenswrapper[5033]: I0226 17:26:02.336507 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535446-ldhxw" event={"ID":"595a35be-ffc3-4329-a937-c9f2c09e4612","Type":"ContainerStarted","Data":"11ab0cb146b637e89bc6f35dced46b8f7405ca315e921c69ad56ec14f2f18420"}
Feb 26 17:26:03 crc kubenswrapper[5033]: I0226 17:26:03.348125 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535446-ldhxw" event={"ID":"595a35be-ffc3-4329-a937-c9f2c09e4612","Type":"ContainerStarted","Data":"f5e6ceeeb50b83c8b94a8f75de6be0aa7e5bcbc14494dc37e4d10ee8d97cefff"}
Feb 26 17:26:03 crc kubenswrapper[5033]: I0226 17:26:03.366042 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535446-ldhxw" podStartSLOduration=1.8924366510000001 podStartE2EDuration="3.366021503s" podCreationTimestamp="2026-02-26 17:26:00 +0000 UTC" firstStartedPulling="2026-02-26 17:26:01.342443407 +0000 UTC m=+6288.555042057" lastFinishedPulling="2026-02-26 17:26:02.816028259 +0000 UTC m=+6290.028626909" observedRunningTime="2026-02-26 17:26:03.35976507 +0000 UTC m=+6290.572363720" watchObservedRunningTime="2026-02-26 17:26:03.366021503 +0000 UTC m=+6290.578620153"
Feb 26 17:26:03 crc kubenswrapper[5033]: I0226 17:26:03.912802 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7nhbd"]
Feb 26 17:26:03 crc kubenswrapper[5033]: I0226 17:26:03.914683 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nhbd"
Feb 26 17:26:03 crc kubenswrapper[5033]: I0226 17:26:03.923794 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Feb 26 17:26:03 crc kubenswrapper[5033]: I0226 17:26:03.923971 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-jm52w"
Feb 26 17:26:03 crc kubenswrapper[5033]: I0226 17:26:03.924082 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Feb 26 17:26:03 crc kubenswrapper[5033]: I0226 17:26:03.942191 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7nhbd"]
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.000272 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg"]
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.004924 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.014190 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.014601 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-2z27v"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.038704 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg"]
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.091619 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5a524af-f845-4af4-9a29-1b1bd1d7a57e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg\" (UID: \"f5a524af-f845-4af4-9a29-1b1bd1d7a57e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.091746 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5a524af-f845-4af4-9a29-1b1bd1d7a57e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg\" (UID: \"f5a524af-f845-4af4-9a29-1b1bd1d7a57e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.091990 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdss\" (UniqueName: \"kubernetes.io/projected/675d6580-5f3e-46b9-a1e9-83127d5f1948-kube-api-access-2jdss\") pod \"obo-prometheus-operator-68bc856cb9-7nhbd\" (UID: \"675d6580-5f3e-46b9-a1e9-83127d5f1948\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nhbd"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.092368 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls"]
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.093967 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.123061 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls"]
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.163364 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-cjxmz"]
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.164998 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-cjxmz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.174860 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.175062 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-vpl4x"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.193917 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed6d813c-516f-4d4d-b20b-6534359e0220-observability-operator-tls\") pod \"observability-operator-59bdc8b94-cjxmz\" (UID: \"ed6d813c-516f-4d4d-b20b-6534359e0220\") " pod="openshift-operators/observability-operator-59bdc8b94-cjxmz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.194017 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdss\" (UniqueName: \"kubernetes.io/projected/675d6580-5f3e-46b9-a1e9-83127d5f1948-kube-api-access-2jdss\") pod \"obo-prometheus-operator-68bc856cb9-7nhbd\" (UID: \"675d6580-5f3e-46b9-a1e9-83127d5f1948\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nhbd"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.194141 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5a524af-f845-4af4-9a29-1b1bd1d7a57e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg\" (UID: \"f5a524af-f845-4af4-9a29-1b1bd1d7a57e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.194200 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a23d8887-760f-43dc-83f4-7d0eddea6db1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls\" (UID: \"a23d8887-760f-43dc-83f4-7d0eddea6db1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.194247 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5a524af-f845-4af4-9a29-1b1bd1d7a57e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg\" (UID: \"f5a524af-f845-4af4-9a29-1b1bd1d7a57e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.194307 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a23d8887-760f-43dc-83f4-7d0eddea6db1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls\" (UID: \"a23d8887-760f-43dc-83f4-7d0eddea6db1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.194333 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfx7z\" (UniqueName: \"kubernetes.io/projected/ed6d813c-516f-4d4d-b20b-6534359e0220-kube-api-access-lfx7z\") pod \"observability-operator-59bdc8b94-cjxmz\" (UID: \"ed6d813c-516f-4d4d-b20b-6534359e0220\") " pod="openshift-operators/observability-operator-59bdc8b94-cjxmz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.194482 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-cjxmz"]
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.203225 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5a524af-f845-4af4-9a29-1b1bd1d7a57e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg\" (UID: \"f5a524af-f845-4af4-9a29-1b1bd1d7a57e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.204357 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5a524af-f845-4af4-9a29-1b1bd1d7a57e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg\" (UID: \"f5a524af-f845-4af4-9a29-1b1bd1d7a57e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.219922 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdss\" (UniqueName: \"kubernetes.io/projected/675d6580-5f3e-46b9-a1e9-83127d5f1948-kube-api-access-2jdss\") pod \"obo-prometheus-operator-68bc856cb9-7nhbd\" (UID: \"675d6580-5f3e-46b9-a1e9-83127d5f1948\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nhbd"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.242214 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nhbd"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.296529 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a23d8887-760f-43dc-83f4-7d0eddea6db1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls\" (UID: \"a23d8887-760f-43dc-83f4-7d0eddea6db1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.296608 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a23d8887-760f-43dc-83f4-7d0eddea6db1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls\" (UID: \"a23d8887-760f-43dc-83f4-7d0eddea6db1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.296638 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfx7z\" (UniqueName: \"kubernetes.io/projected/ed6d813c-516f-4d4d-b20b-6534359e0220-kube-api-access-lfx7z\") pod \"observability-operator-59bdc8b94-cjxmz\" (UID: \"ed6d813c-516f-4d4d-b20b-6534359e0220\") " pod="openshift-operators/observability-operator-59bdc8b94-cjxmz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.296730 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed6d813c-516f-4d4d-b20b-6534359e0220-observability-operator-tls\") pod \"observability-operator-59bdc8b94-cjxmz\" (UID: \"ed6d813c-516f-4d4d-b20b-6534359e0220\") " pod="openshift-operators/observability-operator-59bdc8b94-cjxmz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.303463 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a23d8887-760f-43dc-83f4-7d0eddea6db1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls\" (UID: \"a23d8887-760f-43dc-83f4-7d0eddea6db1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.306670 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a23d8887-760f-43dc-83f4-7d0eddea6db1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls\" (UID: \"a23d8887-760f-43dc-83f4-7d0eddea6db1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.310042 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed6d813c-516f-4d4d-b20b-6534359e0220-observability-operator-tls\") pod \"observability-operator-59bdc8b94-cjxmz\" (UID: \"ed6d813c-516f-4d4d-b20b-6534359e0220\") " pod="openshift-operators/observability-operator-59bdc8b94-cjxmz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.345139 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfx7z\" (UniqueName: \"kubernetes.io/projected/ed6d813c-516f-4d4d-b20b-6534359e0220-kube-api-access-lfx7z\") pod \"observability-operator-59bdc8b94-cjxmz\" (UID: \"ed6d813c-516f-4d4d-b20b-6534359e0220\") " pod="openshift-operators/observability-operator-59bdc8b94-cjxmz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.355982 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-c9bgz"]
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.362119 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-c9bgz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.363672 5033 generic.go:334] "Generic (PLEG): container finished" podID="595a35be-ffc3-4329-a937-c9f2c09e4612" containerID="f5e6ceeeb50b83c8b94a8f75de6be0aa7e5bcbc14494dc37e4d10ee8d97cefff" exitCode=0
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.363722 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535446-ldhxw" event={"ID":"595a35be-ffc3-4329-a937-c9f2c09e4612","Type":"ContainerDied","Data":"f5e6ceeeb50b83c8b94a8f75de6be0aa7e5bcbc14494dc37e4d10ee8d97cefff"}
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.369604 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-jlx2l"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.370585 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.375647 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-c9bgz"]
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.398734 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff5f88ba-1664-44b4-b1da-84fe25772c98-openshift-service-ca\") pod \"perses-operator-5bf474d74f-c9bgz\" (UID: \"ff5f88ba-1664-44b4-b1da-84fe25772c98\") " pod="openshift-operators/perses-operator-5bf474d74f-c9bgz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.398790 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwpf5\" (UniqueName: \"kubernetes.io/projected/ff5f88ba-1664-44b4-b1da-84fe25772c98-kube-api-access-cwpf5\") pod \"perses-operator-5bf474d74f-c9bgz\" (UID: \"ff5f88ba-1664-44b4-b1da-84fe25772c98\") " pod="openshift-operators/perses-operator-5bf474d74f-c9bgz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.437196 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.501348 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff5f88ba-1664-44b4-b1da-84fe25772c98-openshift-service-ca\") pod \"perses-operator-5bf474d74f-c9bgz\" (UID: \"ff5f88ba-1664-44b4-b1da-84fe25772c98\") " pod="openshift-operators/perses-operator-5bf474d74f-c9bgz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.501892 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwpf5\" (UniqueName: \"kubernetes.io/projected/ff5f88ba-1664-44b4-b1da-84fe25772c98-kube-api-access-cwpf5\") pod \"perses-operator-5bf474d74f-c9bgz\" (UID: \"ff5f88ba-1664-44b4-b1da-84fe25772c98\") " pod="openshift-operators/perses-operator-5bf474d74f-c9bgz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.503286 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff5f88ba-1664-44b4-b1da-84fe25772c98-openshift-service-ca\") pod \"perses-operator-5bf474d74f-c9bgz\" (UID: \"ff5f88ba-1664-44b4-b1da-84fe25772c98\") " pod="openshift-operators/perses-operator-5bf474d74f-c9bgz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.505836 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-cjxmz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.519709 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwpf5\" (UniqueName: \"kubernetes.io/projected/ff5f88ba-1664-44b4-b1da-84fe25772c98-kube-api-access-cwpf5\") pod \"perses-operator-5bf474d74f-c9bgz\" (UID: \"ff5f88ba-1664-44b4-b1da-84fe25772c98\") " pod="openshift-operators/perses-operator-5bf474d74f-c9bgz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.712219 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-c9bgz"
Feb 26 17:26:04 crc kubenswrapper[5033]: I0226 17:26:04.914073 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7nhbd"]
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.050702 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg"]
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.160420 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-cjxmz"]
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.172655 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls"]
Feb 26 17:26:05 crc kubenswrapper[5033]: W0226 17:26:05.174628 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded6d813c_516f_4d4d_b20b_6534359e0220.slice/crio-35ac2145128ef36e6dc98041883ff98f21f495e125a00ba5b9a443aa920db673 WatchSource:0}: Error finding container 35ac2145128ef36e6dc98041883ff98f21f495e125a00ba5b9a443aa920db673: Status 404 returned error can't find the container with id 35ac2145128ef36e6dc98041883ff98f21f495e125a00ba5b9a443aa920db673
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.349101 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-c9bgz"]
Feb 26 17:26:05 crc kubenswrapper[5033]: W0226 17:26:05.351159 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff5f88ba_1664_44b4_b1da_84fe25772c98.slice/crio-eedacdbe3f1679e6184f9cd92e980cba7a274c27c4a07ffd275e31b37d535ed1 WatchSource:0}: Error finding container eedacdbe3f1679e6184f9cd92e980cba7a274c27c4a07ffd275e31b37d535ed1: Status 404 returned error can't find the container with id eedacdbe3f1679e6184f9cd92e980cba7a274c27c4a07ffd275e31b37d535ed1
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.375630 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nhbd" event={"ID":"675d6580-5f3e-46b9-a1e9-83127d5f1948","Type":"ContainerStarted","Data":"8a33e4f0ab5f29a9fa09d13e5ce62bd1ba2885dc842a53353d566f0cc22f5b17"}
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.377637 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-c9bgz" event={"ID":"ff5f88ba-1664-44b4-b1da-84fe25772c98","Type":"ContainerStarted","Data":"eedacdbe3f1679e6184f9cd92e980cba7a274c27c4a07ffd275e31b37d535ed1"}
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.379277 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg" event={"ID":"f5a524af-f845-4af4-9a29-1b1bd1d7a57e","Type":"ContainerStarted","Data":"56e7970d748a80738434740bf79d0ea9e0192b3b0491eed08c974d3bc6c1a65e"}
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.380634 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls" event={"ID":"a23d8887-760f-43dc-83f4-7d0eddea6db1","Type":"ContainerStarted","Data":"df0a4f4ec22b69934a704e459ea5c58204675b921f0590bf1bbabd60066cf3ac"}
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.382183 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-cjxmz" event={"ID":"ed6d813c-516f-4d4d-b20b-6534359e0220","Type":"ContainerStarted","Data":"35ac2145128ef36e6dc98041883ff98f21f495e125a00ba5b9a443aa920db673"}
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.792014 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535446-ldhxw"
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.932380 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4747s\" (UniqueName: \"kubernetes.io/projected/595a35be-ffc3-4329-a937-c9f2c09e4612-kube-api-access-4747s\") pod \"595a35be-ffc3-4329-a937-c9f2c09e4612\" (UID: \"595a35be-ffc3-4329-a937-c9f2c09e4612\") "
Feb 26 17:26:05 crc kubenswrapper[5033]: I0226 17:26:05.947513 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595a35be-ffc3-4329-a937-c9f2c09e4612-kube-api-access-4747s" (OuterVolumeSpecName: "kube-api-access-4747s") pod "595a35be-ffc3-4329-a937-c9f2c09e4612" (UID: "595a35be-ffc3-4329-a937-c9f2c09e4612"). InnerVolumeSpecName "kube-api-access-4747s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:26:06 crc kubenswrapper[5033]: I0226 17:26:06.035230 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4747s\" (UniqueName: \"kubernetes.io/projected/595a35be-ffc3-4329-a937-c9f2c09e4612-kube-api-access-4747s\") on node \"crc\" DevicePath \"\""
Feb 26 17:26:06 crc kubenswrapper[5033]: I0226 17:26:06.417420 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535446-ldhxw" event={"ID":"595a35be-ffc3-4329-a937-c9f2c09e4612","Type":"ContainerDied","Data":"11ab0cb146b637e89bc6f35dced46b8f7405ca315e921c69ad56ec14f2f18420"}
Feb 26 17:26:06 crc kubenswrapper[5033]: I0226 17:26:06.417662 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11ab0cb146b637e89bc6f35dced46b8f7405ca315e921c69ad56ec14f2f18420"
Feb 26 17:26:06 crc kubenswrapper[5033]: I0226 17:26:06.417497 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535446-ldhxw"
Feb 26 17:26:06 crc kubenswrapper[5033]: I0226 17:26:06.446941 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535440-q7pvf"]
Feb 26 17:26:06 crc kubenswrapper[5033]: I0226 17:26:06.456064 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535440-q7pvf"]
Feb 26 17:26:07 crc kubenswrapper[5033]: I0226 17:26:07.661006 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627dac39-736b-4c35-b933-2ea5be748f72" path="/var/lib/kubelet/pods/627dac39-736b-4c35-b933-2ea5be748f72/volumes"
Feb 26 17:26:10 crc kubenswrapper[5033]: I0226 17:26:10.903061 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hqg4j"]
Feb 26 17:26:10 crc kubenswrapper[5033]: E0226 17:26:10.904021 5033 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="595a35be-ffc3-4329-a937-c9f2c09e4612" containerName="oc" Feb 26 17:26:10 crc kubenswrapper[5033]: I0226 17:26:10.904037 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="595a35be-ffc3-4329-a937-c9f2c09e4612" containerName="oc" Feb 26 17:26:10 crc kubenswrapper[5033]: I0226 17:26:10.904296 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="595a35be-ffc3-4329-a937-c9f2c09e4612" containerName="oc" Feb 26 17:26:10 crc kubenswrapper[5033]: I0226 17:26:10.905754 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:10 crc kubenswrapper[5033]: I0226 17:26:10.914653 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqg4j"] Feb 26 17:26:11 crc kubenswrapper[5033]: I0226 17:26:11.017217 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mszjx\" (UniqueName: \"kubernetes.io/projected/871c0be3-f427-4567-bb6e-865f76ffa536-kube-api-access-mszjx\") pod \"redhat-marketplace-hqg4j\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:11 crc kubenswrapper[5033]: I0226 17:26:11.017600 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-catalog-content\") pod \"redhat-marketplace-hqg4j\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:11 crc kubenswrapper[5033]: I0226 17:26:11.018088 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-utilities\") pod \"redhat-marketplace-hqg4j\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " 
pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:11 crc kubenswrapper[5033]: I0226 17:26:11.120209 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-utilities\") pod \"redhat-marketplace-hqg4j\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:11 crc kubenswrapper[5033]: I0226 17:26:11.120306 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mszjx\" (UniqueName: \"kubernetes.io/projected/871c0be3-f427-4567-bb6e-865f76ffa536-kube-api-access-mszjx\") pod \"redhat-marketplace-hqg4j\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:11 crc kubenswrapper[5033]: I0226 17:26:11.120343 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-catalog-content\") pod \"redhat-marketplace-hqg4j\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:11 crc kubenswrapper[5033]: I0226 17:26:11.120938 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-utilities\") pod \"redhat-marketplace-hqg4j\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:11 crc kubenswrapper[5033]: I0226 17:26:11.120989 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-catalog-content\") pod \"redhat-marketplace-hqg4j\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " pod="openshift-marketplace/redhat-marketplace-hqg4j" 
Feb 26 17:26:11 crc kubenswrapper[5033]: I0226 17:26:11.140603 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mszjx\" (UniqueName: \"kubernetes.io/projected/871c0be3-f427-4567-bb6e-865f76ffa536-kube-api-access-mszjx\") pod \"redhat-marketplace-hqg4j\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:11 crc kubenswrapper[5033]: I0226 17:26:11.229870 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:18 crc kubenswrapper[5033]: I0226 17:26:18.971057 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqg4j"] Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.642206 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls" event={"ID":"a23d8887-760f-43dc-83f4-7d0eddea6db1","Type":"ContainerStarted","Data":"6617a6a1dd8f18160639e5e1914242ce8ead1efe576ae8be7fcf9a0ac58a66df"} Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.644137 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg" event={"ID":"f5a524af-f845-4af4-9a29-1b1bd1d7a57e","Type":"ContainerStarted","Data":"6d6d8c6704d1bfb1571e3112a913cd2baa8d84a5a7dc3c5e0af7a4cc3195d72e"} Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.645785 5033 generic.go:334] "Generic (PLEG): container finished" podID="871c0be3-f427-4567-bb6e-865f76ffa536" containerID="ecc0a3c3e6f573c2696b838e8dd731a88f8c18c18e3082915cf40e1bca185e79" exitCode=0 Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.645834 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqg4j" 
event={"ID":"871c0be3-f427-4567-bb6e-865f76ffa536","Type":"ContainerDied","Data":"ecc0a3c3e6f573c2696b838e8dd731a88f8c18c18e3082915cf40e1bca185e79"} Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.645882 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqg4j" event={"ID":"871c0be3-f427-4567-bb6e-865f76ffa536","Type":"ContainerStarted","Data":"c1740325e4b224ade363f276a6dc043c89be3465eade6b2743a17e09fdc88387"} Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.647605 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-cjxmz" event={"ID":"ed6d813c-516f-4d4d-b20b-6534359e0220","Type":"ContainerStarted","Data":"c9a5ee03b8a2b9ec599d882e5e4f546a4786d30aea5d40d55731a8bed9a6c8ba"} Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.647798 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-cjxmz" Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.649784 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nhbd" event={"ID":"675d6580-5f3e-46b9-a1e9-83127d5f1948","Type":"ContainerStarted","Data":"a447958f12cdfbe71cffcb5964d83550e3689d01f3bcf076a1d16c35c48f9c09"} Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.651520 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-cjxmz" Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.653780 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-c9bgz" event={"ID":"ff5f88ba-1664-44b4-b1da-84fe25772c98","Type":"ContainerStarted","Data":"10b99511ad3ccc168b20e74a449cae720675fc5667043540964cccfb23856d86"} Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.653943 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-operators/perses-operator-5bf474d74f-c9bgz" Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.667194 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls" podStartSLOduration=3.592809908 podStartE2EDuration="16.667171027s" podCreationTimestamp="2026-02-26 17:26:03 +0000 UTC" firstStartedPulling="2026-02-26 17:26:05.182745627 +0000 UTC m=+6292.395344287" lastFinishedPulling="2026-02-26 17:26:18.257106756 +0000 UTC m=+6305.469705406" observedRunningTime="2026-02-26 17:26:19.660029293 +0000 UTC m=+6306.872627943" watchObservedRunningTime="2026-02-26 17:26:19.667171027 +0000 UTC m=+6306.879769687" Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.703423 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-c9bgz" podStartSLOduration=2.815290902 podStartE2EDuration="15.703397512s" podCreationTimestamp="2026-02-26 17:26:04 +0000 UTC" firstStartedPulling="2026-02-26 17:26:05.354199715 +0000 UTC m=+6292.566798375" lastFinishedPulling="2026-02-26 17:26:18.242306335 +0000 UTC m=+6305.454904985" observedRunningTime="2026-02-26 17:26:19.702629524 +0000 UTC m=+6306.915228194" watchObservedRunningTime="2026-02-26 17:26:19.703397512 +0000 UTC m=+6306.915996182" Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.743833 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg" podStartSLOduration=3.594293174 podStartE2EDuration="16.743809039s" podCreationTimestamp="2026-02-26 17:26:03 +0000 UTC" firstStartedPulling="2026-02-26 17:26:05.106577187 +0000 UTC m=+6292.319175837" lastFinishedPulling="2026-02-26 17:26:18.256093042 +0000 UTC m=+6305.468691702" observedRunningTime="2026-02-26 17:26:19.726955128 +0000 UTC m=+6306.939553778" watchObservedRunningTime="2026-02-26 
17:26:19.743809039 +0000 UTC m=+6306.956407699" Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.757375 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-cjxmz" podStartSLOduration=2.665564046 podStartE2EDuration="15.75735175s" podCreationTimestamp="2026-02-26 17:26:04 +0000 UTC" firstStartedPulling="2026-02-26 17:26:05.177574861 +0000 UTC m=+6292.390173511" lastFinishedPulling="2026-02-26 17:26:18.269362545 +0000 UTC m=+6305.481961215" observedRunningTime="2026-02-26 17:26:19.75323365 +0000 UTC m=+6306.965832310" watchObservedRunningTime="2026-02-26 17:26:19.75735175 +0000 UTC m=+6306.969950410" Feb 26 17:26:19 crc kubenswrapper[5033]: I0226 17:26:19.827233 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7nhbd" podStartSLOduration=3.496974719 podStartE2EDuration="16.827203797s" podCreationTimestamp="2026-02-26 17:26:03 +0000 UTC" firstStartedPulling="2026-02-26 17:26:04.924392227 +0000 UTC m=+6292.136990877" lastFinishedPulling="2026-02-26 17:26:18.254621305 +0000 UTC m=+6305.467219955" observedRunningTime="2026-02-26 17:26:19.813650305 +0000 UTC m=+6307.026248965" watchObservedRunningTime="2026-02-26 17:26:19.827203797 +0000 UTC m=+6307.039802467" Feb 26 17:26:20 crc kubenswrapper[5033]: I0226 17:26:20.672191 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqg4j" event={"ID":"871c0be3-f427-4567-bb6e-865f76ffa536","Type":"ContainerStarted","Data":"5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057"} Feb 26 17:26:21 crc kubenswrapper[5033]: I0226 17:26:21.695638 5033 generic.go:334] "Generic (PLEG): container finished" podID="871c0be3-f427-4567-bb6e-865f76ffa536" containerID="5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057" exitCode=0 Feb 26 17:26:21 crc kubenswrapper[5033]: I0226 17:26:21.697920 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqg4j" event={"ID":"871c0be3-f427-4567-bb6e-865f76ffa536","Type":"ContainerDied","Data":"5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057"} Feb 26 17:26:23 crc kubenswrapper[5033]: I0226 17:26:23.716734 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqg4j" event={"ID":"871c0be3-f427-4567-bb6e-865f76ffa536","Type":"ContainerStarted","Data":"e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3"} Feb 26 17:26:23 crc kubenswrapper[5033]: I0226 17:26:23.737717 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hqg4j" podStartSLOduration=11.286949482 podStartE2EDuration="13.737694043s" podCreationTimestamp="2026-02-26 17:26:10 +0000 UTC" firstStartedPulling="2026-02-26 17:26:19.647455066 +0000 UTC m=+6306.860053736" lastFinishedPulling="2026-02-26 17:26:22.098199647 +0000 UTC m=+6309.310798297" observedRunningTime="2026-02-26 17:26:23.736653397 +0000 UTC m=+6310.949252047" watchObservedRunningTime="2026-02-26 17:26:23.737694043 +0000 UTC m=+6310.950292693" Feb 26 17:26:24 crc kubenswrapper[5033]: I0226 17:26:24.717960 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-c9bgz" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.352744 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.353345 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="01250dc7-efa2-45f2-87d6-5d1a865369c2" containerName="openstackclient" containerID="cri-o://8d0b492ffb7fcea538d8e31052eeb122a7fee3cbee1a0ee552cf46261b91ab82" gracePeriod=2 Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.361730 5033 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.416662 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 26 17:26:27 crc kubenswrapper[5033]: E0226 17:26:27.417353 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01250dc7-efa2-45f2-87d6-5d1a865369c2" containerName="openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.417385 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="01250dc7-efa2-45f2-87d6-5d1a865369c2" containerName="openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.417660 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="01250dc7-efa2-45f2-87d6-5d1a865369c2" containerName="openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.424853 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.435911 5033 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="01250dc7-efa2-45f2-87d6-5d1a865369c2" podUID="a0a1aafd-6624-43ec-b133-425ec8e6309e" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.473906 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.501928 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0a1aafd-6624-43ec-b133-425ec8e6309e-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0a1aafd-6624-43ec-b133-425ec8e6309e\") " pod="openstack/openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.502352 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/a0a1aafd-6624-43ec-b133-425ec8e6309e-openstack-config\") pod \"openstackclient\" (UID: \"a0a1aafd-6624-43ec-b133-425ec8e6309e\") " pod="openstack/openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.502384 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dmv5\" (UniqueName: \"kubernetes.io/projected/a0a1aafd-6624-43ec-b133-425ec8e6309e-kube-api-access-4dmv5\") pod \"openstackclient\" (UID: \"a0a1aafd-6624-43ec-b133-425ec8e6309e\") " pod="openstack/openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.593553 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.594834 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.601041 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wb69w" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.605685 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0a1aafd-6624-43ec-b133-425ec8e6309e-openstack-config\") pod \"openstackclient\" (UID: \"a0a1aafd-6624-43ec-b133-425ec8e6309e\") " pod="openstack/openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.605861 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dmv5\" (UniqueName: \"kubernetes.io/projected/a0a1aafd-6624-43ec-b133-425ec8e6309e-kube-api-access-4dmv5\") pod \"openstackclient\" (UID: \"a0a1aafd-6624-43ec-b133-425ec8e6309e\") " pod="openstack/openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.606226 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rf5dv\" (UniqueName: \"kubernetes.io/projected/9a49e383-e3d9-4a97-8f52-df90e3f4bbc9-kube-api-access-rf5dv\") pod \"kube-state-metrics-0\" (UID: \"9a49e383-e3d9-4a97-8f52-df90e3f4bbc9\") " pod="openstack/kube-state-metrics-0" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.606571 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0a1aafd-6624-43ec-b133-425ec8e6309e-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0a1aafd-6624-43ec-b133-425ec8e6309e\") " pod="openstack/openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.609057 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0a1aafd-6624-43ec-b133-425ec8e6309e-openstack-config\") pod \"openstackclient\" (UID: \"a0a1aafd-6624-43ec-b133-425ec8e6309e\") " pod="openstack/openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.620723 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0a1aafd-6624-43ec-b133-425ec8e6309e-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0a1aafd-6624-43ec-b133-425ec8e6309e\") " pod="openstack/openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.631664 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.652120 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dmv5\" (UniqueName: \"kubernetes.io/projected/a0a1aafd-6624-43ec-b133-425ec8e6309e-kube-api-access-4dmv5\") pod \"openstackclient\" (UID: \"a0a1aafd-6624-43ec-b133-425ec8e6309e\") " pod="openstack/openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.717299 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rf5dv\" (UniqueName: \"kubernetes.io/projected/9a49e383-e3d9-4a97-8f52-df90e3f4bbc9-kube-api-access-rf5dv\") pod \"kube-state-metrics-0\" (UID: \"9a49e383-e3d9-4a97-8f52-df90e3f4bbc9\") " pod="openstack/kube-state-metrics-0" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.746928 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf5dv\" (UniqueName: \"kubernetes.io/projected/9a49e383-e3d9-4a97-8f52-df90e3f4bbc9-kube-api-access-rf5dv\") pod \"kube-state-metrics-0\" (UID: \"9a49e383-e3d9-4a97-8f52-df90e3f4bbc9\") " pod="openstack/kube-state-metrics-0" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.767707 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 17:26:27 crc kubenswrapper[5033]: I0226 17:26:27.923551 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.296059 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.298697 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.303771 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-l6dgk" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.304034 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.304188 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.304401 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.304881 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.311470 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.448626 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp2fp\" (UniqueName: \"kubernetes.io/projected/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-kube-api-access-tp2fp\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.449794 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 
17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.449897 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.450031 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.450195 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.450334 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.450455 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.552455 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.552541 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.552575 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.552617 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.552683 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp2fp\" (UniqueName: \"kubernetes.io/projected/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-kube-api-access-tp2fp\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.552700 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.552719 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.553646 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.564219 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.576669 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.577453 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.579907 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.591373 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.591996 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp2fp\" (UniqueName: \"kubernetes.io/projected/f9343e5c-17b8-4f9f-bd66-071d27c0b43a-kube-api-access-tp2fp\") pod \"alertmanager-metric-storage-0\" (UID: \"f9343e5c-17b8-4f9f-bd66-071d27c0b43a\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.643004 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.770806 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.901534 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 17:26:28 crc kubenswrapper[5033]: I0226 17:26:28.989563 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.010967 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.030548 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.030782 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.030886 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.031491 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.031535 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.031603 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.031704 5033 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.031865 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-lxr2g" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.040751 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.176581 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/07c079fd-cffe-457d-9416-4cb765fd19de-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.176631 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/07c079fd-cffe-457d-9416-4cb765fd19de-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.176674 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/07c079fd-cffe-457d-9416-4cb765fd19de-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.176697 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/07c079fd-cffe-457d-9416-4cb765fd19de-prometheus-metric-storage-rulefiles-1\") 
pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.176740 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bf1d7896-6402-4f6a-a676-34f9393d76bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bf1d7896-6402-4f6a-a676-34f9393d76bf\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.176846 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/07c079fd-cffe-457d-9416-4cb765fd19de-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.176898 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6596\" (UniqueName: \"kubernetes.io/projected/07c079fd-cffe-457d-9416-4cb765fd19de-kube-api-access-q6596\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.176925 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/07c079fd-cffe-457d-9416-4cb765fd19de-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.176943 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07c079fd-cffe-457d-9416-4cb765fd19de-config\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.176959 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/07c079fd-cffe-457d-9416-4cb765fd19de-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.284102 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/07c079fd-cffe-457d-9416-4cb765fd19de-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.284470 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07c079fd-cffe-457d-9416-4cb765fd19de-config\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.284497 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/07c079fd-cffe-457d-9416-4cb765fd19de-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.284533 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/07c079fd-cffe-457d-9416-4cb765fd19de-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.284565 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/07c079fd-cffe-457d-9416-4cb765fd19de-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.284618 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/07c079fd-cffe-457d-9416-4cb765fd19de-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.284641 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/07c079fd-cffe-457d-9416-4cb765fd19de-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.284673 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bf1d7896-6402-4f6a-a676-34f9393d76bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bf1d7896-6402-4f6a-a676-34f9393d76bf\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.284807 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/07c079fd-cffe-457d-9416-4cb765fd19de-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.284877 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6596\" (UniqueName: \"kubernetes.io/projected/07c079fd-cffe-457d-9416-4cb765fd19de-kube-api-access-q6596\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.285944 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/07c079fd-cffe-457d-9416-4cb765fd19de-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.296504 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/07c079fd-cffe-457d-9416-4cb765fd19de-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.298201 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/07c079fd-cffe-457d-9416-4cb765fd19de-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " 
pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.300394 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/07c079fd-cffe-457d-9416-4cb765fd19de-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.314561 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/07c079fd-cffe-457d-9416-4cb765fd19de-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.326008 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6596\" (UniqueName: \"kubernetes.io/projected/07c079fd-cffe-457d-9416-4cb765fd19de-kube-api-access-q6596\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.328834 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/07c079fd-cffe-457d-9416-4cb765fd19de-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.330633 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/07c079fd-cffe-457d-9416-4cb765fd19de-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc 
kubenswrapper[5033]: I0226 17:26:29.336285 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/07c079fd-cffe-457d-9416-4cb765fd19de-config\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.357474 5033 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.357509 5033 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bf1d7896-6402-4f6a-a676-34f9393d76bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bf1d7896-6402-4f6a-a676-34f9393d76bf\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4caadee1feaf0941fd923946fa43f5bcb5657af8d8d73126c5f4c9a1dbed90c6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.584784 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bf1d7896-6402-4f6a-a676-34f9393d76bf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bf1d7896-6402-4f6a-a676-34f9393d76bf\") pod \"prometheus-metric-storage-0\" (UID: \"07c079fd-cffe-457d-9416-4cb765fd19de\") " pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.714796 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.848015 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.889653 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9a49e383-e3d9-4a97-8f52-df90e3f4bbc9","Type":"ContainerStarted","Data":"927d9473d7506c6ba90f9372006aa71dcdd20570a9f3bf67d11d48db33fc4ea9"} Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.899990 5033 generic.go:334] "Generic (PLEG): container finished" podID="01250dc7-efa2-45f2-87d6-5d1a865369c2" containerID="8d0b492ffb7fcea538d8e31052eeb122a7fee3cbee1a0ee552cf46261b91ab82" exitCode=137 Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.906008 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a0a1aafd-6624-43ec-b133-425ec8e6309e","Type":"ContainerStarted","Data":"0ca100f07aa86c9e9e18bcd88f1431ad2aac6011fe2e5670ad809dae45083f7e"} Feb 26 17:26:29 crc kubenswrapper[5033]: I0226 17:26:29.970170 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.110313 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config-secret\") pod \"01250dc7-efa2-45f2-87d6-5d1a865369c2\" (UID: \"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.110555 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config\") pod \"01250dc7-efa2-45f2-87d6-5d1a865369c2\" (UID: \"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.110610 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82c7x\" (UniqueName: \"kubernetes.io/projected/01250dc7-efa2-45f2-87d6-5d1a865369c2-kube-api-access-82c7x\") pod \"01250dc7-efa2-45f2-87d6-5d1a865369c2\" (UID: \"01250dc7-efa2-45f2-87d6-5d1a865369c2\") " Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.118452 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01250dc7-efa2-45f2-87d6-5d1a865369c2-kube-api-access-82c7x" (OuterVolumeSpecName: "kube-api-access-82c7x") pod "01250dc7-efa2-45f2-87d6-5d1a865369c2" (UID: "01250dc7-efa2-45f2-87d6-5d1a865369c2"). InnerVolumeSpecName "kube-api-access-82c7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.174132 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "01250dc7-efa2-45f2-87d6-5d1a865369c2" (UID: "01250dc7-efa2-45f2-87d6-5d1a865369c2"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.199495 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "01250dc7-efa2-45f2-87d6-5d1a865369c2" (UID: "01250dc7-efa2-45f2-87d6-5d1a865369c2"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.214225 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.214282 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82c7x\" (UniqueName: \"kubernetes.io/projected/01250dc7-efa2-45f2-87d6-5d1a865369c2-kube-api-access-82c7x\") on node \"crc\" DevicePath \"\"" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.214294 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01250dc7-efa2-45f2-87d6-5d1a865369c2-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.377954 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.917010 5033 scope.go:117] "RemoveContainer" containerID="8d0b492ffb7fcea538d8e31052eeb122a7fee3cbee1a0ee552cf46261b91ab82" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.917220 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.925714 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a0a1aafd-6624-43ec-b133-425ec8e6309e","Type":"ContainerStarted","Data":"2a894df149b94ccf44ed803543d4cb2c105e45f7589fb8f0cd50e39613b879ce"} Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.928004 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07c079fd-cffe-457d-9416-4cb765fd19de","Type":"ContainerStarted","Data":"7fde8a05a47e62b84736d99fc05ca44651a330418efed5fe47f0a79666a16237"} Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.934824 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"f9343e5c-17b8-4f9f-bd66-071d27c0b43a","Type":"ContainerStarted","Data":"d8e680a8759ac287d020129e7eab658b3f432b434aaed9c897c44df9e5c0801b"} Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.936809 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9a49e383-e3d9-4a97-8f52-df90e3f4bbc9","Type":"ContainerStarted","Data":"f3269dbf1efc422a92b938439213c4c95f082fa39e50e766167d9b832516901b"} Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.937506 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.962798 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.962776719 podStartE2EDuration="3.962776719s" podCreationTimestamp="2026-02-26 17:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:26:30.947597118 +0000 UTC m=+6318.160195768" watchObservedRunningTime="2026-02-26 17:26:30.962776719 +0000 
UTC m=+6318.175375379" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.970965 5033 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="01250dc7-efa2-45f2-87d6-5d1a865369c2" podUID="a0a1aafd-6624-43ec-b133-425ec8e6309e" Feb 26 17:26:30 crc kubenswrapper[5033]: I0226 17:26:30.973638 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.248718727 podStartE2EDuration="3.973616764s" podCreationTimestamp="2026-02-26 17:26:27 +0000 UTC" firstStartedPulling="2026-02-26 17:26:28.809475203 +0000 UTC m=+6316.022073853" lastFinishedPulling="2026-02-26 17:26:29.53437324 +0000 UTC m=+6316.746971890" observedRunningTime="2026-02-26 17:26:30.966045869 +0000 UTC m=+6318.178644519" watchObservedRunningTime="2026-02-26 17:26:30.973616764 +0000 UTC m=+6318.186215424" Feb 26 17:26:31 crc kubenswrapper[5033]: I0226 17:26:31.230513 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:31 crc kubenswrapper[5033]: I0226 17:26:31.230613 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:31 crc kubenswrapper[5033]: I0226 17:26:31.294670 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:31 crc kubenswrapper[5033]: I0226 17:26:31.639734 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01250dc7-efa2-45f2-87d6-5d1a865369c2" path="/var/lib/kubelet/pods/01250dc7-efa2-45f2-87d6-5d1a865369c2/volumes" Feb 26 17:26:32 crc kubenswrapper[5033]: I0226 17:26:32.009348 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:32 crc kubenswrapper[5033]: I0226 17:26:32.067024 5033 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqg4j"] Feb 26 17:26:33 crc kubenswrapper[5033]: I0226 17:26:33.967195 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hqg4j" podUID="871c0be3-f427-4567-bb6e-865f76ffa536" containerName="registry-server" containerID="cri-o://e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3" gracePeriod=2 Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.609833 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.701585 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-utilities\") pod \"871c0be3-f427-4567-bb6e-865f76ffa536\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.701660 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-catalog-content\") pod \"871c0be3-f427-4567-bb6e-865f76ffa536\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.701834 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mszjx\" (UniqueName: \"kubernetes.io/projected/871c0be3-f427-4567-bb6e-865f76ffa536-kube-api-access-mszjx\") pod \"871c0be3-f427-4567-bb6e-865f76ffa536\" (UID: \"871c0be3-f427-4567-bb6e-865f76ffa536\") " Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.702320 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-utilities" (OuterVolumeSpecName: "utilities") pod 
"871c0be3-f427-4567-bb6e-865f76ffa536" (UID: "871c0be3-f427-4567-bb6e-865f76ffa536"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.702891 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.708658 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/871c0be3-f427-4567-bb6e-865f76ffa536-kube-api-access-mszjx" (OuterVolumeSpecName: "kube-api-access-mszjx") pod "871c0be3-f427-4567-bb6e-865f76ffa536" (UID: "871c0be3-f427-4567-bb6e-865f76ffa536"). InnerVolumeSpecName "kube-api-access-mszjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.729985 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "871c0be3-f427-4567-bb6e-865f76ffa536" (UID: "871c0be3-f427-4567-bb6e-865f76ffa536"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.805130 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mszjx\" (UniqueName: \"kubernetes.io/projected/871c0be3-f427-4567-bb6e-865f76ffa536-kube-api-access-mszjx\") on node \"crc\" DevicePath \"\"" Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.805410 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/871c0be3-f427-4567-bb6e-865f76ffa536-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.981854 5033 generic.go:334] "Generic (PLEG): container finished" podID="871c0be3-f427-4567-bb6e-865f76ffa536" containerID="e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3" exitCode=0 Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.981898 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqg4j" event={"ID":"871c0be3-f427-4567-bb6e-865f76ffa536","Type":"ContainerDied","Data":"e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3"} Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.981958 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqg4j" event={"ID":"871c0be3-f427-4567-bb6e-865f76ffa536","Type":"ContainerDied","Data":"c1740325e4b224ade363f276a6dc043c89be3465eade6b2743a17e09fdc88387"} Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.981984 5033 scope.go:117] "RemoveContainer" containerID="e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3" Feb 26 17:26:34 crc kubenswrapper[5033]: I0226 17:26:34.983169 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqg4j" Feb 26 17:26:35 crc kubenswrapper[5033]: I0226 17:26:35.023419 5033 scope.go:117] "RemoveContainer" containerID="5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057" Feb 26 17:26:35 crc kubenswrapper[5033]: I0226 17:26:35.030709 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqg4j"] Feb 26 17:26:35 crc kubenswrapper[5033]: I0226 17:26:35.042291 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqg4j"] Feb 26 17:26:35 crc kubenswrapper[5033]: I0226 17:26:35.087652 5033 scope.go:117] "RemoveContainer" containerID="ecc0a3c3e6f573c2696b838e8dd731a88f8c18c18e3082915cf40e1bca185e79" Feb 26 17:26:35 crc kubenswrapper[5033]: I0226 17:26:35.420363 5033 scope.go:117] "RemoveContainer" containerID="e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3" Feb 26 17:26:35 crc kubenswrapper[5033]: E0226 17:26:35.420751 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3\": container with ID starting with e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3 not found: ID does not exist" containerID="e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3" Feb 26 17:26:35 crc kubenswrapper[5033]: I0226 17:26:35.420776 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3"} err="failed to get container status \"e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3\": rpc error: code = NotFound desc = could not find container \"e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3\": container with ID starting with e27f57a56cc27b1d316c0a379aa7b9ee654b52ea9d0c0f6238c55563ece177a3 not found: 
ID does not exist" Feb 26 17:26:35 crc kubenswrapper[5033]: I0226 17:26:35.420798 5033 scope.go:117] "RemoveContainer" containerID="5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057" Feb 26 17:26:35 crc kubenswrapper[5033]: E0226 17:26:35.421982 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057\": container with ID starting with 5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057 not found: ID does not exist" containerID="5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057" Feb 26 17:26:35 crc kubenswrapper[5033]: I0226 17:26:35.422087 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057"} err="failed to get container status \"5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057\": rpc error: code = NotFound desc = could not find container \"5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057\": container with ID starting with 5fadd1ef661e1a19bdb297cfba6ff769eac41fe3a4cefdb9ca2a1b2dc22fa057 not found: ID does not exist" Feb 26 17:26:35 crc kubenswrapper[5033]: I0226 17:26:35.422167 5033 scope.go:117] "RemoveContainer" containerID="ecc0a3c3e6f573c2696b838e8dd731a88f8c18c18e3082915cf40e1bca185e79" Feb 26 17:26:35 crc kubenswrapper[5033]: E0226 17:26:35.422544 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecc0a3c3e6f573c2696b838e8dd731a88f8c18c18e3082915cf40e1bca185e79\": container with ID starting with ecc0a3c3e6f573c2696b838e8dd731a88f8c18c18e3082915cf40e1bca185e79 not found: ID does not exist" containerID="ecc0a3c3e6f573c2696b838e8dd731a88f8c18c18e3082915cf40e1bca185e79" Feb 26 17:26:35 crc kubenswrapper[5033]: I0226 17:26:35.422587 5033 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecc0a3c3e6f573c2696b838e8dd731a88f8c18c18e3082915cf40e1bca185e79"} err="failed to get container status \"ecc0a3c3e6f573c2696b838e8dd731a88f8c18c18e3082915cf40e1bca185e79\": rpc error: code = NotFound desc = could not find container \"ecc0a3c3e6f573c2696b838e8dd731a88f8c18c18e3082915cf40e1bca185e79\": container with ID starting with ecc0a3c3e6f573c2696b838e8dd731a88f8c18c18e3082915cf40e1bca185e79 not found: ID does not exist" Feb 26 17:26:35 crc kubenswrapper[5033]: I0226 17:26:35.643107 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="871c0be3-f427-4567-bb6e-865f76ffa536" path="/var/lib/kubelet/pods/871c0be3-f427-4567-bb6e-865f76ffa536/volumes" Feb 26 17:26:36 crc kubenswrapper[5033]: I0226 17:26:36.766983 5033 scope.go:117] "RemoveContainer" containerID="03aa3c9e49fa89a9e21e8ebd98bc118f285f1deb72cf429daa6b468dd1f12189" Feb 26 17:26:36 crc kubenswrapper[5033]: I0226 17:26:36.829734 5033 scope.go:117] "RemoveContainer" containerID="b4ba149fa1e83b3c267ef2584833fcea66da9b6be0a7fceffd5f84c151caf79c" Feb 26 17:26:36 crc kubenswrapper[5033]: I0226 17:26:36.865090 5033 scope.go:117] "RemoveContainer" containerID="d33f54c76f48b6a2978f86307569ec5f7151fc932b878b095567cdda3c8ca97c" Feb 26 17:26:36 crc kubenswrapper[5033]: I0226 17:26:36.920820 5033 scope.go:117] "RemoveContainer" containerID="d9b026180b470927a12c2159555297e04830e9eb4b396d091f3bfcd5f5ae1829" Feb 26 17:26:37 crc kubenswrapper[5033]: I0226 17:26:37.030252 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07c079fd-cffe-457d-9416-4cb765fd19de","Type":"ContainerStarted","Data":"6f1845e7440a0de568886a4ef3c75bd483f4f2278d32bd40db55233d1227e19c"} Feb 26 17:26:37 crc kubenswrapper[5033]: I0226 17:26:37.036391 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"f9343e5c-17b8-4f9f-bd66-071d27c0b43a","Type":"ContainerStarted","Data":"3e96cd79223a4c79bd661fa0b27fca944d33c97aab0b4253f84a4c790992960f"} Feb 26 17:26:37 crc kubenswrapper[5033]: I0226 17:26:37.928064 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 26 17:26:45 crc kubenswrapper[5033]: I0226 17:26:45.134499 5033 generic.go:334] "Generic (PLEG): container finished" podID="07c079fd-cffe-457d-9416-4cb765fd19de" containerID="6f1845e7440a0de568886a4ef3c75bd483f4f2278d32bd40db55233d1227e19c" exitCode=0 Feb 26 17:26:45 crc kubenswrapper[5033]: I0226 17:26:45.134622 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07c079fd-cffe-457d-9416-4cb765fd19de","Type":"ContainerDied","Data":"6f1845e7440a0de568886a4ef3c75bd483f4f2278d32bd40db55233d1227e19c"} Feb 26 17:26:45 crc kubenswrapper[5033]: I0226 17:26:45.139940 5033 generic.go:334] "Generic (PLEG): container finished" podID="f9343e5c-17b8-4f9f-bd66-071d27c0b43a" containerID="3e96cd79223a4c79bd661fa0b27fca944d33c97aab0b4253f84a4c790992960f" exitCode=0 Feb 26 17:26:45 crc kubenswrapper[5033]: I0226 17:26:45.139978 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"f9343e5c-17b8-4f9f-bd66-071d27c0b43a","Type":"ContainerDied","Data":"3e96cd79223a4c79bd661fa0b27fca944d33c97aab0b4253f84a4c790992960f"} Feb 26 17:26:45 crc kubenswrapper[5033]: I0226 17:26:45.140630 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 17:26:48 crc kubenswrapper[5033]: I0226 17:26:48.179780 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"f9343e5c-17b8-4f9f-bd66-071d27c0b43a","Type":"ContainerStarted","Data":"b95728b40f29d69ab82c472c29f762c3f0d97690acb9f9a81f74287c2c33353a"} Feb 26 17:26:51 crc kubenswrapper[5033]: I0226 
17:26:51.242827 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07c079fd-cffe-457d-9416-4cb765fd19de","Type":"ContainerStarted","Data":"10b82463f3ec7ea3d9b62cb71145f932267ac3607f73c3c150c0012a62481401"} Feb 26 17:26:51 crc kubenswrapper[5033]: I0226 17:26:51.245663 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"f9343e5c-17b8-4f9f-bd66-071d27c0b43a","Type":"ContainerStarted","Data":"e4ce11a004c5da13eb2c6d27488d3b76d867cf0ee8a60f9502857e310b2f3c47"} Feb 26 17:26:51 crc kubenswrapper[5033]: I0226 17:26:51.246192 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:51 crc kubenswrapper[5033]: I0226 17:26:51.249597 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 26 17:26:51 crc kubenswrapper[5033]: I0226 17:26:51.274097 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.448614177 podStartE2EDuration="23.274079013s" podCreationTimestamp="2026-02-26 17:26:28 +0000 UTC" firstStartedPulling="2026-02-26 17:26:29.90119465 +0000 UTC m=+6317.113793300" lastFinishedPulling="2026-02-26 17:26:47.726659486 +0000 UTC m=+6334.939258136" observedRunningTime="2026-02-26 17:26:51.266887688 +0000 UTC m=+6338.479486378" watchObservedRunningTime="2026-02-26 17:26:51.274079013 +0000 UTC m=+6338.486677663" Feb 26 17:26:53 crc kubenswrapper[5033]: I0226 17:26:53.077079 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-7qh6c"] Feb 26 17:26:53 crc kubenswrapper[5033]: I0226 17:26:53.096644 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-7qh6c"] Feb 26 17:26:53 crc kubenswrapper[5033]: I0226 17:26:53.106001 5033 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-api-db-create-fqlpb"] Feb 26 17:26:53 crc kubenswrapper[5033]: I0226 17:26:53.117706 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-09eb-account-create-update-hhcv8"] Feb 26 17:26:53 crc kubenswrapper[5033]: I0226 17:26:53.126795 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-fqlpb"] Feb 26 17:26:53 crc kubenswrapper[5033]: I0226 17:26:53.135842 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-09eb-account-create-update-hhcv8"] Feb 26 17:26:53 crc kubenswrapper[5033]: I0226 17:26:53.642094 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="019ce90c-a2fb-4c62-90e1-13a61c3190f8" path="/var/lib/kubelet/pods/019ce90c-a2fb-4c62-90e1-13a61c3190f8/volumes" Feb 26 17:26:53 crc kubenswrapper[5033]: I0226 17:26:53.643239 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e31358f-1ef9-46c9-9eaf-209f8a78cd9c" path="/var/lib/kubelet/pods/0e31358f-1ef9-46c9-9eaf-209f8a78cd9c/volumes" Feb 26 17:26:53 crc kubenswrapper[5033]: I0226 17:26:53.644083 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c7157e0-f691-44e5-b73a-8cb559993892" path="/var/lib/kubelet/pods/7c7157e0-f691-44e5-b73a-8cb559993892/volumes" Feb 26 17:26:54 crc kubenswrapper[5033]: I0226 17:26:54.052783 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-gc7xz"] Feb 26 17:26:54 crc kubenswrapper[5033]: I0226 17:26:54.062567 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-gc7xz"] Feb 26 17:26:54 crc kubenswrapper[5033]: I0226 17:26:54.071406 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a96f-account-create-update-hxbdp"] Feb 26 17:26:54 crc kubenswrapper[5033]: I0226 17:26:54.080772 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-a96f-account-create-update-hxbdp"] Feb 26 17:26:54 crc kubenswrapper[5033]: I0226 17:26:54.088409 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1590-account-create-update-nq5gt"] Feb 26 17:26:54 crc kubenswrapper[5033]: I0226 17:26:54.096568 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1590-account-create-update-nq5gt"] Feb 26 17:26:55 crc kubenswrapper[5033]: I0226 17:26:55.299526 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07c079fd-cffe-457d-9416-4cb765fd19de","Type":"ContainerStarted","Data":"0ffacb45194d4141e9f3d79d8ac5342192e3c1f64fa2b0c77bf312489802a15f"} Feb 26 17:26:55 crc kubenswrapper[5033]: I0226 17:26:55.646411 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc806cf-c9e4-43dc-895a-5b5186d1416c" path="/var/lib/kubelet/pods/7bc806cf-c9e4-43dc-895a-5b5186d1416c/volumes" Feb 26 17:26:55 crc kubenswrapper[5033]: I0226 17:26:55.647167 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b00dae15-cf05-496e-8291-97c26f4a5cf4" path="/var/lib/kubelet/pods/b00dae15-cf05-496e-8291-97c26f4a5cf4/volumes" Feb 26 17:26:55 crc kubenswrapper[5033]: I0226 17:26:55.647998 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a20a7d-c65b-4934-b634-bcda152fd8a8" path="/var/lib/kubelet/pods/e3a20a7d-c65b-4934-b634-bcda152fd8a8/volumes" Feb 26 17:26:59 crc kubenswrapper[5033]: I0226 17:26:59.347881 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07c079fd-cffe-457d-9416-4cb765fd19de","Type":"ContainerStarted","Data":"7a510b2d4844ebc0249e9b1062a0598ea1d502997774b00737f4d449e3088031"} Feb 26 17:26:59 crc kubenswrapper[5033]: I0226 17:26:59.387939 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" 
podStartSLOduration=4.548194057 podStartE2EDuration="32.387914778s" podCreationTimestamp="2026-02-26 17:26:27 +0000 UTC" firstStartedPulling="2026-02-26 17:26:30.382236289 +0000 UTC m=+6317.594834959" lastFinishedPulling="2026-02-26 17:26:58.22195704 +0000 UTC m=+6345.434555680" observedRunningTime="2026-02-26 17:26:59.382051485 +0000 UTC m=+6346.594650175" watchObservedRunningTime="2026-02-26 17:26:59.387914778 +0000 UTC m=+6346.600513428" Feb 26 17:26:59 crc kubenswrapper[5033]: I0226 17:26:59.716834 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:59 crc kubenswrapper[5033]: I0226 17:26:59.716896 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 26 17:26:59 crc kubenswrapper[5033]: I0226 17:26:59.719814 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 26 17:27:00 crc kubenswrapper[5033]: I0226 17:27:00.362613 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 26 17:27:04 crc kubenswrapper[5033]: I0226 17:27:04.081189 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bwn8f"] Feb 26 17:27:04 crc kubenswrapper[5033]: I0226 17:27:04.110456 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bwn8f"] Feb 26 17:27:05 crc kubenswrapper[5033]: I0226 17:27:05.639597 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12a99f35-d435-4af6-9c92-35202db780d1" path="/var/lib/kubelet/pods/12a99f35-d435-4af6-9c92-35202db780d1/volumes" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.519794 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:27:06 crc kubenswrapper[5033]: E0226 17:27:06.520307 5033 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="871c0be3-f427-4567-bb6e-865f76ffa536" containerName="extract-content" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.520329 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="871c0be3-f427-4567-bb6e-865f76ffa536" containerName="extract-content" Feb 26 17:27:06 crc kubenswrapper[5033]: E0226 17:27:06.520354 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871c0be3-f427-4567-bb6e-865f76ffa536" containerName="registry-server" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.520363 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="871c0be3-f427-4567-bb6e-865f76ffa536" containerName="registry-server" Feb 26 17:27:06 crc kubenswrapper[5033]: E0226 17:27:06.520374 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871c0be3-f427-4567-bb6e-865f76ffa536" containerName="extract-utilities" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.520383 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="871c0be3-f427-4567-bb6e-865f76ffa536" containerName="extract-utilities" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.520642 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="871c0be3-f427-4567-bb6e-865f76ffa536" containerName="registry-server" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.522929 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.525384 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.526737 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.546390 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.674133 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-run-httpd\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.674275 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-log-httpd\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.674357 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.674666 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck5h7\" (UniqueName: \"kubernetes.io/projected/64a632d2-9373-49df-8776-18715a267980-kube-api-access-ck5h7\") pod \"ceilometer-0\" (UID: 
\"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.674744 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.674799 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-scripts\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.674913 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-config-data\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.777325 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck5h7\" (UniqueName: \"kubernetes.io/projected/64a632d2-9373-49df-8776-18715a267980-kube-api-access-ck5h7\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.777421 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.777487 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-scripts\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.777563 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-config-data\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.777626 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-run-httpd\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.777688 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-log-httpd\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.777739 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.780282 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-log-httpd\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " 
pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.785006 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-run-httpd\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.786382 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.787957 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.788156 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-scripts\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.788975 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-config-data\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.809249 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck5h7\" (UniqueName: 
\"kubernetes.io/projected/64a632d2-9373-49df-8776-18715a267980-kube-api-access-ck5h7\") pod \"ceilometer-0\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " pod="openstack/ceilometer-0" Feb 26 17:27:06 crc kubenswrapper[5033]: I0226 17:27:06.844817 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 17:27:07 crc kubenswrapper[5033]: I0226 17:27:07.364059 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:27:07 crc kubenswrapper[5033]: I0226 17:27:07.429213 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64a632d2-9373-49df-8776-18715a267980","Type":"ContainerStarted","Data":"91265c3ec885719aff80ae661c35103d1547776166893f5defafda1a067c9fbd"} Feb 26 17:27:08 crc kubenswrapper[5033]: I0226 17:27:08.445774 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64a632d2-9373-49df-8776-18715a267980","Type":"ContainerStarted","Data":"4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b"} Feb 26 17:27:09 crc kubenswrapper[5033]: I0226 17:27:09.458140 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64a632d2-9373-49df-8776-18715a267980","Type":"ContainerStarted","Data":"b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6"} Feb 26 17:27:09 crc kubenswrapper[5033]: I0226 17:27:09.458760 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64a632d2-9373-49df-8776-18715a267980","Type":"ContainerStarted","Data":"8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4"} Feb 26 17:27:12 crc kubenswrapper[5033]: I0226 17:27:12.506937 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64a632d2-9373-49df-8776-18715a267980","Type":"ContainerStarted","Data":"1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b"} 
Feb 26 17:27:12 crc kubenswrapper[5033]: I0226 17:27:12.507216 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 17:27:12 crc kubenswrapper[5033]: I0226 17:27:12.556317 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.674945219 podStartE2EDuration="6.556293133s" podCreationTimestamp="2026-02-26 17:27:06 +0000 UTC" firstStartedPulling="2026-02-26 17:27:07.358790751 +0000 UTC m=+6354.571389401" lastFinishedPulling="2026-02-26 17:27:11.240138665 +0000 UTC m=+6358.452737315" observedRunningTime="2026-02-26 17:27:12.538886498 +0000 UTC m=+6359.751485158" watchObservedRunningTime="2026-02-26 17:27:12.556293133 +0000 UTC m=+6359.768891793" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.478240 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-ffj5s"] Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.479986 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-ffj5s" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.500294 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-ffj5s"] Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.556490 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3976cdda-4518-4d6a-b33e-13b88d1d74fc-operator-scripts\") pod \"aodh-db-create-ffj5s\" (UID: \"3976cdda-4518-4d6a-b33e-13b88d1d74fc\") " pod="openstack/aodh-db-create-ffj5s" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.556780 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvkjs\" (UniqueName: \"kubernetes.io/projected/3976cdda-4518-4d6a-b33e-13b88d1d74fc-kube-api-access-nvkjs\") pod \"aodh-db-create-ffj5s\" (UID: \"3976cdda-4518-4d6a-b33e-13b88d1d74fc\") " pod="openstack/aodh-db-create-ffj5s" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.582047 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-8dc9-account-create-update-vhg8q"] Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.583707 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-8dc9-account-create-update-vhg8q" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.586126 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.594278 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-8dc9-account-create-update-vhg8q"] Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.658934 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c609b584-a871-4af1-bdc9-123c4b497ed9-operator-scripts\") pod \"aodh-8dc9-account-create-update-vhg8q\" (UID: \"c609b584-a871-4af1-bdc9-123c4b497ed9\") " pod="openstack/aodh-8dc9-account-create-update-vhg8q" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.659415 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3976cdda-4518-4d6a-b33e-13b88d1d74fc-operator-scripts\") pod \"aodh-db-create-ffj5s\" (UID: \"3976cdda-4518-4d6a-b33e-13b88d1d74fc\") " pod="openstack/aodh-db-create-ffj5s" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.659438 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvkjs\" (UniqueName: \"kubernetes.io/projected/3976cdda-4518-4d6a-b33e-13b88d1d74fc-kube-api-access-nvkjs\") pod \"aodh-db-create-ffj5s\" (UID: \"3976cdda-4518-4d6a-b33e-13b88d1d74fc\") " pod="openstack/aodh-db-create-ffj5s" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.659540 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxgx5\" (UniqueName: \"kubernetes.io/projected/c609b584-a871-4af1-bdc9-123c4b497ed9-kube-api-access-nxgx5\") pod \"aodh-8dc9-account-create-update-vhg8q\" (UID: \"c609b584-a871-4af1-bdc9-123c4b497ed9\") " 
pod="openstack/aodh-8dc9-account-create-update-vhg8q" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.661293 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3976cdda-4518-4d6a-b33e-13b88d1d74fc-operator-scripts\") pod \"aodh-db-create-ffj5s\" (UID: \"3976cdda-4518-4d6a-b33e-13b88d1d74fc\") " pod="openstack/aodh-db-create-ffj5s" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.682815 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvkjs\" (UniqueName: \"kubernetes.io/projected/3976cdda-4518-4d6a-b33e-13b88d1d74fc-kube-api-access-nvkjs\") pod \"aodh-db-create-ffj5s\" (UID: \"3976cdda-4518-4d6a-b33e-13b88d1d74fc\") " pod="openstack/aodh-db-create-ffj5s" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.761424 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxgx5\" (UniqueName: \"kubernetes.io/projected/c609b584-a871-4af1-bdc9-123c4b497ed9-kube-api-access-nxgx5\") pod \"aodh-8dc9-account-create-update-vhg8q\" (UID: \"c609b584-a871-4af1-bdc9-123c4b497ed9\") " pod="openstack/aodh-8dc9-account-create-update-vhg8q" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.761560 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c609b584-a871-4af1-bdc9-123c4b497ed9-operator-scripts\") pod \"aodh-8dc9-account-create-update-vhg8q\" (UID: \"c609b584-a871-4af1-bdc9-123c4b497ed9\") " pod="openstack/aodh-8dc9-account-create-update-vhg8q" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.762510 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c609b584-a871-4af1-bdc9-123c4b497ed9-operator-scripts\") pod \"aodh-8dc9-account-create-update-vhg8q\" (UID: \"c609b584-a871-4af1-bdc9-123c4b497ed9\") " 
pod="openstack/aodh-8dc9-account-create-update-vhg8q" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.779148 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxgx5\" (UniqueName: \"kubernetes.io/projected/c609b584-a871-4af1-bdc9-123c4b497ed9-kube-api-access-nxgx5\") pod \"aodh-8dc9-account-create-update-vhg8q\" (UID: \"c609b584-a871-4af1-bdc9-123c4b497ed9\") " pod="openstack/aodh-8dc9-account-create-update-vhg8q" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.813940 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-ffj5s" Feb 26 17:27:18 crc kubenswrapper[5033]: I0226 17:27:18.899988 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-8dc9-account-create-update-vhg8q" Feb 26 17:27:19 crc kubenswrapper[5033]: I0226 17:27:19.287952 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-ffj5s"] Feb 26 17:27:19 crc kubenswrapper[5033]: W0226 17:27:19.512409 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc609b584_a871_4af1_bdc9_123c4b497ed9.slice/crio-169023a5086a1f44e49f688fcd76567c45c70627e2c31b2d8e2ddd0619b19388 WatchSource:0}: Error finding container 169023a5086a1f44e49f688fcd76567c45c70627e2c31b2d8e2ddd0619b19388: Status 404 returned error can't find the container with id 169023a5086a1f44e49f688fcd76567c45c70627e2c31b2d8e2ddd0619b19388 Feb 26 17:27:19 crc kubenswrapper[5033]: I0226 17:27:19.513577 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-8dc9-account-create-update-vhg8q"] Feb 26 17:27:19 crc kubenswrapper[5033]: I0226 17:27:19.587068 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-ffj5s" 
event={"ID":"3976cdda-4518-4d6a-b33e-13b88d1d74fc","Type":"ContainerStarted","Data":"00ffcabc969d84b462a2994235efea576bd93b546d346ba94bd98a9b3fea0a8e"} Feb 26 17:27:19 crc kubenswrapper[5033]: I0226 17:27:19.587111 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-ffj5s" event={"ID":"3976cdda-4518-4d6a-b33e-13b88d1d74fc","Type":"ContainerStarted","Data":"140316f4b6990c9e9a3fff0ad3dfbba04269f2944a1fc33e744b456e30e5dd3a"} Feb 26 17:27:19 crc kubenswrapper[5033]: I0226 17:27:19.590310 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8dc9-account-create-update-vhg8q" event={"ID":"c609b584-a871-4af1-bdc9-123c4b497ed9","Type":"ContainerStarted","Data":"169023a5086a1f44e49f688fcd76567c45c70627e2c31b2d8e2ddd0619b19388"} Feb 26 17:27:19 crc kubenswrapper[5033]: I0226 17:27:19.609503 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-ffj5s" podStartSLOduration=1.6094802719999999 podStartE2EDuration="1.609480272s" podCreationTimestamp="2026-02-26 17:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:27:19.606400157 +0000 UTC m=+6366.818998817" watchObservedRunningTime="2026-02-26 17:27:19.609480272 +0000 UTC m=+6366.822078942" Feb 26 17:27:20 crc kubenswrapper[5033]: I0226 17:27:20.189220 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:27:20 crc kubenswrapper[5033]: I0226 17:27:20.189565 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:27:20 crc kubenswrapper[5033]: I0226 17:27:20.600876 5033 generic.go:334] "Generic (PLEG): container finished" podID="c609b584-a871-4af1-bdc9-123c4b497ed9" containerID="566c866c992097aa590ec283ad79dd20bb6644916bd792ce18bc5b2c232d2ec4" exitCode=0 Feb 26 17:27:20 crc kubenswrapper[5033]: I0226 17:27:20.600950 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8dc9-account-create-update-vhg8q" event={"ID":"c609b584-a871-4af1-bdc9-123c4b497ed9","Type":"ContainerDied","Data":"566c866c992097aa590ec283ad79dd20bb6644916bd792ce18bc5b2c232d2ec4"} Feb 26 17:27:20 crc kubenswrapper[5033]: I0226 17:27:20.602096 5033 generic.go:334] "Generic (PLEG): container finished" podID="3976cdda-4518-4d6a-b33e-13b88d1d74fc" containerID="00ffcabc969d84b462a2994235efea576bd93b546d346ba94bd98a9b3fea0a8e" exitCode=0 Feb 26 17:27:20 crc kubenswrapper[5033]: I0226 17:27:20.602135 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-ffj5s" event={"ID":"3976cdda-4518-4d6a-b33e-13b88d1d74fc","Type":"ContainerDied","Data":"00ffcabc969d84b462a2994235efea576bd93b546d346ba94bd98a9b3fea0a8e"} Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.039721 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wcths"] Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.051297 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wcths"] Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.120372 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-ffj5s" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.125773 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-8dc9-account-create-update-vhg8q" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.257334 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3976cdda-4518-4d6a-b33e-13b88d1d74fc-operator-scripts\") pod \"3976cdda-4518-4d6a-b33e-13b88d1d74fc\" (UID: \"3976cdda-4518-4d6a-b33e-13b88d1d74fc\") " Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.257532 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c609b584-a871-4af1-bdc9-123c4b497ed9-operator-scripts\") pod \"c609b584-a871-4af1-bdc9-123c4b497ed9\" (UID: \"c609b584-a871-4af1-bdc9-123c4b497ed9\") " Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.257579 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxgx5\" (UniqueName: \"kubernetes.io/projected/c609b584-a871-4af1-bdc9-123c4b497ed9-kube-api-access-nxgx5\") pod \"c609b584-a871-4af1-bdc9-123c4b497ed9\" (UID: \"c609b584-a871-4af1-bdc9-123c4b497ed9\") " Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.257667 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvkjs\" (UniqueName: \"kubernetes.io/projected/3976cdda-4518-4d6a-b33e-13b88d1d74fc-kube-api-access-nvkjs\") pod \"3976cdda-4518-4d6a-b33e-13b88d1d74fc\" (UID: \"3976cdda-4518-4d6a-b33e-13b88d1d74fc\") " Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.257800 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3976cdda-4518-4d6a-b33e-13b88d1d74fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3976cdda-4518-4d6a-b33e-13b88d1d74fc" (UID: "3976cdda-4518-4d6a-b33e-13b88d1d74fc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.258161 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c609b584-a871-4af1-bdc9-123c4b497ed9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c609b584-a871-4af1-bdc9-123c4b497ed9" (UID: "c609b584-a871-4af1-bdc9-123c4b497ed9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.258210 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3976cdda-4518-4d6a-b33e-13b88d1d74fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.265999 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3976cdda-4518-4d6a-b33e-13b88d1d74fc-kube-api-access-nvkjs" (OuterVolumeSpecName: "kube-api-access-nvkjs") pod "3976cdda-4518-4d6a-b33e-13b88d1d74fc" (UID: "3976cdda-4518-4d6a-b33e-13b88d1d74fc"). InnerVolumeSpecName "kube-api-access-nvkjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.282166 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c609b584-a871-4af1-bdc9-123c4b497ed9-kube-api-access-nxgx5" (OuterVolumeSpecName: "kube-api-access-nxgx5") pod "c609b584-a871-4af1-bdc9-123c4b497ed9" (UID: "c609b584-a871-4af1-bdc9-123c4b497ed9"). InnerVolumeSpecName "kube-api-access-nxgx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.360656 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxgx5\" (UniqueName: \"kubernetes.io/projected/c609b584-a871-4af1-bdc9-123c4b497ed9-kube-api-access-nxgx5\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.360693 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvkjs\" (UniqueName: \"kubernetes.io/projected/3976cdda-4518-4d6a-b33e-13b88d1d74fc-kube-api-access-nvkjs\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.360709 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c609b584-a871-4af1-bdc9-123c4b497ed9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.641445 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-8dc9-account-create-update-vhg8q" event={"ID":"c609b584-a871-4af1-bdc9-123c4b497ed9","Type":"ContainerDied","Data":"169023a5086a1f44e49f688fcd76567c45c70627e2c31b2d8e2ddd0619b19388"} Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.642189 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="169023a5086a1f44e49f688fcd76567c45c70627e2c31b2d8e2ddd0619b19388" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.642240 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-8dc9-account-create-update-vhg8q" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.645185 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-ffj5s" event={"ID":"3976cdda-4518-4d6a-b33e-13b88d1d74fc","Type":"ContainerDied","Data":"140316f4b6990c9e9a3fff0ad3dfbba04269f2944a1fc33e744b456e30e5dd3a"} Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.645241 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="140316f4b6990c9e9a3fff0ad3dfbba04269f2944a1fc33e744b456e30e5dd3a" Feb 26 17:27:22 crc kubenswrapper[5033]: I0226 17:27:22.645371 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-ffj5s" Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.051013 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9rsvh"] Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.065529 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9rsvh"] Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.646411 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="407bdbec-fe4f-4726-baa6-061f9c9ebc6d" path="/var/lib/kubelet/pods/407bdbec-fe4f-4726-baa6-061f9c9ebc6d/volumes" Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.647901 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c674b55b-5acf-421a-8772-fea6c2e3ad3d" path="/var/lib/kubelet/pods/c674b55b-5acf-421a-8772-fea6c2e3ad3d/volumes" Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.974549 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-wzprc"] Feb 26 17:27:23 crc kubenswrapper[5033]: E0226 17:27:23.974968 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c609b584-a871-4af1-bdc9-123c4b497ed9" containerName="mariadb-account-create-update" Feb 26 17:27:23 
crc kubenswrapper[5033]: I0226 17:27:23.974984 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c609b584-a871-4af1-bdc9-123c4b497ed9" containerName="mariadb-account-create-update" Feb 26 17:27:23 crc kubenswrapper[5033]: E0226 17:27:23.975022 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3976cdda-4518-4d6a-b33e-13b88d1d74fc" containerName="mariadb-database-create" Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.975028 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3976cdda-4518-4d6a-b33e-13b88d1d74fc" containerName="mariadb-database-create" Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.975209 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3976cdda-4518-4d6a-b33e-13b88d1d74fc" containerName="mariadb-database-create" Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.975223 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c609b584-a871-4af1-bdc9-123c4b497ed9" containerName="mariadb-account-create-update" Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.975941 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.978324 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-j4kn9" Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.980664 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.980680 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 26 17:27:23 crc kubenswrapper[5033]: I0226 17:27:23.981483 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.010160 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-wzprc"] Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.101319 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-config-data\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.101835 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-scripts\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.101930 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcs5f\" (UniqueName: \"kubernetes.io/projected/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-kube-api-access-qcs5f\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " 
pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.102128 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-combined-ca-bundle\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.204044 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-combined-ca-bundle\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.204351 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-config-data\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.204474 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-scripts\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.204555 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcs5f\" (UniqueName: \"kubernetes.io/projected/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-kube-api-access-qcs5f\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.209769 5033 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-config-data\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.211042 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-scripts\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.212855 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-combined-ca-bundle\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.222144 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcs5f\" (UniqueName: \"kubernetes.io/projected/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-kube-api-access-qcs5f\") pod \"aodh-db-sync-wzprc\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.305107 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:24 crc kubenswrapper[5033]: W0226 17:27:24.779076 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce55d9fc_6d50_4c54_8a55_3eb6e238b177.slice/crio-480ca4a7347d9938c21b4b219c13d9d84a80cdd01404d9da869dda9f17ef36e4 WatchSource:0}: Error finding container 480ca4a7347d9938c21b4b219c13d9d84a80cdd01404d9da869dda9f17ef36e4: Status 404 returned error can't find the container with id 480ca4a7347d9938c21b4b219c13d9d84a80cdd01404d9da869dda9f17ef36e4 Feb 26 17:27:24 crc kubenswrapper[5033]: I0226 17:27:24.782160 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-wzprc"] Feb 26 17:27:25 crc kubenswrapper[5033]: I0226 17:27:25.676246 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-wzprc" event={"ID":"ce55d9fc-6d50-4c54-8a55-3eb6e238b177","Type":"ContainerStarted","Data":"480ca4a7347d9938c21b4b219c13d9d84a80cdd01404d9da869dda9f17ef36e4"} Feb 26 17:27:29 crc kubenswrapper[5033]: I0226 17:27:29.722055 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-wzprc" event={"ID":"ce55d9fc-6d50-4c54-8a55-3eb6e238b177","Type":"ContainerStarted","Data":"5ce1fed72de0d02a065df1ec3e6a489ec80c006575a2cec1d674238007203885"} Feb 26 17:27:29 crc kubenswrapper[5033]: I0226 17:27:29.741227 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-wzprc" podStartSLOduration=2.030713018 podStartE2EDuration="6.741209554s" podCreationTimestamp="2026-02-26 17:27:23 +0000 UTC" firstStartedPulling="2026-02-26 17:27:24.781318127 +0000 UTC m=+6371.993916777" lastFinishedPulling="2026-02-26 17:27:29.491814663 +0000 UTC m=+6376.704413313" observedRunningTime="2026-02-26 17:27:29.739654737 +0000 UTC m=+6376.952253387" watchObservedRunningTime="2026-02-26 17:27:29.741209554 +0000 UTC m=+6376.953808204" Feb 26 17:27:32 
crc kubenswrapper[5033]: I0226 17:27:32.751860 5033 generic.go:334] "Generic (PLEG): container finished" podID="ce55d9fc-6d50-4c54-8a55-3eb6e238b177" containerID="5ce1fed72de0d02a065df1ec3e6a489ec80c006575a2cec1d674238007203885" exitCode=0 Feb 26 17:27:32 crc kubenswrapper[5033]: I0226 17:27:32.751939 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-wzprc" event={"ID":"ce55d9fc-6d50-4c54-8a55-3eb6e238b177","Type":"ContainerDied","Data":"5ce1fed72de0d02a065df1ec3e6a489ec80c006575a2cec1d674238007203885"} Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.173660 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.330111 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-combined-ca-bundle\") pod \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.330664 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-scripts\") pod \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.331342 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-config-data\") pod \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.331492 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcs5f\" (UniqueName: 
\"kubernetes.io/projected/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-kube-api-access-qcs5f\") pod \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\" (UID: \"ce55d9fc-6d50-4c54-8a55-3eb6e238b177\") " Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.337170 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-scripts" (OuterVolumeSpecName: "scripts") pod "ce55d9fc-6d50-4c54-8a55-3eb6e238b177" (UID: "ce55d9fc-6d50-4c54-8a55-3eb6e238b177"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.337297 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-kube-api-access-qcs5f" (OuterVolumeSpecName: "kube-api-access-qcs5f") pod "ce55d9fc-6d50-4c54-8a55-3eb6e238b177" (UID: "ce55d9fc-6d50-4c54-8a55-3eb6e238b177"). InnerVolumeSpecName "kube-api-access-qcs5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.372642 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-config-data" (OuterVolumeSpecName: "config-data") pod "ce55d9fc-6d50-4c54-8a55-3eb6e238b177" (UID: "ce55d9fc-6d50-4c54-8a55-3eb6e238b177"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.375958 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce55d9fc-6d50-4c54-8a55-3eb6e238b177" (UID: "ce55d9fc-6d50-4c54-8a55-3eb6e238b177"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.434470 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.434508 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.434521 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcs5f\" (UniqueName: \"kubernetes.io/projected/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-kube-api-access-qcs5f\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.434530 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce55d9fc-6d50-4c54-8a55-3eb6e238b177-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.776010 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-wzprc" event={"ID":"ce55d9fc-6d50-4c54-8a55-3eb6e238b177","Type":"ContainerDied","Data":"480ca4a7347d9938c21b4b219c13d9d84a80cdd01404d9da869dda9f17ef36e4"} Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.776063 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="480ca4a7347d9938c21b4b219c13d9d84a80cdd01404d9da869dda9f17ef36e4" Feb 26 17:27:34 crc kubenswrapper[5033]: I0226 17:27:34.776125 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-wzprc" Feb 26 17:27:36 crc kubenswrapper[5033]: I0226 17:27:36.855618 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 26 17:27:37 crc kubenswrapper[5033]: I0226 17:27:37.134669 5033 scope.go:117] "RemoveContainer" containerID="65cce5d38065cf5dc18de96156a1429f20284fa58c9b0f4e94f491081c4a13cb" Feb 26 17:27:37 crc kubenswrapper[5033]: I0226 17:27:37.158457 5033 scope.go:117] "RemoveContainer" containerID="bd4b0025e0b84c4590ee98fa8ebfc1c039c604256b6e8bd6218c205201d378a4" Feb 26 17:27:37 crc kubenswrapper[5033]: I0226 17:27:37.231823 5033 scope.go:117] "RemoveContainer" containerID="0561d285f8af60c4647c398e67aa76efbdae27793c11ce3e75367699fb17beb5" Feb 26 17:27:37 crc kubenswrapper[5033]: I0226 17:27:37.293354 5033 scope.go:117] "RemoveContainer" containerID="89155be67a1e0cd99fb739f0eafea1631bef2d1d40b50d138f19cfb4a8081dd8" Feb 26 17:27:37 crc kubenswrapper[5033]: I0226 17:27:37.322747 5033 scope.go:117] "RemoveContainer" containerID="6bc2910fe6c4c284470058468f3a142b87240ba0523a2be94487b4d1a1ec0373" Feb 26 17:27:37 crc kubenswrapper[5033]: I0226 17:27:37.372485 5033 scope.go:117] "RemoveContainer" containerID="c1a63c5dcac537112cf58361d0485436b317be39972e171e823526ff84a0c62e" Feb 26 17:27:37 crc kubenswrapper[5033]: I0226 17:27:37.413371 5033 scope.go:117] "RemoveContainer" containerID="339f92ee7719eabf170ca21f54f7f870fc5691757cc318ef5ab96b55b00830bd" Feb 26 17:27:37 crc kubenswrapper[5033]: I0226 17:27:37.435684 5033 scope.go:117] "RemoveContainer" containerID="6e454efcce2322cb2b27e2958b65d5c5cec1a31d09460fc3eb401c7669e77566" Feb 26 17:27:37 crc kubenswrapper[5033]: I0226 17:27:37.466290 5033 scope.go:117] "RemoveContainer" containerID="929383a61ee3b828490a46339202a4f4c46ead1b76b939f224fa9394b4fb1136" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.131688 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-jlpmf"] Feb 26 17:27:38 crc kubenswrapper[5033]: E0226 17:27:38.132234 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce55d9fc-6d50-4c54-8a55-3eb6e238b177" containerName="aodh-db-sync" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.132251 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce55d9fc-6d50-4c54-8a55-3eb6e238b177" containerName="aodh-db-sync" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.132678 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce55d9fc-6d50-4c54-8a55-3eb6e238b177" containerName="aodh-db-sync" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.135104 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.145467 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jlpmf"] Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.319098 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-catalog-content\") pod \"certified-operators-jlpmf\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") " pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.319476 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lplj8\" (UniqueName: \"kubernetes.io/projected/c139f380-4c74-42e0-bb11-482ab3660963-kube-api-access-lplj8\") pod \"certified-operators-jlpmf\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") " pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.319759 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-utilities\") pod \"certified-operators-jlpmf\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") " pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.422769 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-utilities\") pod \"certified-operators-jlpmf\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") " pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.422964 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-catalog-content\") pod \"certified-operators-jlpmf\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") " pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.423134 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lplj8\" (UniqueName: \"kubernetes.io/projected/c139f380-4c74-42e0-bb11-482ab3660963-kube-api-access-lplj8\") pod \"certified-operators-jlpmf\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") " pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.423486 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-utilities\") pod \"certified-operators-jlpmf\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") " pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.423521 5033 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-catalog-content\") pod \"certified-operators-jlpmf\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") " pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.458463 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lplj8\" (UniqueName: \"kubernetes.io/projected/c139f380-4c74-42e0-bb11-482ab3660963-kube-api-access-lplj8\") pod \"certified-operators-jlpmf\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") " pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:38 crc kubenswrapper[5033]: I0226 17:27:38.532702 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.070684 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.073512 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.076203 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.076275 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-j4kn9" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.076467 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.101721 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.144175 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jlpmf"] Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.240373 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e6c3b2-d7d4-4297-99c7-761ec33de582-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.240572 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcphb\" (UniqueName: \"kubernetes.io/projected/b3e6c3b2-d7d4-4297-99c7-761ec33de582-kube-api-access-pcphb\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.240641 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e6c3b2-d7d4-4297-99c7-761ec33de582-config-data\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " pod="openstack/aodh-0" Feb 26 17:27:39 crc 
kubenswrapper[5033]: I0226 17:27:39.240729 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3e6c3b2-d7d4-4297-99c7-761ec33de582-scripts\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.342693 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3e6c3b2-d7d4-4297-99c7-761ec33de582-scripts\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.343162 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e6c3b2-d7d4-4297-99c7-761ec33de582-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.343237 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcphb\" (UniqueName: \"kubernetes.io/projected/b3e6c3b2-d7d4-4297-99c7-761ec33de582-kube-api-access-pcphb\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.343302 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e6c3b2-d7d4-4297-99c7-761ec33de582-config-data\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.348985 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3e6c3b2-d7d4-4297-99c7-761ec33de582-scripts\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " 
pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.350922 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e6c3b2-d7d4-4297-99c7-761ec33de582-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.352460 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e6c3b2-d7d4-4297-99c7-761ec33de582-config-data\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.365542 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcphb\" (UniqueName: \"kubernetes.io/projected/b3e6c3b2-d7d4-4297-99c7-761ec33de582-kube-api-access-pcphb\") pod \"aodh-0\" (UID: \"b3e6c3b2-d7d4-4297-99c7-761ec33de582\") " pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.390760 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.850038 5033 generic.go:334] "Generic (PLEG): container finished" podID="c139f380-4c74-42e0-bb11-482ab3660963" containerID="bd3a4d4884b4af0509ef902a1d200a62335f159e321b6cc7d4729a860a36e5f7" exitCode=0 Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.850321 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlpmf" event={"ID":"c139f380-4c74-42e0-bb11-482ab3660963","Type":"ContainerDied","Data":"bd3a4d4884b4af0509ef902a1d200a62335f159e321b6cc7d4729a860a36e5f7"} Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.850351 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlpmf" event={"ID":"c139f380-4c74-42e0-bb11-482ab3660963","Type":"ContainerStarted","Data":"c66bd6d51272414289815ab9df5a56633ef6335035a217fae2bbbbb5d6bc9bf0"} Feb 26 17:27:39 crc kubenswrapper[5033]: I0226 17:27:39.919518 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 26 17:27:40 crc kubenswrapper[5033]: I0226 17:27:40.862806 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlpmf" event={"ID":"c139f380-4c74-42e0-bb11-482ab3660963","Type":"ContainerStarted","Data":"5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10"} Feb 26 17:27:40 crc kubenswrapper[5033]: I0226 17:27:40.866672 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b3e6c3b2-d7d4-4297-99c7-761ec33de582","Type":"ContainerStarted","Data":"536c912f1be7a6b670d3eeebe0d7b64058d8d8defc677b41526fadfceba13805"} Feb 26 17:27:40 crc kubenswrapper[5033]: I0226 17:27:40.866708 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"b3e6c3b2-d7d4-4297-99c7-761ec33de582","Type":"ContainerStarted","Data":"5e8e8016fb80131b5509063db35f9bc286c5af080c7a8b228ddd3793a2422cfb"} Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.041646 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-g5czw"] Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.058494 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-g5czw"] Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.278462 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.279628 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="ceilometer-central-agent" containerID="cri-o://4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b" gracePeriod=30 Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.279676 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="proxy-httpd" containerID="cri-o://1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b" gracePeriod=30 Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.279695 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="sg-core" containerID="cri-o://b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6" gracePeriod=30 Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.279702 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="ceilometer-notification-agent" 
containerID="cri-o://8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4" gracePeriod=30 Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.647171 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c3fafef-bdef-4e63-afa3-aa2460c1b926" path="/var/lib/kubelet/pods/0c3fafef-bdef-4e63-afa3-aa2460c1b926/volumes" Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.881685 5033 generic.go:334] "Generic (PLEG): container finished" podID="64a632d2-9373-49df-8776-18715a267980" containerID="1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b" exitCode=0 Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.881715 5033 generic.go:334] "Generic (PLEG): container finished" podID="64a632d2-9373-49df-8776-18715a267980" containerID="b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6" exitCode=2 Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.881723 5033 generic.go:334] "Generic (PLEG): container finished" podID="64a632d2-9373-49df-8776-18715a267980" containerID="4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b" exitCode=0 Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.881774 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64a632d2-9373-49df-8776-18715a267980","Type":"ContainerDied","Data":"1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b"} Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.881816 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64a632d2-9373-49df-8776-18715a267980","Type":"ContainerDied","Data":"b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6"} Feb 26 17:27:41 crc kubenswrapper[5033]: I0226 17:27:41.881828 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"64a632d2-9373-49df-8776-18715a267980","Type":"ContainerDied","Data":"4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b"} Feb 26 17:27:43 crc kubenswrapper[5033]: I0226 17:27:43.902192 5033 generic.go:334] "Generic (PLEG): container finished" podID="c139f380-4c74-42e0-bb11-482ab3660963" containerID="5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10" exitCode=0 Feb 26 17:27:43 crc kubenswrapper[5033]: I0226 17:27:43.902305 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlpmf" event={"ID":"c139f380-4c74-42e0-bb11-482ab3660963","Type":"ContainerDied","Data":"5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10"} Feb 26 17:27:43 crc kubenswrapper[5033]: I0226 17:27:43.907661 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b3e6c3b2-d7d4-4297-99c7-761ec33de582","Type":"ContainerStarted","Data":"dd66e3f28057876a1629d4cda654e10e5134237703cbd671dff4ba1bba32d97e"} Feb 26 17:27:44 crc kubenswrapper[5033]: I0226 17:27:44.919450 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b3e6c3b2-d7d4-4297-99c7-761ec33de582","Type":"ContainerStarted","Data":"5a986d1576eaa3b559a12707bb0d23c42eb8d9c092e262be62db14c4aaece436"} Feb 26 17:27:44 crc kubenswrapper[5033]: I0226 17:27:44.921513 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlpmf" event={"ID":"c139f380-4c74-42e0-bb11-482ab3660963","Type":"ContainerStarted","Data":"e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c"} Feb 26 17:27:44 crc kubenswrapper[5033]: I0226 17:27:44.957079 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jlpmf" podStartSLOduration=2.372310547 podStartE2EDuration="6.957059571s" podCreationTimestamp="2026-02-26 17:27:38 +0000 UTC" firstStartedPulling="2026-02-26 17:27:39.854149009 
+0000 UTC m=+6387.066747659" lastFinishedPulling="2026-02-26 17:27:44.438898043 +0000 UTC m=+6391.651496683" observedRunningTime="2026-02-26 17:27:44.94517343 +0000 UTC m=+6392.157772080" watchObservedRunningTime="2026-02-26 17:27:44.957059571 +0000 UTC m=+6392.169658221" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.581048 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.681186 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-combined-ca-bundle\") pod \"64a632d2-9373-49df-8776-18715a267980\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.681326 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-scripts\") pod \"64a632d2-9373-49df-8776-18715a267980\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.681366 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-log-httpd\") pod \"64a632d2-9373-49df-8776-18715a267980\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.681398 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-config-data\") pod \"64a632d2-9373-49df-8776-18715a267980\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.681501 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-sg-core-conf-yaml\") pod \"64a632d2-9373-49df-8776-18715a267980\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.681552 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-run-httpd\") pod \"64a632d2-9373-49df-8776-18715a267980\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.681604 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck5h7\" (UniqueName: \"kubernetes.io/projected/64a632d2-9373-49df-8776-18715a267980-kube-api-access-ck5h7\") pod \"64a632d2-9373-49df-8776-18715a267980\" (UID: \"64a632d2-9373-49df-8776-18715a267980\") " Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.682575 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "64a632d2-9373-49df-8776-18715a267980" (UID: "64a632d2-9373-49df-8776-18715a267980"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.682624 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "64a632d2-9373-49df-8776-18715a267980" (UID: "64a632d2-9373-49df-8776-18715a267980"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.688676 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-scripts" (OuterVolumeSpecName: "scripts") pod "64a632d2-9373-49df-8776-18715a267980" (UID: "64a632d2-9373-49df-8776-18715a267980"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.689515 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a632d2-9373-49df-8776-18715a267980-kube-api-access-ck5h7" (OuterVolumeSpecName: "kube-api-access-ck5h7") pod "64a632d2-9373-49df-8776-18715a267980" (UID: "64a632d2-9373-49df-8776-18715a267980"). InnerVolumeSpecName "kube-api-access-ck5h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.734407 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "64a632d2-9373-49df-8776-18715a267980" (UID: "64a632d2-9373-49df-8776-18715a267980"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.780991 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64a632d2-9373-49df-8776-18715a267980" (UID: "64a632d2-9373-49df-8776-18715a267980"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.785005 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.785038 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.785049 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.785061 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.785073 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64a632d2-9373-49df-8776-18715a267980-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.785086 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck5h7\" (UniqueName: \"kubernetes.io/projected/64a632d2-9373-49df-8776-18715a267980-kube-api-access-ck5h7\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.813858 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-config-data" (OuterVolumeSpecName: "config-data") pod "64a632d2-9373-49df-8776-18715a267980" (UID: "64a632d2-9373-49df-8776-18715a267980"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.892321 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a632d2-9373-49df-8776-18715a267980-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.943876 5033 generic.go:334] "Generic (PLEG): container finished" podID="64a632d2-9373-49df-8776-18715a267980" containerID="8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4" exitCode=0 Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.943934 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64a632d2-9373-49df-8776-18715a267980","Type":"ContainerDied","Data":"8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4"} Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.943966 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64a632d2-9373-49df-8776-18715a267980","Type":"ContainerDied","Data":"91265c3ec885719aff80ae661c35103d1547776166893f5defafda1a067c9fbd"} Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.943987 5033 scope.go:117] "RemoveContainer" containerID="1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.944139 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 17:27:45 crc kubenswrapper[5033]: I0226 17:27:45.993457 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.012702 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.022473 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:27:46 crc kubenswrapper[5033]: E0226 17:27:46.022972 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="ceilometer-central-agent" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.022992 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="ceilometer-central-agent" Feb 26 17:27:46 crc kubenswrapper[5033]: E0226 17:27:46.023008 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="sg-core" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.023015 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="sg-core" Feb 26 17:27:46 crc kubenswrapper[5033]: E0226 17:27:46.023038 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="proxy-httpd" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.023044 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="proxy-httpd" Feb 26 17:27:46 crc kubenswrapper[5033]: E0226 17:27:46.023066 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="ceilometer-notification-agent" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.023072 5033 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="ceilometer-notification-agent" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.023281 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="proxy-httpd" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.023299 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="ceilometer-notification-agent" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.023306 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="ceilometer-central-agent" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.023320 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a632d2-9373-49df-8776-18715a267980" containerName="sg-core" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.025174 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.029000 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.029361 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.031608 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.097130 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-config-data\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.097194 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.097291 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-log-httpd\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.097309 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-scripts\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " 
pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.097787 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cjxn\" (UniqueName: \"kubernetes.io/projected/527e8b6d-0f16-4d7b-ae72-04735e5921ac-kube-api-access-4cjxn\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.097919 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-run-httpd\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.098069 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.200493 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-config-data\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.200833 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.200988 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-log-httpd\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.201114 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-scripts\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.201288 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cjxn\" (UniqueName: \"kubernetes.io/projected/527e8b6d-0f16-4d7b-ae72-04735e5921ac-kube-api-access-4cjxn\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.201435 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-run-httpd\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.201570 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.201890 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-run-httpd\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 
crc kubenswrapper[5033]: I0226 17:27:46.202009 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-log-httpd\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.205458 5033 scope.go:117] "RemoveContainer" containerID="b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.205666 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-scripts\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.206086 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-config-data\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.207642 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.213801 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.224340 5033 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4cjxn\" (UniqueName: \"kubernetes.io/projected/527e8b6d-0f16-4d7b-ae72-04735e5921ac-kube-api-access-4cjxn\") pod \"ceilometer-0\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.228457 5033 scope.go:117] "RemoveContainer" containerID="8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.279441 5033 scope.go:117] "RemoveContainer" containerID="4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.313755 5033 scope.go:117] "RemoveContainer" containerID="1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b" Feb 26 17:27:46 crc kubenswrapper[5033]: E0226 17:27:46.314567 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b\": container with ID starting with 1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b not found: ID does not exist" containerID="1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.314635 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b"} err="failed to get container status \"1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b\": rpc error: code = NotFound desc = could not find container \"1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b\": container with ID starting with 1fe0183993b33e17fae153e73badfe71df25669b188601d82d8f70a251be946b not found: ID does not exist" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.314680 5033 scope.go:117] "RemoveContainer" 
containerID="b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6" Feb 26 17:27:46 crc kubenswrapper[5033]: E0226 17:27:46.315040 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6\": container with ID starting with b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6 not found: ID does not exist" containerID="b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.315094 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6"} err="failed to get container status \"b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6\": rpc error: code = NotFound desc = could not find container \"b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6\": container with ID starting with b90cfbe3bb6491eb4be3aeb78e7a59eda41a6295a29873c3dc3ff675a825f7f6 not found: ID does not exist" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.315128 5033 scope.go:117] "RemoveContainer" containerID="8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4" Feb 26 17:27:46 crc kubenswrapper[5033]: E0226 17:27:46.315709 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4\": container with ID starting with 8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4 not found: ID does not exist" containerID="8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.315737 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4"} err="failed to get container status \"8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4\": rpc error: code = NotFound desc = could not find container \"8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4\": container with ID starting with 8ff2fa92e62ddd45c9f7a47da8daa6455ec646c0c9f0b34f6817db6ad9b93ae4 not found: ID does not exist" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.315756 5033 scope.go:117] "RemoveContainer" containerID="4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b" Feb 26 17:27:46 crc kubenswrapper[5033]: E0226 17:27:46.316120 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b\": container with ID starting with 4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b not found: ID does not exist" containerID="4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.316153 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b"} err="failed to get container status \"4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b\": rpc error: code = NotFound desc = could not find container \"4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b\": container with ID starting with 4ce4c3b1d860fb77d2c07ec85af225263ed7406f9779f68165b873d857e5228b not found: ID does not exist" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.365028 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.950809 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.961487 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b3e6c3b2-d7d4-4297-99c7-761ec33de582","Type":"ContainerStarted","Data":"373f90400b0b4ced9b75641e922f49d2fe51b6204872bc2caf23f6780450f498"} Feb 26 17:27:46 crc kubenswrapper[5033]: I0226 17:27:46.996934 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.626412371 podStartE2EDuration="7.996912814s" podCreationTimestamp="2026-02-26 17:27:39 +0000 UTC" firstStartedPulling="2026-02-26 17:27:39.943948652 +0000 UTC m=+6387.156547302" lastFinishedPulling="2026-02-26 17:27:46.314449055 +0000 UTC m=+6393.527047745" observedRunningTime="2026-02-26 17:27:46.984443681 +0000 UTC m=+6394.197042321" watchObservedRunningTime="2026-02-26 17:27:46.996912814 +0000 UTC m=+6394.209511464" Feb 26 17:27:47 crc kubenswrapper[5033]: I0226 17:27:47.680388 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64a632d2-9373-49df-8776-18715a267980" path="/var/lib/kubelet/pods/64a632d2-9373-49df-8776-18715a267980/volumes" Feb 26 17:27:47 crc kubenswrapper[5033]: I0226 17:27:47.972362 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"527e8b6d-0f16-4d7b-ae72-04735e5921ac","Type":"ContainerStarted","Data":"08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1"} Feb 26 17:27:47 crc kubenswrapper[5033]: I0226 17:27:47.972454 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"527e8b6d-0f16-4d7b-ae72-04735e5921ac","Type":"ContainerStarted","Data":"4fa683582492eeb2c5401133ad458acccb0083c1ccfa126566f396186e1cdbdb"} Feb 26 17:27:48 crc kubenswrapper[5033]: I0226 
17:27:48.535008 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:48 crc kubenswrapper[5033]: I0226 17:27:48.535339 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:48 crc kubenswrapper[5033]: I0226 17:27:48.685096 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jlpmf" Feb 26 17:27:48 crc kubenswrapper[5033]: I0226 17:27:48.984081 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"527e8b6d-0f16-4d7b-ae72-04735e5921ac","Type":"ContainerStarted","Data":"3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23"} Feb 26 17:27:49 crc kubenswrapper[5033]: I0226 17:27:49.996821 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"527e8b6d-0f16-4d7b-ae72-04735e5921ac","Type":"ContainerStarted","Data":"5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace"} Feb 26 17:27:50 crc kubenswrapper[5033]: I0226 17:27:50.189481 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:27:50 crc kubenswrapper[5033]: I0226 17:27:50.189555 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:27:52 crc kubenswrapper[5033]: I0226 17:27:52.025761 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"527e8b6d-0f16-4d7b-ae72-04735e5921ac","Type":"ContainerStarted","Data":"6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6"} Feb 26 17:27:52 crc kubenswrapper[5033]: I0226 17:27:52.027693 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 17:27:52 crc kubenswrapper[5033]: I0226 17:27:52.067784 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.720350256 podStartE2EDuration="7.067758703s" podCreationTimestamp="2026-02-26 17:27:45 +0000 UTC" firstStartedPulling="2026-02-26 17:27:46.955019782 +0000 UTC m=+6394.167618432" lastFinishedPulling="2026-02-26 17:27:51.302428229 +0000 UTC m=+6398.515026879" observedRunningTime="2026-02-26 17:27:52.060230639 +0000 UTC m=+6399.272829309" watchObservedRunningTime="2026-02-26 17:27:52.067758703 +0000 UTC m=+6399.280357363" Feb 26 17:27:52 crc kubenswrapper[5033]: I0226 17:27:52.899123 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-wv4qk"] Feb 26 17:27:52 crc kubenswrapper[5033]: I0226 17:27:52.901825 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-wv4qk" Feb 26 17:27:52 crc kubenswrapper[5033]: I0226 17:27:52.912994 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wv4qk"] Feb 26 17:27:52 crc kubenswrapper[5033]: I0226 17:27:52.947027 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9ccr\" (UniqueName: \"kubernetes.io/projected/7976ad2e-ed8e-4f4d-9832-655824619d62-kube-api-access-w9ccr\") pod \"manila-db-create-wv4qk\" (UID: \"7976ad2e-ed8e-4f4d-9832-655824619d62\") " pod="openstack/manila-db-create-wv4qk" Feb 26 17:27:52 crc kubenswrapper[5033]: I0226 17:27:52.947111 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7976ad2e-ed8e-4f4d-9832-655824619d62-operator-scripts\") pod \"manila-db-create-wv4qk\" (UID: \"7976ad2e-ed8e-4f4d-9832-655824619d62\") " pod="openstack/manila-db-create-wv4qk" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.050058 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9ccr\" (UniqueName: \"kubernetes.io/projected/7976ad2e-ed8e-4f4d-9832-655824619d62-kube-api-access-w9ccr\") pod \"manila-db-create-wv4qk\" (UID: \"7976ad2e-ed8e-4f4d-9832-655824619d62\") " pod="openstack/manila-db-create-wv4qk" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.050125 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7976ad2e-ed8e-4f4d-9832-655824619d62-operator-scripts\") pod \"manila-db-create-wv4qk\" (UID: \"7976ad2e-ed8e-4f4d-9832-655824619d62\") " pod="openstack/manila-db-create-wv4qk" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.051159 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7976ad2e-ed8e-4f4d-9832-655824619d62-operator-scripts\") pod \"manila-db-create-wv4qk\" (UID: \"7976ad2e-ed8e-4f4d-9832-655824619d62\") " pod="openstack/manila-db-create-wv4qk" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.082354 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9ccr\" (UniqueName: \"kubernetes.io/projected/7976ad2e-ed8e-4f4d-9832-655824619d62-kube-api-access-w9ccr\") pod \"manila-db-create-wv4qk\" (UID: \"7976ad2e-ed8e-4f4d-9832-655824619d62\") " pod="openstack/manila-db-create-wv4qk" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.113342 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-6808-account-create-update-69xc5"] Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.115458 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-6808-account-create-update-69xc5" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.117882 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.125887 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-6808-account-create-update-69xc5"] Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.221533 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-wv4qk" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.253752 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-operator-scripts\") pod \"manila-6808-account-create-update-69xc5\" (UID: \"02b7ac8c-763d-4ee5-b3d8-3c07504dd494\") " pod="openstack/manila-6808-account-create-update-69xc5" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.254028 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8b2b\" (UniqueName: \"kubernetes.io/projected/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-kube-api-access-k8b2b\") pod \"manila-6808-account-create-update-69xc5\" (UID: \"02b7ac8c-763d-4ee5-b3d8-3c07504dd494\") " pod="openstack/manila-6808-account-create-update-69xc5" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.355814 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-operator-scripts\") pod \"manila-6808-account-create-update-69xc5\" (UID: \"02b7ac8c-763d-4ee5-b3d8-3c07504dd494\") " pod="openstack/manila-6808-account-create-update-69xc5" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.356210 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8b2b\" (UniqueName: \"kubernetes.io/projected/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-kube-api-access-k8b2b\") pod \"manila-6808-account-create-update-69xc5\" (UID: \"02b7ac8c-763d-4ee5-b3d8-3c07504dd494\") " pod="openstack/manila-6808-account-create-update-69xc5" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.356493 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-operator-scripts\") pod \"manila-6808-account-create-update-69xc5\" (UID: \"02b7ac8c-763d-4ee5-b3d8-3c07504dd494\") " pod="openstack/manila-6808-account-create-update-69xc5" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.380056 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8b2b\" (UniqueName: \"kubernetes.io/projected/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-kube-api-access-k8b2b\") pod \"manila-6808-account-create-update-69xc5\" (UID: \"02b7ac8c-763d-4ee5-b3d8-3c07504dd494\") " pod="openstack/manila-6808-account-create-update-69xc5" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.449101 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-6808-account-create-update-69xc5" Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.765629 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wv4qk"] Feb 26 17:27:53 crc kubenswrapper[5033]: W0226 17:27:53.786782 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7976ad2e_ed8e_4f4d_9832_655824619d62.slice/crio-56c90c7bcf4a0d7e0cadc8a5fb991915d2f597a50f64bd554f95076f577223bb WatchSource:0}: Error finding container 56c90c7bcf4a0d7e0cadc8a5fb991915d2f597a50f64bd554f95076f577223bb: Status 404 returned error can't find the container with id 56c90c7bcf4a0d7e0cadc8a5fb991915d2f597a50f64bd554f95076f577223bb Feb 26 17:27:53 crc kubenswrapper[5033]: I0226 17:27:53.871413 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-6808-account-create-update-69xc5"] Feb 26 17:27:54 crc kubenswrapper[5033]: I0226 17:27:54.054128 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-6808-account-create-update-69xc5" 
event={"ID":"02b7ac8c-763d-4ee5-b3d8-3c07504dd494","Type":"ContainerStarted","Data":"1876c936a0e912dfcc133e7918d64b41ca1b00c2b046134731ff4ab5aa10dd7b"} Feb 26 17:27:54 crc kubenswrapper[5033]: I0226 17:27:54.054182 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-6808-account-create-update-69xc5" event={"ID":"02b7ac8c-763d-4ee5-b3d8-3c07504dd494","Type":"ContainerStarted","Data":"6845ab6a004aa34e11ec2b76d6b703b4e7c9ed8a0c286a45571639c8f2cadd07"} Feb 26 17:27:54 crc kubenswrapper[5033]: I0226 17:27:54.058851 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wv4qk" event={"ID":"7976ad2e-ed8e-4f4d-9832-655824619d62","Type":"ContainerStarted","Data":"605988be20a0afc88fc23ff0d6799751e057fa0b49deb246cee8c771e6623fb9"} Feb 26 17:27:54 crc kubenswrapper[5033]: I0226 17:27:54.058900 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wv4qk" event={"ID":"7976ad2e-ed8e-4f4d-9832-655824619d62","Type":"ContainerStarted","Data":"56c90c7bcf4a0d7e0cadc8a5fb991915d2f597a50f64bd554f95076f577223bb"} Feb 26 17:27:54 crc kubenswrapper[5033]: I0226 17:27:54.078190 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-6808-account-create-update-69xc5" podStartSLOduration=1.078171358 podStartE2EDuration="1.078171358s" podCreationTimestamp="2026-02-26 17:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:27:54.067512138 +0000 UTC m=+6401.280110788" watchObservedRunningTime="2026-02-26 17:27:54.078171358 +0000 UTC m=+6401.290769998" Feb 26 17:27:54 crc kubenswrapper[5033]: I0226 17:27:54.099194 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-wv4qk" podStartSLOduration=2.099173891 podStartE2EDuration="2.099173891s" podCreationTimestamp="2026-02-26 17:27:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:27:54.091845673 +0000 UTC m=+6401.304444333" watchObservedRunningTime="2026-02-26 17:27:54.099173891 +0000 UTC m=+6401.311772541" Feb 26 17:27:55 crc kubenswrapper[5033]: I0226 17:27:55.073685 5033 generic.go:334] "Generic (PLEG): container finished" podID="02b7ac8c-763d-4ee5-b3d8-3c07504dd494" containerID="1876c936a0e912dfcc133e7918d64b41ca1b00c2b046134731ff4ab5aa10dd7b" exitCode=0 Feb 26 17:27:55 crc kubenswrapper[5033]: I0226 17:27:55.073743 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-6808-account-create-update-69xc5" event={"ID":"02b7ac8c-763d-4ee5-b3d8-3c07504dd494","Type":"ContainerDied","Data":"1876c936a0e912dfcc133e7918d64b41ca1b00c2b046134731ff4ab5aa10dd7b"} Feb 26 17:27:55 crc kubenswrapper[5033]: I0226 17:27:55.075297 5033 generic.go:334] "Generic (PLEG): container finished" podID="7976ad2e-ed8e-4f4d-9832-655824619d62" containerID="605988be20a0afc88fc23ff0d6799751e057fa0b49deb246cee8c771e6623fb9" exitCode=0 Feb 26 17:27:55 crc kubenswrapper[5033]: I0226 17:27:55.075339 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wv4qk" event={"ID":"7976ad2e-ed8e-4f4d-9832-655824619d62","Type":"ContainerDied","Data":"605988be20a0afc88fc23ff0d6799751e057fa0b49deb246cee8c771e6623fb9"} Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.582150 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-6808-account-create-update-69xc5" Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.590245 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-wv4qk" Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.625526 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8b2b\" (UniqueName: \"kubernetes.io/projected/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-kube-api-access-k8b2b\") pod \"02b7ac8c-763d-4ee5-b3d8-3c07504dd494\" (UID: \"02b7ac8c-763d-4ee5-b3d8-3c07504dd494\") " Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.625585 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9ccr\" (UniqueName: \"kubernetes.io/projected/7976ad2e-ed8e-4f4d-9832-655824619d62-kube-api-access-w9ccr\") pod \"7976ad2e-ed8e-4f4d-9832-655824619d62\" (UID: \"7976ad2e-ed8e-4f4d-9832-655824619d62\") " Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.625604 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-operator-scripts\") pod \"02b7ac8c-763d-4ee5-b3d8-3c07504dd494\" (UID: \"02b7ac8c-763d-4ee5-b3d8-3c07504dd494\") " Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.625728 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7976ad2e-ed8e-4f4d-9832-655824619d62-operator-scripts\") pod \"7976ad2e-ed8e-4f4d-9832-655824619d62\" (UID: \"7976ad2e-ed8e-4f4d-9832-655824619d62\") " Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.626754 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7976ad2e-ed8e-4f4d-9832-655824619d62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7976ad2e-ed8e-4f4d-9832-655824619d62" (UID: "7976ad2e-ed8e-4f4d-9832-655824619d62"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.627983 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02b7ac8c-763d-4ee5-b3d8-3c07504dd494" (UID: "02b7ac8c-763d-4ee5-b3d8-3c07504dd494"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.632969 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-kube-api-access-k8b2b" (OuterVolumeSpecName: "kube-api-access-k8b2b") pod "02b7ac8c-763d-4ee5-b3d8-3c07504dd494" (UID: "02b7ac8c-763d-4ee5-b3d8-3c07504dd494"). InnerVolumeSpecName "kube-api-access-k8b2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.634723 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7976ad2e-ed8e-4f4d-9832-655824619d62-kube-api-access-w9ccr" (OuterVolumeSpecName: "kube-api-access-w9ccr") pod "7976ad2e-ed8e-4f4d-9832-655824619d62" (UID: "7976ad2e-ed8e-4f4d-9832-655824619d62"). InnerVolumeSpecName "kube-api-access-w9ccr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.728876 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8b2b\" (UniqueName: \"kubernetes.io/projected/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-kube-api-access-k8b2b\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.729343 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9ccr\" (UniqueName: \"kubernetes.io/projected/7976ad2e-ed8e-4f4d-9832-655824619d62-kube-api-access-w9ccr\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.729367 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02b7ac8c-763d-4ee5-b3d8-3c07504dd494-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:56 crc kubenswrapper[5033]: I0226 17:27:56.729383 5033 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7976ad2e-ed8e-4f4d-9832-655824619d62-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:27:57 crc kubenswrapper[5033]: I0226 17:27:57.094396 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-6808-account-create-update-69xc5" event={"ID":"02b7ac8c-763d-4ee5-b3d8-3c07504dd494","Type":"ContainerDied","Data":"6845ab6a004aa34e11ec2b76d6b703b4e7c9ed8a0c286a45571639c8f2cadd07"} Feb 26 17:27:57 crc kubenswrapper[5033]: I0226 17:27:57.094452 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6845ab6a004aa34e11ec2b76d6b703b4e7c9ed8a0c286a45571639c8f2cadd07" Feb 26 17:27:57 crc kubenswrapper[5033]: I0226 17:27:57.094430 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-6808-account-create-update-69xc5" Feb 26 17:27:57 crc kubenswrapper[5033]: I0226 17:27:57.095886 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wv4qk" event={"ID":"7976ad2e-ed8e-4f4d-9832-655824619d62","Type":"ContainerDied","Data":"56c90c7bcf4a0d7e0cadc8a5fb991915d2f597a50f64bd554f95076f577223bb"} Feb 26 17:27:57 crc kubenswrapper[5033]: I0226 17:27:57.095928 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56c90c7bcf4a0d7e0cadc8a5fb991915d2f597a50f64bd554f95076f577223bb" Feb 26 17:27:57 crc kubenswrapper[5033]: I0226 17:27:57.095936 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wv4qk" Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.431770 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-wcsql"] Feb 26 17:27:58 crc kubenswrapper[5033]: E0226 17:27:58.432529 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02b7ac8c-763d-4ee5-b3d8-3c07504dd494" containerName="mariadb-account-create-update" Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.432546 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b7ac8c-763d-4ee5-b3d8-3c07504dd494" containerName="mariadb-account-create-update" Feb 26 17:27:58 crc kubenswrapper[5033]: E0226 17:27:58.432563 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7976ad2e-ed8e-4f4d-9832-655824619d62" containerName="mariadb-database-create" Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.432569 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7976ad2e-ed8e-4f4d-9832-655824619d62" containerName="mariadb-database-create" Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.432790 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7976ad2e-ed8e-4f4d-9832-655824619d62" containerName="mariadb-database-create" Feb 26 
17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.432817 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="02b7ac8c-763d-4ee5-b3d8-3c07504dd494" containerName="mariadb-account-create-update"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.433700 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.436996 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-8crk8"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.437009 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.443581 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-wcsql"]
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.592850 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jlpmf"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.596756 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-job-config-data\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.596830 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-combined-ca-bundle\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.596943 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxl6v\" (UniqueName: \"kubernetes.io/projected/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-kube-api-access-qxl6v\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.597158 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-config-data\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.641634 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jlpmf"]
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.698578 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-config-data\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.698981 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-job-config-data\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.699158 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-combined-ca-bundle\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.699328 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxl6v\" (UniqueName: \"kubernetes.io/projected/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-kube-api-access-qxl6v\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.704385 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-config-data\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.704578 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-combined-ca-bundle\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.705386 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-job-config-data\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.720811 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxl6v\" (UniqueName: \"kubernetes.io/projected/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-kube-api-access-qxl6v\") pod \"manila-db-sync-wcsql\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") " pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:58 crc kubenswrapper[5033]: I0226 17:27:58.765733 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-wcsql"
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.129097 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jlpmf" podUID="c139f380-4c74-42e0-bb11-482ab3660963" containerName="registry-server" containerID="cri-o://e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c" gracePeriod=2
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.406685 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-wcsql"]
Feb 26 17:27:59 crc kubenswrapper[5033]: W0226 17:27:59.407352 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9556b055_cd2e_4dd2_a4c7_fedb56d1f3b9.slice/crio-def4cd70d47318cdc1a961169ab88f8f3a2ce9bf856af1fd2497f60193aad88b WatchSource:0}: Error finding container def4cd70d47318cdc1a961169ab88f8f3a2ce9bf856af1fd2497f60193aad88b: Status 404 returned error can't find the container with id def4cd70d47318cdc1a961169ab88f8f3a2ce9bf856af1fd2497f60193aad88b
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.582105 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jlpmf"
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.721571 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-utilities\") pod \"c139f380-4c74-42e0-bb11-482ab3660963\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") "
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.721780 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-catalog-content\") pod \"c139f380-4c74-42e0-bb11-482ab3660963\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") "
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.721892 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lplj8\" (UniqueName: \"kubernetes.io/projected/c139f380-4c74-42e0-bb11-482ab3660963-kube-api-access-lplj8\") pod \"c139f380-4c74-42e0-bb11-482ab3660963\" (UID: \"c139f380-4c74-42e0-bb11-482ab3660963\") "
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.722637 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-utilities" (OuterVolumeSpecName: "utilities") pod "c139f380-4c74-42e0-bb11-482ab3660963" (UID: "c139f380-4c74-42e0-bb11-482ab3660963"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.723145 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.727931 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c139f380-4c74-42e0-bb11-482ab3660963-kube-api-access-lplj8" (OuterVolumeSpecName: "kube-api-access-lplj8") pod "c139f380-4c74-42e0-bb11-482ab3660963" (UID: "c139f380-4c74-42e0-bb11-482ab3660963"). InnerVolumeSpecName "kube-api-access-lplj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.780891 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c139f380-4c74-42e0-bb11-482ab3660963" (UID: "c139f380-4c74-42e0-bb11-482ab3660963"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.826053 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c139f380-4c74-42e0-bb11-482ab3660963-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 17:27:59 crc kubenswrapper[5033]: I0226 17:27:59.826086 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lplj8\" (UniqueName: \"kubernetes.io/projected/c139f380-4c74-42e0-bb11-482ab3660963-kube-api-access-lplj8\") on node \"crc\" DevicePath \"\""
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.139760 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535448-4nc4j"]
Feb 26 17:28:00 crc kubenswrapper[5033]: E0226 17:28:00.143488 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c139f380-4c74-42e0-bb11-482ab3660963" containerName="registry-server"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.143511 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c139f380-4c74-42e0-bb11-482ab3660963" containerName="registry-server"
Feb 26 17:28:00 crc kubenswrapper[5033]: E0226 17:28:00.143525 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c139f380-4c74-42e0-bb11-482ab3660963" containerName="extract-content"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.143531 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c139f380-4c74-42e0-bb11-482ab3660963" containerName="extract-content"
Feb 26 17:28:00 crc kubenswrapper[5033]: E0226 17:28:00.143541 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c139f380-4c74-42e0-bb11-482ab3660963" containerName="extract-utilities"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.143548 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c139f380-4c74-42e0-bb11-482ab3660963" containerName="extract-utilities"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.143764 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c139f380-4c74-42e0-bb11-482ab3660963" containerName="registry-server"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.145964 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535448-4nc4j"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.149575 5033 generic.go:334] "Generic (PLEG): container finished" podID="c139f380-4c74-42e0-bb11-482ab3660963" containerID="e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c" exitCode=0
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.149655 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlpmf" event={"ID":"c139f380-4c74-42e0-bb11-482ab3660963","Type":"ContainerDied","Data":"e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c"}
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.149702 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jlpmf" event={"ID":"c139f380-4c74-42e0-bb11-482ab3660963","Type":"ContainerDied","Data":"c66bd6d51272414289815ab9df5a56633ef6335035a217fae2bbbbb5d6bc9bf0"}
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.149722 5033 scope.go:117] "RemoveContainer" containerID="e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.149643 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jlpmf"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.149960 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.152018 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wcsql" event={"ID":"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9","Type":"ContainerStarted","Data":"def4cd70d47318cdc1a961169ab88f8f3a2ce9bf856af1fd2497f60193aad88b"}
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.154038 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.157579 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.157714 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535448-4nc4j"]
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.209282 5033 scope.go:117] "RemoveContainer" containerID="5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.216636 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jlpmf"]
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.237285 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twqjj\" (UniqueName: \"kubernetes.io/projected/82997903-13a2-41d6-91f5-3719bac90196-kube-api-access-twqjj\") pod \"auto-csr-approver-29535448-4nc4j\" (UID: \"82997903-13a2-41d6-91f5-3719bac90196\") " pod="openshift-infra/auto-csr-approver-29535448-4nc4j"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.242509 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jlpmf"]
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.245500 5033 scope.go:117] "RemoveContainer" containerID="bd3a4d4884b4af0509ef902a1d200a62335f159e321b6cc7d4729a860a36e5f7"
Feb 26 17:28:00 crc kubenswrapper[5033]: E0226 17:28:00.268756 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc139f380_4c74_42e0_bb11_482ab3660963.slice/crio-c66bd6d51272414289815ab9df5a56633ef6335035a217fae2bbbbb5d6bc9bf0\": RecentStats: unable to find data in memory cache]"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.291571 5033 scope.go:117] "RemoveContainer" containerID="e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c"
Feb 26 17:28:00 crc kubenswrapper[5033]: E0226 17:28:00.295614 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c\": container with ID starting with e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c not found: ID does not exist" containerID="e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.295667 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c"} err="failed to get container status \"e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c\": rpc error: code = NotFound desc = could not find container \"e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c\": container with ID starting with e69f1539dbc2a8f1c369e74bde80c1d1a5abc8f4abfb629bfaf4507a2b5b766c not found: ID does not exist"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.295693 5033 scope.go:117] "RemoveContainer" containerID="5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10"
Feb 26 17:28:00 crc kubenswrapper[5033]: E0226 17:28:00.297405 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10\": container with ID starting with 5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10 not found: ID does not exist" containerID="5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.297457 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10"} err="failed to get container status \"5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10\": rpc error: code = NotFound desc = could not find container \"5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10\": container with ID starting with 5d1c26852e885b65afbe02c94d9f889c9d0df45ef82f1719295d815b26ae9c10 not found: ID does not exist"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.297488 5033 scope.go:117] "RemoveContainer" containerID="bd3a4d4884b4af0509ef902a1d200a62335f159e321b6cc7d4729a860a36e5f7"
Feb 26 17:28:00 crc kubenswrapper[5033]: E0226 17:28:00.297918 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd3a4d4884b4af0509ef902a1d200a62335f159e321b6cc7d4729a860a36e5f7\": container with ID starting with bd3a4d4884b4af0509ef902a1d200a62335f159e321b6cc7d4729a860a36e5f7 not found: ID does not exist" containerID="bd3a4d4884b4af0509ef902a1d200a62335f159e321b6cc7d4729a860a36e5f7"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.297989 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3a4d4884b4af0509ef902a1d200a62335f159e321b6cc7d4729a860a36e5f7"} err="failed to get container status \"bd3a4d4884b4af0509ef902a1d200a62335f159e321b6cc7d4729a860a36e5f7\": rpc error: code = NotFound desc = could not find container \"bd3a4d4884b4af0509ef902a1d200a62335f159e321b6cc7d4729a860a36e5f7\": container with ID starting with bd3a4d4884b4af0509ef902a1d200a62335f159e321b6cc7d4729a860a36e5f7 not found: ID does not exist"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.339527 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twqjj\" (UniqueName: \"kubernetes.io/projected/82997903-13a2-41d6-91f5-3719bac90196-kube-api-access-twqjj\") pod \"auto-csr-approver-29535448-4nc4j\" (UID: \"82997903-13a2-41d6-91f5-3719bac90196\") " pod="openshift-infra/auto-csr-approver-29535448-4nc4j"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.359370 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twqjj\" (UniqueName: \"kubernetes.io/projected/82997903-13a2-41d6-91f5-3719bac90196-kube-api-access-twqjj\") pod \"auto-csr-approver-29535448-4nc4j\" (UID: \"82997903-13a2-41d6-91f5-3719bac90196\") " pod="openshift-infra/auto-csr-approver-29535448-4nc4j"
Feb 26 17:28:00 crc kubenswrapper[5033]: I0226 17:28:00.475587 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535448-4nc4j"
Feb 26 17:28:01 crc kubenswrapper[5033]: I0226 17:28:01.647662 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c139f380-4c74-42e0-bb11-482ab3660963" path="/var/lib/kubelet/pods/c139f380-4c74-42e0-bb11-482ab3660963/volumes"
Feb 26 17:28:01 crc kubenswrapper[5033]: I0226 17:28:01.658183 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535448-4nc4j"]
Feb 26 17:28:03 crc kubenswrapper[5033]: I0226 17:28:03.207201 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535448-4nc4j" event={"ID":"82997903-13a2-41d6-91f5-3719bac90196","Type":"ContainerStarted","Data":"afcf9dfb75049f5802aa9327df786c34bfee5becbd19cdf3f8c3863c7607d64e"}
Feb 26 17:28:04 crc kubenswrapper[5033]: I0226 17:28:04.219051 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wcsql" event={"ID":"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9","Type":"ContainerStarted","Data":"9163fdac3e58ff8672675378c62b3c326c527eaa5ae675bfa3beea7908506825"}
Feb 26 17:28:04 crc kubenswrapper[5033]: I0226 17:28:04.240064 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-wcsql" podStartSLOduration=2.132459298 podStartE2EDuration="6.240048708s" podCreationTimestamp="2026-02-26 17:27:58 +0000 UTC" firstStartedPulling="2026-02-26 17:27:59.409467648 +0000 UTC m=+6406.622066308" lastFinishedPulling="2026-02-26 17:28:03.517057068 +0000 UTC m=+6410.729655718" observedRunningTime="2026-02-26 17:28:04.237057525 +0000 UTC m=+6411.449656195" watchObservedRunningTime="2026-02-26 17:28:04.240048708 +0000 UTC m=+6411.452647358"
Feb 26 17:28:05 crc kubenswrapper[5033]: I0226 17:28:05.230147 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535448-4nc4j" event={"ID":"82997903-13a2-41d6-91f5-3719bac90196","Type":"ContainerStarted","Data":"e6d3d2638c9162311d6e2812fde55c34c3acd6e97c12ca0d8c686dacccc427e4"}
Feb 26 17:28:05 crc kubenswrapper[5033]: I0226 17:28:05.248976 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535448-4nc4j" podStartSLOduration=4.155440451 podStartE2EDuration="5.248959251s" podCreationTimestamp="2026-02-26 17:28:00 +0000 UTC" firstStartedPulling="2026-02-26 17:28:03.442034776 +0000 UTC m=+6410.654633426" lastFinishedPulling="2026-02-26 17:28:04.535553536 +0000 UTC m=+6411.748152226" observedRunningTime="2026-02-26 17:28:05.247138347 +0000 UTC m=+6412.459737007" watchObservedRunningTime="2026-02-26 17:28:05.248959251 +0000 UTC m=+6412.461557891"
Feb 26 17:28:06 crc kubenswrapper[5033]: I0226 17:28:06.242067 5033 generic.go:334] "Generic (PLEG): container finished" podID="82997903-13a2-41d6-91f5-3719bac90196" containerID="e6d3d2638c9162311d6e2812fde55c34c3acd6e97c12ca0d8c686dacccc427e4" exitCode=0
Feb 26 17:28:06 crc kubenswrapper[5033]: I0226 17:28:06.242224 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535448-4nc4j" event={"ID":"82997903-13a2-41d6-91f5-3719bac90196","Type":"ContainerDied","Data":"e6d3d2638c9162311d6e2812fde55c34c3acd6e97c12ca0d8c686dacccc427e4"}
Feb 26 17:28:06 crc kubenswrapper[5033]: I0226 17:28:06.246790 5033 generic.go:334] "Generic (PLEG): container finished" podID="9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9" containerID="9163fdac3e58ff8672675378c62b3c326c527eaa5ae675bfa3beea7908506825" exitCode=0
Feb 26 17:28:06 crc kubenswrapper[5033]: I0226 17:28:06.246851 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wcsql" event={"ID":"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9","Type":"ContainerDied","Data":"9163fdac3e58ff8672675378c62b3c326c527eaa5ae675bfa3beea7908506825"}
Feb 26 17:28:07 crc kubenswrapper[5033]: I0226 17:28:07.840994 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535448-4nc4j"
Feb 26 17:28:07 crc kubenswrapper[5033]: I0226 17:28:07.849067 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-wcsql"
Feb 26 17:28:07 crc kubenswrapper[5033]: I0226 17:28:07.910244 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twqjj\" (UniqueName: \"kubernetes.io/projected/82997903-13a2-41d6-91f5-3719bac90196-kube-api-access-twqjj\") pod \"82997903-13a2-41d6-91f5-3719bac90196\" (UID: \"82997903-13a2-41d6-91f5-3719bac90196\") "
Feb 26 17:28:07 crc kubenswrapper[5033]: I0226 17:28:07.919201 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82997903-13a2-41d6-91f5-3719bac90196-kube-api-access-twqjj" (OuterVolumeSpecName: "kube-api-access-twqjj") pod "82997903-13a2-41d6-91f5-3719bac90196" (UID: "82997903-13a2-41d6-91f5-3719bac90196"). InnerVolumeSpecName "kube-api-access-twqjj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.012158 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxl6v\" (UniqueName: \"kubernetes.io/projected/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-kube-api-access-qxl6v\") pod \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") "
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.012479 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-job-config-data\") pod \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") "
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.012547 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-combined-ca-bundle\") pod \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") "
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.012748 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-config-data\") pod \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\" (UID: \"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9\") "
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.013539 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twqjj\" (UniqueName: \"kubernetes.io/projected/82997903-13a2-41d6-91f5-3719bac90196-kube-api-access-twqjj\") on node \"crc\" DevicePath \"\""
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.015742 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-kube-api-access-qxl6v" (OuterVolumeSpecName: "kube-api-access-qxl6v") pod "9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9" (UID: "9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9"). InnerVolumeSpecName "kube-api-access-qxl6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.015888 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9" (UID: "9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.019455 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-config-data" (OuterVolumeSpecName: "config-data") pod "9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9" (UID: "9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.051491 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9" (UID: "9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.114889 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxl6v\" (UniqueName: \"kubernetes.io/projected/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-kube-api-access-qxl6v\") on node \"crc\" DevicePath \"\""
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.114921 5033 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-job-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.114930 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.114938 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.276142 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535448-4nc4j" event={"ID":"82997903-13a2-41d6-91f5-3719bac90196","Type":"ContainerDied","Data":"afcf9dfb75049f5802aa9327df786c34bfee5becbd19cdf3f8c3863c7607d64e"}
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.276690 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afcf9dfb75049f5802aa9327df786c34bfee5becbd19cdf3f8c3863c7607d64e"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.276231 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535448-4nc4j"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.290031 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wcsql" event={"ID":"9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9","Type":"ContainerDied","Data":"def4cd70d47318cdc1a961169ab88f8f3a2ce9bf856af1fd2497f60193aad88b"}
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.290097 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="def4cd70d47318cdc1a961169ab88f8f3a2ce9bf856af1fd2497f60193aad88b"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.290323 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-wcsql"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.349340 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535442-2bk4l"]
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.360964 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535442-2bk4l"]
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.642309 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Feb 26 17:28:08 crc kubenswrapper[5033]: E0226 17:28:08.643023 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9" containerName="manila-db-sync"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.643042 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9" containerName="manila-db-sync"
Feb 26 17:28:08 crc kubenswrapper[5033]: E0226 17:28:08.643063 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82997903-13a2-41d6-91f5-3719bac90196" containerName="oc"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.643070 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="82997903-13a2-41d6-91f5-3719bac90196" containerName="oc"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.643319 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9" containerName="manila-db-sync"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.643354 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="82997903-13a2-41d6-91f5-3719bac90196" containerName="oc"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.644442 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.648725 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.648913 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.649074 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.649170 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-8crk8"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.680399 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.682134 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.684398 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.713095 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.722917 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.729215 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.729294 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zfjx\" (UniqueName: \"kubernetes.io/projected/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-kube-api-access-5zfjx\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.729346 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-config-data\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.729417 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-scripts\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.729460 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.729522 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.756699 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d4f997557-gf2sl"]
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.783826 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d4f997557-gf2sl"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.799923 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d4f997557-gf2sl"]
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.839239 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.842484 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-config-data\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.840621 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.848460 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-scripts\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0"
Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.848668 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.848757 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.848975 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zfjx\" (UniqueName: \"kubernetes.io/projected/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-kube-api-access-5zfjx\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.849124 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-config-data\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.849220 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nrr5\" (UniqueName: \"kubernetes.io/projected/96f8bb5c-3820-4d9f-856d-f777d2e44129-kube-api-access-4nrr5\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.849446 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/96f8bb5c-3820-4d9f-856d-f777d2e44129-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.849479 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.849539 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-scripts\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.849662 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/96f8bb5c-3820-4d9f-856d-f777d2e44129-ceph\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.849695 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.849914 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/96f8bb5c-3820-4d9f-856d-f777d2e44129-var-lib-manila\") pod 
\"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.854494 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.855189 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.863787 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-config-data\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.870551 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.872838 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.873190 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-scripts\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.874652 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.876873 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zfjx\" (UniqueName: \"kubernetes.io/projected/8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd-kube-api-access-5zfjx\") pod \"manila-scheduler-0\" (UID: \"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd\") " pod="openstack/manila-scheduler-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.901894 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951484 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/96f8bb5c-3820-4d9f-856d-f777d2e44129-ceph\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951549 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/96f8bb5c-3820-4d9f-856d-f777d2e44129-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951592 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-config-data\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951653 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090a19bc-766f-43fb-9220-dfefb248ebf4-logs\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951683 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-nb\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951722 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-config-data\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951757 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-sb\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951778 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-scripts\") pod \"manila-share-share1-0\" (UID: 
\"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951836 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951890 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951953 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nrr5\" (UniqueName: \"kubernetes.io/projected/96f8bb5c-3820-4d9f-856d-f777d2e44129-kube-api-access-4nrr5\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.951983 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-config-data-custom\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.952837 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtq6v\" (UniqueName: \"kubernetes.io/projected/e5728a38-0f9c-4637-a636-eb6b3e90387b-kube-api-access-qtq6v\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " 
pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.952876 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-scripts\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.952901 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9dhj\" (UniqueName: \"kubernetes.io/projected/090a19bc-766f-43fb-9220-dfefb248ebf4-kube-api-access-j9dhj\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.952924 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96f8bb5c-3820-4d9f-856d-f777d2e44129-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.952946 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.952968 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-dns-svc\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 
17:28:08.953005 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-config\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.953034 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/090a19bc-766f-43fb-9220-dfefb248ebf4-etc-machine-id\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.953378 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/96f8bb5c-3820-4d9f-856d-f777d2e44129-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.953459 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96f8bb5c-3820-4d9f-856d-f777d2e44129-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.959137 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.959858 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/96f8bb5c-3820-4d9f-856d-f777d2e44129-ceph\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.960957 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-config-data\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.961828 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.968153 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96f8bb5c-3820-4d9f-856d-f777d2e44129-scripts\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:08 crc kubenswrapper[5033]: I0226 17:28:08.969185 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nrr5\" (UniqueName: \"kubernetes.io/projected/96f8bb5c-3820-4d9f-856d-f777d2e44129-kube-api-access-4nrr5\") pod \"manila-share-share1-0\" (UID: \"96f8bb5c-3820-4d9f-856d-f777d2e44129\") " pod="openstack/manila-share-share1-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.027465 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.055707 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.055772 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-config-data-custom\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.055829 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtq6v\" (UniqueName: \"kubernetes.io/projected/e5728a38-0f9c-4637-a636-eb6b3e90387b-kube-api-access-qtq6v\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.055853 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-scripts\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.055881 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9dhj\" (UniqueName: \"kubernetes.io/projected/090a19bc-766f-43fb-9220-dfefb248ebf4-kube-api-access-j9dhj\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.055906 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-dns-svc\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.055942 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-config\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.055976 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/090a19bc-766f-43fb-9220-dfefb248ebf4-etc-machine-id\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.056048 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-config-data\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.056071 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090a19bc-766f-43fb-9220-dfefb248ebf4-logs\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.056099 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-nb\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: 
\"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.056148 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-sb\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.056447 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/090a19bc-766f-43fb-9220-dfefb248ebf4-etc-machine-id\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.057005 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090a19bc-766f-43fb-9220-dfefb248ebf4-logs\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.057006 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.060123 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-config-data-custom\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.061196 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-config-data\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.064114 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.066308 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-dns-svc\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.066320 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-nb\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.067274 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/090a19bc-766f-43fb-9220-dfefb248ebf4-scripts\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.066907 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-config\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.067675 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-sb\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.075426 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtq6v\" (UniqueName: \"kubernetes.io/projected/e5728a38-0f9c-4637-a636-eb6b3e90387b-kube-api-access-qtq6v\") pod \"dnsmasq-dns-d4f997557-gf2sl\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.077402 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9dhj\" (UniqueName: \"kubernetes.io/projected/090a19bc-766f-43fb-9220-dfefb248ebf4-kube-api-access-j9dhj\") pod \"manila-api-0\" (UID: \"090a19bc-766f-43fb-9220-dfefb248ebf4\") " pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.117393 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.360489 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.597455 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.648049 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aefb3e10-7132-4716-abc2-e0f3eeea1d72" path="/var/lib/kubelet/pods/aefb3e10-7132-4716-abc2-e0f3eeea1d72/volumes" Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.703175 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.812457 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d4f997557-gf2sl"] Feb 26 17:28:09 crc kubenswrapper[5033]: W0226 17:28:09.815181 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5728a38_0f9c_4637_a636_eb6b3e90387b.slice/crio-46eb80248786b5d70806e663de622a9baed084a9809061d1370d74f2732c954d WatchSource:0}: Error finding container 46eb80248786b5d70806e663de622a9baed084a9809061d1370d74f2732c954d: Status 404 returned error can't find the container with id 46eb80248786b5d70806e663de622a9baed084a9809061d1370d74f2732c954d Feb 26 17:28:09 crc kubenswrapper[5033]: I0226 17:28:09.997944 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 26 17:28:10 crc kubenswrapper[5033]: W0226 17:28:10.024324 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod090a19bc_766f_43fb_9220_dfefb248ebf4.slice/crio-9278c0604e4bfb190776415cba4fcdbe65911ad7217c74aeb314960b394ab793 WatchSource:0}: Error finding container 9278c0604e4bfb190776415cba4fcdbe65911ad7217c74aeb314960b394ab793: Status 404 returned error can't find the container with id 
9278c0604e4bfb190776415cba4fcdbe65911ad7217c74aeb314960b394ab793 Feb 26 17:28:10 crc kubenswrapper[5033]: I0226 17:28:10.321049 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd","Type":"ContainerStarted","Data":"2d9e3b4e3ff3f79a751031bff4596501ccd06879747e837131da1b6159eb5dee"} Feb 26 17:28:10 crc kubenswrapper[5033]: I0226 17:28:10.322871 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"96f8bb5c-3820-4d9f-856d-f777d2e44129","Type":"ContainerStarted","Data":"e6a4866453d247d84de354d7e1fdd452399bd59f580009be96804402c63a94e3"} Feb 26 17:28:10 crc kubenswrapper[5033]: I0226 17:28:10.324216 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"090a19bc-766f-43fb-9220-dfefb248ebf4","Type":"ContainerStarted","Data":"9278c0604e4bfb190776415cba4fcdbe65911ad7217c74aeb314960b394ab793"} Feb 26 17:28:10 crc kubenswrapper[5033]: I0226 17:28:10.325452 5033 generic.go:334] "Generic (PLEG): container finished" podID="e5728a38-0f9c-4637-a636-eb6b3e90387b" containerID="2c6ea747c0d6daadf7983b1a5b1c0145784f25d82373dfe11136cb257a7ce4d5" exitCode=0 Feb 26 17:28:10 crc kubenswrapper[5033]: I0226 17:28:10.325481 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" event={"ID":"e5728a38-0f9c-4637-a636-eb6b3e90387b","Type":"ContainerDied","Data":"2c6ea747c0d6daadf7983b1a5b1c0145784f25d82373dfe11136cb257a7ce4d5"} Feb 26 17:28:10 crc kubenswrapper[5033]: I0226 17:28:10.325496 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" event={"ID":"e5728a38-0f9c-4637-a636-eb6b3e90387b","Type":"ContainerStarted","Data":"46eb80248786b5d70806e663de622a9baed084a9809061d1370d74f2732c954d"} Feb 26 17:28:11 crc kubenswrapper[5033]: I0226 17:28:11.353033 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-api-0" event={"ID":"090a19bc-766f-43fb-9220-dfefb248ebf4","Type":"ContainerStarted","Data":"708261636cf2882331c82a354d51fac4e7afec6f1dcd234111ecfeb4b7267811"} Feb 26 17:28:11 crc kubenswrapper[5033]: I0226 17:28:11.353337 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"090a19bc-766f-43fb-9220-dfefb248ebf4","Type":"ContainerStarted","Data":"da8bd9cf71a5bcfe46d0a387a2806b4676b48b8b1244677142dfd19eedb78a2c"} Feb 26 17:28:11 crc kubenswrapper[5033]: I0226 17:28:11.353401 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Feb 26 17:28:11 crc kubenswrapper[5033]: I0226 17:28:11.359862 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" event={"ID":"e5728a38-0f9c-4637-a636-eb6b3e90387b","Type":"ContainerStarted","Data":"305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae"} Feb 26 17:28:11 crc kubenswrapper[5033]: I0226 17:28:11.360056 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:11 crc kubenswrapper[5033]: I0226 17:28:11.387769 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.387749574 podStartE2EDuration="3.387749574s" podCreationTimestamp="2026-02-26 17:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:28:11.378016327 +0000 UTC m=+6418.590614977" watchObservedRunningTime="2026-02-26 17:28:11.387749574 +0000 UTC m=+6418.600348224" Feb 26 17:28:11 crc kubenswrapper[5033]: I0226 17:28:11.403795 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" podStartSLOduration=3.403776865 podStartE2EDuration="3.403776865s" podCreationTimestamp="2026-02-26 17:28:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:28:11.393227028 +0000 UTC m=+6418.605825678" watchObservedRunningTime="2026-02-26 17:28:11.403776865 +0000 UTC m=+6418.616375515" Feb 26 17:28:12 crc kubenswrapper[5033]: I0226 17:28:12.393722 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd","Type":"ContainerStarted","Data":"9a50fc5fd4bc3698792359e3349ce0ca7a688273c5a30160c2adae94d345fde8"} Feb 26 17:28:12 crc kubenswrapper[5033]: I0226 17:28:12.394447 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd","Type":"ContainerStarted","Data":"3df4669895affe7800fc66c8258f93c41f5cc9db4d573bb545de6ed124c66c43"} Feb 26 17:28:12 crc kubenswrapper[5033]: I0226 17:28:12.417605 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.114603312 podStartE2EDuration="4.417589608s" podCreationTimestamp="2026-02-26 17:28:08 +0000 UTC" firstStartedPulling="2026-02-26 17:28:09.608964216 +0000 UTC m=+6416.821562876" lastFinishedPulling="2026-02-26 17:28:10.911950522 +0000 UTC m=+6418.124549172" observedRunningTime="2026-02-26 17:28:12.413681453 +0000 UTC m=+6419.626280113" watchObservedRunningTime="2026-02-26 17:28:12.417589608 +0000 UTC m=+6419.630188248" Feb 26 17:28:16 crc kubenswrapper[5033]: I0226 17:28:16.369421 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 26 17:28:17 crc kubenswrapper[5033]: I0226 17:28:17.469004 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"96f8bb5c-3820-4d9f-856d-f777d2e44129","Type":"ContainerStarted","Data":"c931d02da40349bc2d12abb90b84776167fddf73c29a2099b9940f5ef6b9fdab"} Feb 26 17:28:17 crc 
kubenswrapper[5033]: I0226 17:28:17.471510 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"96f8bb5c-3820-4d9f-856d-f777d2e44129","Type":"ContainerStarted","Data":"c2e97c5fdf2f662c81d6dc2321d05b25b2a187334cce6d8b7aab8a51f29970d9"} Feb 26 17:28:17 crc kubenswrapper[5033]: I0226 17:28:17.497170 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.03962549 podStartE2EDuration="9.497146609s" podCreationTimestamp="2026-02-26 17:28:08 +0000 UTC" firstStartedPulling="2026-02-26 17:28:09.708643871 +0000 UTC m=+6416.921242521" lastFinishedPulling="2026-02-26 17:28:16.16616499 +0000 UTC m=+6423.378763640" observedRunningTime="2026-02-26 17:28:17.491809959 +0000 UTC m=+6424.704408619" watchObservedRunningTime="2026-02-26 17:28:17.497146609 +0000 UTC m=+6424.709745259" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.028102 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.057252 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.120042 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.206279 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d89f5b57-5xbrd"] Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.206513 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" podUID="a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" containerName="dnsmasq-dns" containerID="cri-o://076e0b99b0ccd5a7ca168e4d519c3b2ca48c5c75482054f1d6f8dfb9f47a6ba4" gracePeriod=10 Feb 26 17:28:19 crc kubenswrapper[5033]: 
I0226 17:28:19.508709 5033 generic.go:334] "Generic (PLEG): container finished" podID="a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" containerID="076e0b99b0ccd5a7ca168e4d519c3b2ca48c5c75482054f1d6f8dfb9f47a6ba4" exitCode=0 Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.508876 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" event={"ID":"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a","Type":"ContainerDied","Data":"076e0b99b0ccd5a7ca168e4d519c3b2ca48c5c75482054f1d6f8dfb9f47a6ba4"} Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.774677 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.858780 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-dns-svc\") pod \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.858822 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-config\") pod \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.858893 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-nb\") pod \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.859104 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-sb\") pod \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.859158 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9l24\" (UniqueName: \"kubernetes.io/projected/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-kube-api-access-d9l24\") pod \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\" (UID: \"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a\") " Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.879589 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-kube-api-access-d9l24" (OuterVolumeSpecName: "kube-api-access-d9l24") pod "a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" (UID: "a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a"). InnerVolumeSpecName "kube-api-access-d9l24". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.921636 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" (UID: "a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.925667 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" (UID: "a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.931691 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-config" (OuterVolumeSpecName: "config") pod "a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" (UID: "a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.936893 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" (UID: "a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.961538 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.961571 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9l24\" (UniqueName: \"kubernetes.io/projected/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-kube-api-access-d9l24\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.961581 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:19 crc kubenswrapper[5033]: I0226 17:28:19.961590 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:19 crc 
kubenswrapper[5033]: I0226 17:28:19.961599 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.189196 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.189555 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.189606 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.190479 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.190538 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" 
containerID="cri-o://14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" gracePeriod=600 Feb 26 17:28:20 crc kubenswrapper[5033]: E0226 17:28:20.310524 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.527939 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" event={"ID":"a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a","Type":"ContainerDied","Data":"cb0de5de7a95a36db625e271f9a9f8028f816ebe541e35d3c9a53d635e007ae4"} Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.527985 5033 scope.go:117] "RemoveContainer" containerID="076e0b99b0ccd5a7ca168e4d519c3b2ca48c5c75482054f1d6f8dfb9f47a6ba4" Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.528088 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d89f5b57-5xbrd" Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.536583 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" exitCode=0 Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.536633 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb"} Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.537554 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:28:20 crc kubenswrapper[5033]: E0226 17:28:20.537874 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.584969 5033 scope.go:117] "RemoveContainer" containerID="a775950506db4d562be7ca64cb610991e2cf7b6c88149398e8f1b575f8d7d96d" Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.603791 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d89f5b57-5xbrd"] Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.619970 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9d89f5b57-5xbrd"] Feb 26 17:28:20 crc kubenswrapper[5033]: I0226 17:28:20.663809 5033 scope.go:117] "RemoveContainer" 
containerID="33064e87c19129e8c72344901e1129ded177728fee58532d134814d33183f904" Feb 26 17:28:21 crc kubenswrapper[5033]: I0226 17:28:21.254642 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:28:21 crc kubenswrapper[5033]: I0226 17:28:21.255188 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="ceilometer-central-agent" containerID="cri-o://08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1" gracePeriod=30 Feb 26 17:28:21 crc kubenswrapper[5033]: I0226 17:28:21.255316 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="ceilometer-notification-agent" containerID="cri-o://3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23" gracePeriod=30 Feb 26 17:28:21 crc kubenswrapper[5033]: I0226 17:28:21.255367 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="proxy-httpd" containerID="cri-o://6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6" gracePeriod=30 Feb 26 17:28:21 crc kubenswrapper[5033]: I0226 17:28:21.255430 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="sg-core" containerID="cri-o://5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace" gracePeriod=30 Feb 26 17:28:21 crc kubenswrapper[5033]: I0226 17:28:21.583384 5033 generic.go:334] "Generic (PLEG): container finished" podID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerID="6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6" exitCode=0 Feb 26 17:28:21 crc kubenswrapper[5033]: I0226 17:28:21.583428 5033 generic.go:334] "Generic (PLEG): container 
finished" podID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerID="5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace" exitCode=2 Feb 26 17:28:21 crc kubenswrapper[5033]: I0226 17:28:21.583448 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"527e8b6d-0f16-4d7b-ae72-04735e5921ac","Type":"ContainerDied","Data":"6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6"} Feb 26 17:28:21 crc kubenswrapper[5033]: I0226 17:28:21.583500 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"527e8b6d-0f16-4d7b-ae72-04735e5921ac","Type":"ContainerDied","Data":"5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace"} Feb 26 17:28:21 crc kubenswrapper[5033]: I0226 17:28:21.642013 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" path="/var/lib/kubelet/pods/a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a/volumes" Feb 26 17:28:22 crc kubenswrapper[5033]: I0226 17:28:22.600836 5033 generic.go:334] "Generic (PLEG): container finished" podID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerID="08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1" exitCode=0 Feb 26 17:28:22 crc kubenswrapper[5033]: I0226 17:28:22.600890 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"527e8b6d-0f16-4d7b-ae72-04735e5921ac","Type":"ContainerDied","Data":"08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1"} Feb 26 17:28:23 crc kubenswrapper[5033]: I0226 17:28:23.051673 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2cdc-account-create-update-9zcpl"] Feb 26 17:28:23 crc kubenswrapper[5033]: I0226 17:28:23.065429 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-f6ww6"] Feb 26 17:28:23 crc kubenswrapper[5033]: I0226 17:28:23.076684 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-2cdc-account-create-update-9zcpl"] Feb 26 17:28:23 crc kubenswrapper[5033]: I0226 17:28:23.086429 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-f6ww6"] Feb 26 17:28:23 crc kubenswrapper[5033]: I0226 17:28:23.646116 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e89e07-a139-4821-b77c-f39911e1fb5e" path="/var/lib/kubelet/pods/84e89e07-a139-4821-b77c-f39911e1fb5e/volumes" Feb 26 17:28:23 crc kubenswrapper[5033]: I0226 17:28:23.648729 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd0439f-5d88-4e1f-bf1b-b0624a056a16" path="/var/lib/kubelet/pods/dbd0439f-5d88-4e1f-bf1b-b0624a056a16/volumes" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.254670 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.373869 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-log-httpd\") pod \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.373977 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-config-data\") pod \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.374023 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-sg-core-conf-yaml\") pod \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.374074 
5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-scripts\") pod \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.374184 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-run-httpd\") pod \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.374295 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-combined-ca-bundle\") pod \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.374338 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cjxn\" (UniqueName: \"kubernetes.io/projected/527e8b6d-0f16-4d7b-ae72-04735e5921ac-kube-api-access-4cjxn\") pod \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\" (UID: \"527e8b6d-0f16-4d7b-ae72-04735e5921ac\") " Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.375800 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "527e8b6d-0f16-4d7b-ae72-04735e5921ac" (UID: "527e8b6d-0f16-4d7b-ae72-04735e5921ac"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.376077 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "527e8b6d-0f16-4d7b-ae72-04735e5921ac" (UID: "527e8b6d-0f16-4d7b-ae72-04735e5921ac"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.382059 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-scripts" (OuterVolumeSpecName: "scripts") pod "527e8b6d-0f16-4d7b-ae72-04735e5921ac" (UID: "527e8b6d-0f16-4d7b-ae72-04735e5921ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.400499 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527e8b6d-0f16-4d7b-ae72-04735e5921ac-kube-api-access-4cjxn" (OuterVolumeSpecName: "kube-api-access-4cjxn") pod "527e8b6d-0f16-4d7b-ae72-04735e5921ac" (UID: "527e8b6d-0f16-4d7b-ae72-04735e5921ac"). InnerVolumeSpecName "kube-api-access-4cjxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.407454 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "527e8b6d-0f16-4d7b-ae72-04735e5921ac" (UID: "527e8b6d-0f16-4d7b-ae72-04735e5921ac"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.476388 5033 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.476415 5033 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.476424 5033 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.476433 5033 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/527e8b6d-0f16-4d7b-ae72-04735e5921ac-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.476441 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cjxn\" (UniqueName: \"kubernetes.io/projected/527e8b6d-0f16-4d7b-ae72-04735e5921ac-kube-api-access-4cjxn\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.483978 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "527e8b6d-0f16-4d7b-ae72-04735e5921ac" (UID: "527e8b6d-0f16-4d7b-ae72-04735e5921ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.489649 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-config-data" (OuterVolumeSpecName: "config-data") pod "527e8b6d-0f16-4d7b-ae72-04735e5921ac" (UID: "527e8b6d-0f16-4d7b-ae72-04735e5921ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.577991 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.578021 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527e8b6d-0f16-4d7b-ae72-04735e5921ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.634433 5033 generic.go:334] "Generic (PLEG): container finished" podID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerID="3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23" exitCode=0 Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.634535 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.643628 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"527e8b6d-0f16-4d7b-ae72-04735e5921ac","Type":"ContainerDied","Data":"3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23"} Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.643668 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"527e8b6d-0f16-4d7b-ae72-04735e5921ac","Type":"ContainerDied","Data":"4fa683582492eeb2c5401133ad458acccb0083c1ccfa126566f396186e1cdbdb"} Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.643686 5033 scope.go:117] "RemoveContainer" containerID="6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.721650 5033 scope.go:117] "RemoveContainer" containerID="5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.723798 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.745520 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.747094 5033 scope.go:117] "RemoveContainer" containerID="3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.758580 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:28:25 crc kubenswrapper[5033]: E0226 17:28:25.759130 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="sg-core" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.759161 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" 
containerName="sg-core" Feb 26 17:28:25 crc kubenswrapper[5033]: E0226 17:28:25.759184 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" containerName="init" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.759193 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" containerName="init" Feb 26 17:28:25 crc kubenswrapper[5033]: E0226 17:28:25.759207 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" containerName="dnsmasq-dns" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.759216 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" containerName="dnsmasq-dns" Feb 26 17:28:25 crc kubenswrapper[5033]: E0226 17:28:25.759234 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="proxy-httpd" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.759243 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="proxy-httpd" Feb 26 17:28:25 crc kubenswrapper[5033]: E0226 17:28:25.759303 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="ceilometer-notification-agent" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.759313 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="ceilometer-notification-agent" Feb 26 17:28:25 crc kubenswrapper[5033]: E0226 17:28:25.759329 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="ceilometer-central-agent" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.759336 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" 
containerName="ceilometer-central-agent" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.759755 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="sg-core" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.759789 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a71bc86a-15a6-4b26-98a9-8ec4d1d0b29a" containerName="dnsmasq-dns" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.759800 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="proxy-httpd" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.759821 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="ceilometer-notification-agent" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.759834 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" containerName="ceilometer-central-agent" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.762493 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.764798 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.766367 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.773082 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.789075 5033 scope.go:117] "RemoveContainer" containerID="08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.831349 5033 scope.go:117] "RemoveContainer" containerID="6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6" Feb 26 17:28:25 crc kubenswrapper[5033]: E0226 17:28:25.831708 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6\": container with ID starting with 6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6 not found: ID does not exist" containerID="6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.831739 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6"} err="failed to get container status \"6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6\": rpc error: code = NotFound desc = could not find container \"6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6\": container with ID starting with 6a058b1bb207b46f98a5145181e4eb6c3a778e84d6635f2531239cb5038e7fa6 not found: ID does not exist" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 
17:28:25.831764 5033 scope.go:117] "RemoveContainer" containerID="5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace" Feb 26 17:28:25 crc kubenswrapper[5033]: E0226 17:28:25.832041 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace\": container with ID starting with 5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace not found: ID does not exist" containerID="5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.832093 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace"} err="failed to get container status \"5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace\": rpc error: code = NotFound desc = could not find container \"5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace\": container with ID starting with 5e691a66226342ae0f96a5f8f4aa5868af95e11117eef5195c51010a14a68ace not found: ID does not exist" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.832121 5033 scope.go:117] "RemoveContainer" containerID="3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23" Feb 26 17:28:25 crc kubenswrapper[5033]: E0226 17:28:25.832617 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23\": container with ID starting with 3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23 not found: ID does not exist" containerID="3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.832643 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23"} err="failed to get container status \"3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23\": rpc error: code = NotFound desc = could not find container \"3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23\": container with ID starting with 3772659bf263c8489f46631cb8718db2333cf4239ff8340693eb961713c2bb23 not found: ID does not exist" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.832659 5033 scope.go:117] "RemoveContainer" containerID="08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1" Feb 26 17:28:25 crc kubenswrapper[5033]: E0226 17:28:25.832870 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1\": container with ID starting with 08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1 not found: ID does not exist" containerID="08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.832902 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1"} err="failed to get container status \"08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1\": rpc error: code = NotFound desc = could not find container \"08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1\": container with ID starting with 08ecaafc63646be50d79e7352d656885af8dd4e4888bcb27fb38fed20ffdf8d1 not found: ID does not exist" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.892705 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.892817 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-config-data\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.892846 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xjgc\" (UniqueName: \"kubernetes.io/projected/b63df35f-0e95-461c-9549-82bbe8f7e7b6-kube-api-access-5xjgc\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.892909 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-scripts\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.893009 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.893044 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b63df35f-0e95-461c-9549-82bbe8f7e7b6-log-httpd\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: 
I0226 17:28:25.893087 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b63df35f-0e95-461c-9549-82bbe8f7e7b6-run-httpd\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.995230 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-scripts\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.995306 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.995329 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b63df35f-0e95-461c-9549-82bbe8f7e7b6-log-httpd\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.995393 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b63df35f-0e95-461c-9549-82bbe8f7e7b6-run-httpd\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.995500 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.995562 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-config-data\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.995578 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xjgc\" (UniqueName: \"kubernetes.io/projected/b63df35f-0e95-461c-9549-82bbe8f7e7b6-kube-api-access-5xjgc\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.996414 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b63df35f-0e95-461c-9549-82bbe8f7e7b6-log-httpd\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:25 crc kubenswrapper[5033]: I0226 17:28:25.996533 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b63df35f-0e95-461c-9549-82bbe8f7e7b6-run-httpd\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:26 crc kubenswrapper[5033]: I0226 17:28:26.000809 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-config-data\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:26 crc kubenswrapper[5033]: I0226 17:28:26.009015 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-scripts\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:26 crc kubenswrapper[5033]: I0226 17:28:26.012752 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:26 crc kubenswrapper[5033]: I0226 17:28:26.012924 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63df35f-0e95-461c-9549-82bbe8f7e7b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:26 crc kubenswrapper[5033]: I0226 17:28:26.017333 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xjgc\" (UniqueName: \"kubernetes.io/projected/b63df35f-0e95-461c-9549-82bbe8f7e7b6-kube-api-access-5xjgc\") pod \"ceilometer-0\" (UID: \"b63df35f-0e95-461c-9549-82bbe8f7e7b6\") " pod="openstack/ceilometer-0" Feb 26 17:28:26 crc kubenswrapper[5033]: I0226 17:28:26.130976 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 17:28:26 crc kubenswrapper[5033]: I0226 17:28:26.588271 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 17:28:26 crc kubenswrapper[5033]: W0226 17:28:26.591885 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb63df35f_0e95_461c_9549_82bbe8f7e7b6.slice/crio-aff9184437df15c0a713fdf1c6a37bf8e6a0eccae340a73de58af4d893cde77a WatchSource:0}: Error finding container aff9184437df15c0a713fdf1c6a37bf8e6a0eccae340a73de58af4d893cde77a: Status 404 returned error can't find the container with id aff9184437df15c0a713fdf1c6a37bf8e6a0eccae340a73de58af4d893cde77a Feb 26 17:28:26 crc kubenswrapper[5033]: I0226 17:28:26.647108 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b63df35f-0e95-461c-9549-82bbe8f7e7b6","Type":"ContainerStarted","Data":"aff9184437df15c0a713fdf1c6a37bf8e6a0eccae340a73de58af4d893cde77a"} Feb 26 17:28:27 crc kubenswrapper[5033]: I0226 17:28:27.642034 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527e8b6d-0f16-4d7b-ae72-04735e5921ac" path="/var/lib/kubelet/pods/527e8b6d-0f16-4d7b-ae72-04735e5921ac/volumes" Feb 26 17:28:27 crc kubenswrapper[5033]: I0226 17:28:27.667037 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b63df35f-0e95-461c-9549-82bbe8f7e7b6","Type":"ContainerStarted","Data":"18fd8dc5bfcb10be943098e981dcaaf59136b05b9e4fe3ca6b21ba58ff4af3db"} Feb 26 17:28:28 crc kubenswrapper[5033]: I0226 17:28:28.681150 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b63df35f-0e95-461c-9549-82bbe8f7e7b6","Type":"ContainerStarted","Data":"b18731c23a78e7fc4b636b8663d5b3c1a910137ba920a718b241e2e7df2d1d0c"} Feb 26 17:28:29 crc kubenswrapper[5033]: I0226 17:28:29.692501 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"b63df35f-0e95-461c-9549-82bbe8f7e7b6","Type":"ContainerStarted","Data":"f5a1556c49e30ea3c4ca582969e84f56114f2345efc697a432c11a92389ebcd7"} Feb 26 17:28:30 crc kubenswrapper[5033]: I0226 17:28:30.735906 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Feb 26 17:28:30 crc kubenswrapper[5033]: I0226 17:28:30.771943 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Feb 26 17:28:30 crc kubenswrapper[5033]: I0226 17:28:30.776613 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Feb 26 17:28:32 crc kubenswrapper[5033]: I0226 17:28:32.029443 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-8kn6s"] Feb 26 17:28:32 crc kubenswrapper[5033]: I0226 17:28:32.047744 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-8kn6s"] Feb 26 17:28:32 crc kubenswrapper[5033]: I0226 17:28:32.725832 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b63df35f-0e95-461c-9549-82bbe8f7e7b6","Type":"ContainerStarted","Data":"c0271d7acf7596c5698eda261c00e70f7fdc7bf3cf6c70e410a56ce373c0698f"} Feb 26 17:28:32 crc kubenswrapper[5033]: I0226 17:28:32.726014 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 17:28:32 crc kubenswrapper[5033]: I0226 17:28:32.763528 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.817583752 podStartE2EDuration="7.763501469s" podCreationTimestamp="2026-02-26 17:28:25 +0000 UTC" firstStartedPulling="2026-02-26 17:28:26.594240651 +0000 UTC m=+6433.806839301" lastFinishedPulling="2026-02-26 17:28:31.540158368 +0000 UTC m=+6438.752757018" observedRunningTime="2026-02-26 17:28:32.750784447 +0000 UTC m=+6439.963383147" 
watchObservedRunningTime="2026-02-26 17:28:32.763501469 +0000 UTC m=+6439.976100129" Feb 26 17:28:33 crc kubenswrapper[5033]: I0226 17:28:33.644524 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1238df11-1db7-4267-ac4b-5022d1c95276" path="/var/lib/kubelet/pods/1238df11-1db7-4267-ac4b-5022d1c95276/volumes" Feb 26 17:28:35 crc kubenswrapper[5033]: I0226 17:28:35.634147 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:28:35 crc kubenswrapper[5033]: E0226 17:28:35.634980 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:28:37 crc kubenswrapper[5033]: I0226 17:28:37.909457 5033 scope.go:117] "RemoveContainer" containerID="e357d2dc288499cdde9c3efa8a450c7394f722163c768dcd8a4d02d96005c819" Feb 26 17:28:37 crc kubenswrapper[5033]: I0226 17:28:37.946322 5033 scope.go:117] "RemoveContainer" containerID="19f81430c6a3452b15b6f486192cc79ec3e9914727a760f24da53ca3e93240eb" Feb 26 17:28:38 crc kubenswrapper[5033]: I0226 17:28:38.018491 5033 scope.go:117] "RemoveContainer" containerID="feca7ef87b1a921487bf90aab193db5a5347dbebdee98a36ece671abb9ffaee3" Feb 26 17:28:38 crc kubenswrapper[5033]: I0226 17:28:38.054189 5033 scope.go:117] "RemoveContainer" containerID="996b86f502df4fdfdf5b4f834aa2ad928df2d758d6ec7e94c15899d822732ce2" Feb 26 17:28:38 crc kubenswrapper[5033]: I0226 17:28:38.111311 5033 scope.go:117] "RemoveContainer" containerID="8dbd9a067f7457b455c91a64f6eea5771fc925d69648d01f9fb24dc077f3cced" Feb 26 17:28:38 crc kubenswrapper[5033]: I0226 17:28:38.181040 5033 scope.go:117] 
"RemoveContainer" containerID="4aa652ddaad40091a4a4b6543cdf3573656b96dae8216ddf5fcbf3345ad36c4a" Feb 26 17:28:38 crc kubenswrapper[5033]: I0226 17:28:38.261383 5033 scope.go:117] "RemoveContainer" containerID="af5cd57c976428c309db06e239a21d93478faf76a129279a9e6b1e8bb64e9495" Feb 26 17:28:48 crc kubenswrapper[5033]: I0226 17:28:48.628807 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:28:48 crc kubenswrapper[5033]: E0226 17:28:48.629524 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:28:56 crc kubenswrapper[5033]: I0226 17:28:56.139890 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 26 17:29:01 crc kubenswrapper[5033]: I0226 17:29:01.628778 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:29:01 crc kubenswrapper[5033]: E0226 17:29:01.629585 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.228981 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp"] Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 
17:29:12.231238 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.234745 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.259224 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp"] Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.357676 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-nb\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.357740 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-dns-svc\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.357772 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-sb\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.357792 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-openstack-cell1\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: 
\"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.358146 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-config\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.358425 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cb4g\" (UniqueName: \"kubernetes.io/projected/10f5693b-8825-497b-89e3-c67bfa35bbf2-kube-api-access-2cb4g\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.460980 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-nb\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.461037 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-dns-svc\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.461059 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-sb\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: 
\"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.461077 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-openstack-cell1\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.461163 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-config\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.461196 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cb4g\" (UniqueName: \"kubernetes.io/projected/10f5693b-8825-497b-89e3-c67bfa35bbf2-kube-api-access-2cb4g\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.462231 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-sb\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.462343 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-dns-svc\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " 
pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.462360 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-nb\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.462900 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-openstack-cell1\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.463366 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-config\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.483882 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cb4g\" (UniqueName: \"kubernetes.io/projected/10f5693b-8825-497b-89e3-c67bfa35bbf2-kube-api-access-2cb4g\") pod \"dnsmasq-dns-6fc5b7bfbc-xfkdp\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:12 crc kubenswrapper[5033]: I0226 17:29:12.559754 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:13 crc kubenswrapper[5033]: I0226 17:29:13.040291 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp"] Feb 26 17:29:13 crc kubenswrapper[5033]: W0226 17:29:13.063850 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10f5693b_8825_497b_89e3_c67bfa35bbf2.slice/crio-a612aab845fc7f0be70a866b94257d0afd933f61ed163d90a4f0b80cfa52dbbb WatchSource:0}: Error finding container a612aab845fc7f0be70a866b94257d0afd933f61ed163d90a4f0b80cfa52dbbb: Status 404 returned error can't find the container with id a612aab845fc7f0be70a866b94257d0afd933f61ed163d90a4f0b80cfa52dbbb Feb 26 17:29:13 crc kubenswrapper[5033]: I0226 17:29:13.208751 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" event={"ID":"10f5693b-8825-497b-89e3-c67bfa35bbf2","Type":"ContainerStarted","Data":"a612aab845fc7f0be70a866b94257d0afd933f61ed163d90a4f0b80cfa52dbbb"} Feb 26 17:29:14 crc kubenswrapper[5033]: I0226 17:29:14.217813 5033 generic.go:334] "Generic (PLEG): container finished" podID="10f5693b-8825-497b-89e3-c67bfa35bbf2" containerID="7e304696993682b426923c5632e5bcad2fc1431056d4cb47433ecaee61674749" exitCode=0 Feb 26 17:29:14 crc kubenswrapper[5033]: I0226 17:29:14.217876 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" event={"ID":"10f5693b-8825-497b-89e3-c67bfa35bbf2","Type":"ContainerDied","Data":"7e304696993682b426923c5632e5bcad2fc1431056d4cb47433ecaee61674749"} Feb 26 17:29:15 crc kubenswrapper[5033]: I0226 17:29:15.233782 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" event={"ID":"10f5693b-8825-497b-89e3-c67bfa35bbf2","Type":"ContainerStarted","Data":"43c71173c86b828a72e6a5e0697623643fe62b2e8359d5eb4b0e95b2a6bcd3f9"} Feb 26 17:29:15 crc 
kubenswrapper[5033]: I0226 17:29:15.234111 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:15 crc kubenswrapper[5033]: I0226 17:29:15.628959 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:29:15 crc kubenswrapper[5033]: E0226 17:29:15.629688 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.560993 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.588306 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" podStartSLOduration=10.588284666 podStartE2EDuration="10.588284666s" podCreationTimestamp="2026-02-26 17:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:29:15.259701212 +0000 UTC m=+6482.472299872" watchObservedRunningTime="2026-02-26 17:29:22.588284666 +0000 UTC m=+6489.800883316" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.628113 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d4f997557-gf2sl"] Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.628358 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" podUID="e5728a38-0f9c-4637-a636-eb6b3e90387b" 
containerName="dnsmasq-dns" containerID="cri-o://305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae" gracePeriod=10 Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.773963 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8586b96f6c-5jbm2"] Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.788657 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.789569 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8586b96f6c-5jbm2"] Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.891641 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-openstack-cell1\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.891737 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-ovsdbserver-sb\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.891770 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-ovsdbserver-nb\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.891802 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnmsd\" (UniqueName: \"kubernetes.io/projected/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-kube-api-access-qnmsd\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.891844 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-dns-svc\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.891887 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-config\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.993630 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-ovsdbserver-sb\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.993699 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-ovsdbserver-nb\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.993746 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qnmsd\" (UniqueName: \"kubernetes.io/projected/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-kube-api-access-qnmsd\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.993806 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-dns-svc\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.993865 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-config\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.994015 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-openstack-cell1\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.995034 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-openstack-cell1\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.996166 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-ovsdbserver-sb\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.996467 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-ovsdbserver-nb\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.996812 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-dns-svc\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:22 crc kubenswrapper[5033]: I0226 17:29:22.996954 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-config\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.016755 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnmsd\" (UniqueName: \"kubernetes.io/projected/4f725c7f-3187-4bf8-a8cd-ffc406ea45ac-kube-api-access-qnmsd\") pod \"dnsmasq-dns-8586b96f6c-5jbm2\" (UID: \"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac\") " pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.118932 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.262243 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.302137 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-dns-svc\") pod \"e5728a38-0f9c-4637-a636-eb6b3e90387b\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.302200 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-nb\") pod \"e5728a38-0f9c-4637-a636-eb6b3e90387b\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.302321 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-config\") pod \"e5728a38-0f9c-4637-a636-eb6b3e90387b\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.302549 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtq6v\" (UniqueName: \"kubernetes.io/projected/e5728a38-0f9c-4637-a636-eb6b3e90387b-kube-api-access-qtq6v\") pod \"e5728a38-0f9c-4637-a636-eb6b3e90387b\" (UID: \"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.302568 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-sb\") pod \"e5728a38-0f9c-4637-a636-eb6b3e90387b\" (UID: 
\"e5728a38-0f9c-4637-a636-eb6b3e90387b\") " Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.338525 5033 generic.go:334] "Generic (PLEG): container finished" podID="e5728a38-0f9c-4637-a636-eb6b3e90387b" containerID="305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae" exitCode=0 Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.338650 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" event={"ID":"e5728a38-0f9c-4637-a636-eb6b3e90387b","Type":"ContainerDied","Data":"305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae"} Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.338683 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" event={"ID":"e5728a38-0f9c-4637-a636-eb6b3e90387b","Type":"ContainerDied","Data":"46eb80248786b5d70806e663de622a9baed084a9809061d1370d74f2732c954d"} Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.338728 5033 scope.go:117] "RemoveContainer" containerID="305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.339228 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d4f997557-gf2sl" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.362467 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5728a38-0f9c-4637-a636-eb6b3e90387b-kube-api-access-qtq6v" (OuterVolumeSpecName: "kube-api-access-qtq6v") pod "e5728a38-0f9c-4637-a636-eb6b3e90387b" (UID: "e5728a38-0f9c-4637-a636-eb6b3e90387b"). InnerVolumeSpecName "kube-api-access-qtq6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.375117 5033 scope.go:117] "RemoveContainer" containerID="2c6ea747c0d6daadf7983b1a5b1c0145784f25d82373dfe11136cb257a7ce4d5" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.404560 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtq6v\" (UniqueName: \"kubernetes.io/projected/e5728a38-0f9c-4637-a636-eb6b3e90387b-kube-api-access-qtq6v\") on node \"crc\" DevicePath \"\"" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.414337 5033 scope.go:117] "RemoveContainer" containerID="305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.414560 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-config" (OuterVolumeSpecName: "config") pod "e5728a38-0f9c-4637-a636-eb6b3e90387b" (UID: "e5728a38-0f9c-4637-a636-eb6b3e90387b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:29:23 crc kubenswrapper[5033]: E0226 17:29:23.415293 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae\": container with ID starting with 305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae not found: ID does not exist" containerID="305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.415339 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae"} err="failed to get container status \"305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae\": rpc error: code = NotFound desc = could not find container \"305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae\": container with ID starting with 305281c9af05fd3dc464e971ed221693efa5da96f2b25dc2a92cd5919cd6f3ae not found: ID does not exist" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.415367 5033 scope.go:117] "RemoveContainer" containerID="2c6ea747c0d6daadf7983b1a5b1c0145784f25d82373dfe11136cb257a7ce4d5" Feb 26 17:29:23 crc kubenswrapper[5033]: E0226 17:29:23.415767 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c6ea747c0d6daadf7983b1a5b1c0145784f25d82373dfe11136cb257a7ce4d5\": container with ID starting with 2c6ea747c0d6daadf7983b1a5b1c0145784f25d82373dfe11136cb257a7ce4d5 not found: ID does not exist" containerID="2c6ea747c0d6daadf7983b1a5b1c0145784f25d82373dfe11136cb257a7ce4d5" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.415797 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c6ea747c0d6daadf7983b1a5b1c0145784f25d82373dfe11136cb257a7ce4d5"} 
err="failed to get container status \"2c6ea747c0d6daadf7983b1a5b1c0145784f25d82373dfe11136cb257a7ce4d5\": rpc error: code = NotFound desc = could not find container \"2c6ea747c0d6daadf7983b1a5b1c0145784f25d82373dfe11136cb257a7ce4d5\": container with ID starting with 2c6ea747c0d6daadf7983b1a5b1c0145784f25d82373dfe11136cb257a7ce4d5 not found: ID does not exist" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.426148 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5728a38-0f9c-4637-a636-eb6b3e90387b" (UID: "e5728a38-0f9c-4637-a636-eb6b3e90387b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.429490 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e5728a38-0f9c-4637-a636-eb6b3e90387b" (UID: "e5728a38-0f9c-4637-a636-eb6b3e90387b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.431297 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e5728a38-0f9c-4637-a636-eb6b3e90387b" (UID: "e5728a38-0f9c-4637-a636-eb6b3e90387b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.506613 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.507604 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.507633 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.507648 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5728a38-0f9c-4637-a636-eb6b3e90387b-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.626641 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8586b96f6c-5jbm2"] Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.682539 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d4f997557-gf2sl"] Feb 26 17:29:23 crc kubenswrapper[5033]: I0226 17:29:23.693489 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d4f997557-gf2sl"] Feb 26 17:29:24 crc kubenswrapper[5033]: I0226 17:29:24.348897 5033 generic.go:334] "Generic (PLEG): container finished" podID="4f725c7f-3187-4bf8-a8cd-ffc406ea45ac" containerID="ae8ca47a7fb33673e7b23db04b91d7909dae454f3f775e2fd2c731bb63bf3f8c" exitCode=0 Feb 26 17:29:24 crc kubenswrapper[5033]: I0226 17:29:24.348998 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" event={"ID":"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac","Type":"ContainerDied","Data":"ae8ca47a7fb33673e7b23db04b91d7909dae454f3f775e2fd2c731bb63bf3f8c"} Feb 26 17:29:24 crc kubenswrapper[5033]: I0226 17:29:24.349230 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" event={"ID":"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac","Type":"ContainerStarted","Data":"e457f925520f55cdce819323e76454119d618f78f6f87e12a559a103d34e7072"} Feb 26 17:29:25 crc kubenswrapper[5033]: I0226 17:29:25.361297 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" event={"ID":"4f725c7f-3187-4bf8-a8cd-ffc406ea45ac","Type":"ContainerStarted","Data":"9c883e66bb43d521ba569c8a8563e3884e988210d7f8c79a7301e1f6ed0cc36d"} Feb 26 17:29:25 crc kubenswrapper[5033]: I0226 17:29:25.361594 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:25 crc kubenswrapper[5033]: I0226 17:29:25.392756 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" podStartSLOduration=3.392734527 podStartE2EDuration="3.392734527s" podCreationTimestamp="2026-02-26 17:29:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 17:29:25.383555473 +0000 UTC m=+6492.596154123" watchObservedRunningTime="2026-02-26 17:29:25.392734527 +0000 UTC m=+6492.605333187" Feb 26 17:29:25 crc kubenswrapper[5033]: I0226 17:29:25.641611 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5728a38-0f9c-4637-a636-eb6b3e90387b" path="/var/lib/kubelet/pods/e5728a38-0f9c-4637-a636-eb6b3e90387b/volumes" Feb 26 17:29:29 crc kubenswrapper[5033]: I0226 17:29:29.628673 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" 
Feb 26 17:29:29 crc kubenswrapper[5033]: E0226 17:29:29.629476 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.121039 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8586b96f6c-5jbm2" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.193078 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp"] Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.193516 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" podUID="10f5693b-8825-497b-89e3-c67bfa35bbf2" containerName="dnsmasq-dns" containerID="cri-o://43c71173c86b828a72e6a5e0697623643fe62b2e8359d5eb4b0e95b2a6bcd3f9" gracePeriod=10 Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.444991 5033 generic.go:334] "Generic (PLEG): container finished" podID="10f5693b-8825-497b-89e3-c67bfa35bbf2" containerID="43c71173c86b828a72e6a5e0697623643fe62b2e8359d5eb4b0e95b2a6bcd3f9" exitCode=0 Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.445363 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" event={"ID":"10f5693b-8825-497b-89e3-c67bfa35bbf2","Type":"ContainerDied","Data":"43c71173c86b828a72e6a5e0697623643fe62b2e8359d5eb4b0e95b2a6bcd3f9"} Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.734167 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.856543 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-dns-svc\") pod \"10f5693b-8825-497b-89e3-c67bfa35bbf2\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.856617 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-sb\") pod \"10f5693b-8825-497b-89e3-c67bfa35bbf2\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.856671 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cb4g\" (UniqueName: \"kubernetes.io/projected/10f5693b-8825-497b-89e3-c67bfa35bbf2-kube-api-access-2cb4g\") pod \"10f5693b-8825-497b-89e3-c67bfa35bbf2\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.856713 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-openstack-cell1\") pod \"10f5693b-8825-497b-89e3-c67bfa35bbf2\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.856845 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-config\") pod \"10f5693b-8825-497b-89e3-c67bfa35bbf2\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.856874 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-nb\") pod \"10f5693b-8825-497b-89e3-c67bfa35bbf2\" (UID: \"10f5693b-8825-497b-89e3-c67bfa35bbf2\") " Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.872463 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f5693b-8825-497b-89e3-c67bfa35bbf2-kube-api-access-2cb4g" (OuterVolumeSpecName: "kube-api-access-2cb4g") pod "10f5693b-8825-497b-89e3-c67bfa35bbf2" (UID: "10f5693b-8825-497b-89e3-c67bfa35bbf2"). InnerVolumeSpecName "kube-api-access-2cb4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.916369 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10f5693b-8825-497b-89e3-c67bfa35bbf2" (UID: "10f5693b-8825-497b-89e3-c67bfa35bbf2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.923399 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10f5693b-8825-497b-89e3-c67bfa35bbf2" (UID: "10f5693b-8825-497b-89e3-c67bfa35bbf2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.927558 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-config" (OuterVolumeSpecName: "config") pod "10f5693b-8825-497b-89e3-c67bfa35bbf2" (UID: "10f5693b-8825-497b-89e3-c67bfa35bbf2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.935151 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "10f5693b-8825-497b-89e3-c67bfa35bbf2" (UID: "10f5693b-8825-497b-89e3-c67bfa35bbf2"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.935202 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10f5693b-8825-497b-89e3-c67bfa35bbf2" (UID: "10f5693b-8825-497b-89e3-c67bfa35bbf2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.961734 5033 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-config\") on node \"crc\" DevicePath \"\"" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.961780 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.961793 5033 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.961805 5033 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 
17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.961819 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cb4g\" (UniqueName: \"kubernetes.io/projected/10f5693b-8825-497b-89e3-c67bfa35bbf2-kube-api-access-2cb4g\") on node \"crc\" DevicePath \"\"" Feb 26 17:29:33 crc kubenswrapper[5033]: I0226 17:29:33.961832 5033 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/10f5693b-8825-497b-89e3-c67bfa35bbf2-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 17:29:34 crc kubenswrapper[5033]: I0226 17:29:34.456907 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" event={"ID":"10f5693b-8825-497b-89e3-c67bfa35bbf2","Type":"ContainerDied","Data":"a612aab845fc7f0be70a866b94257d0afd933f61ed163d90a4f0b80cfa52dbbb"} Feb 26 17:29:34 crc kubenswrapper[5033]: I0226 17:29:34.456956 5033 scope.go:117] "RemoveContainer" containerID="43c71173c86b828a72e6a5e0697623643fe62b2e8359d5eb4b0e95b2a6bcd3f9" Feb 26 17:29:34 crc kubenswrapper[5033]: I0226 17:29:34.457075 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp" Feb 26 17:29:34 crc kubenswrapper[5033]: I0226 17:29:34.486989 5033 scope.go:117] "RemoveContainer" containerID="7e304696993682b426923c5632e5bcad2fc1431056d4cb47433ecaee61674749" Feb 26 17:29:34 crc kubenswrapper[5033]: I0226 17:29:34.503218 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp"] Feb 26 17:29:34 crc kubenswrapper[5033]: I0226 17:29:34.516557 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fc5b7bfbc-xfkdp"] Feb 26 17:29:35 crc kubenswrapper[5033]: I0226 17:29:35.642072 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f5693b-8825-497b-89e3-c67bfa35bbf2" path="/var/lib/kubelet/pods/10f5693b-8825-497b-89e3-c67bfa35bbf2/volumes" Feb 26 17:29:39 crc kubenswrapper[5033]: E0226 17:29:39.541946 5033 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.914s" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.018325 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb"] Feb 26 17:29:44 crc kubenswrapper[5033]: E0226 17:29:44.019933 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f5693b-8825-497b-89e3-c67bfa35bbf2" containerName="init" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.019950 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f5693b-8825-497b-89e3-c67bfa35bbf2" containerName="init" Feb 26 17:29:44 crc kubenswrapper[5033]: E0226 17:29:44.019980 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5728a38-0f9c-4637-a636-eb6b3e90387b" containerName="init" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.019987 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5728a38-0f9c-4637-a636-eb6b3e90387b" containerName="init" Feb 26 17:29:44 crc 
kubenswrapper[5033]: E0226 17:29:44.020003 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f5693b-8825-497b-89e3-c67bfa35bbf2" containerName="dnsmasq-dns" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.020010 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f5693b-8825-497b-89e3-c67bfa35bbf2" containerName="dnsmasq-dns" Feb 26 17:29:44 crc kubenswrapper[5033]: E0226 17:29:44.020032 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5728a38-0f9c-4637-a636-eb6b3e90387b" containerName="dnsmasq-dns" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.020061 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5728a38-0f9c-4637-a636-eb6b3e90387b" containerName="dnsmasq-dns" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.020332 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5728a38-0f9c-4637-a636-eb6b3e90387b" containerName="dnsmasq-dns" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.020359 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f5693b-8825-497b-89e3-c67bfa35bbf2" containerName="dnsmasq-dns" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.021532 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.024874 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.025297 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.025433 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.038630 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.044656 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb"] Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.206281 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.206578 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm2cc\" (UniqueName: \"kubernetes.io/projected/cf05cfa2-5c69-440f-8d78-40000ee0b13a-kube-api-access-tm2cc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: 
I0226 17:29:44.206721 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.207025 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.207211 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.309774 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.310023 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.310065 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.310132 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.310168 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm2cc\" (UniqueName: \"kubernetes.io/projected/cf05cfa2-5c69-440f-8d78-40000ee0b13a-kube-api-access-tm2cc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.318232 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.318234 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.324708 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.334438 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.344366 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm2cc\" (UniqueName: \"kubernetes.io/projected/cf05cfa2-5c69-440f-8d78-40000ee0b13a-kube-api-access-tm2cc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb\" (UID: 
\"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.352588 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:29:44 crc kubenswrapper[5033]: I0226 17:29:44.628519 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:29:44 crc kubenswrapper[5033]: E0226 17:29:44.628838 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:29:45 crc kubenswrapper[5033]: I0226 17:29:45.216617 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb"] Feb 26 17:29:45 crc kubenswrapper[5033]: W0226 17:29:45.231650 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf05cfa2_5c69_440f_8d78_40000ee0b13a.slice/crio-68b9be42d21a4eb2ac6cf53591c2070ba26cfecba79bc4421938670494ae64c7 WatchSource:0}: Error finding container 68b9be42d21a4eb2ac6cf53591c2070ba26cfecba79bc4421938670494ae64c7: Status 404 returned error can't find the container with id 68b9be42d21a4eb2ac6cf53591c2070ba26cfecba79bc4421938670494ae64c7 Feb 26 17:29:45 crc kubenswrapper[5033]: I0226 17:29:45.626072 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" 
event={"ID":"cf05cfa2-5c69-440f-8d78-40000ee0b13a","Type":"ContainerStarted","Data":"68b9be42d21a4eb2ac6cf53591c2070ba26cfecba79bc4421938670494ae64c7"} Feb 26 17:29:55 crc kubenswrapper[5033]: I0226 17:29:55.758880 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" event={"ID":"cf05cfa2-5c69-440f-8d78-40000ee0b13a","Type":"ContainerStarted","Data":"024449f68a036e58a33291fff9f118b1def7bc55fa2a6b94a67ba53cac0acbfe"} Feb 26 17:29:55 crc kubenswrapper[5033]: I0226 17:29:55.783579 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" podStartSLOduration=2.6996949470000002 podStartE2EDuration="12.78356075s" podCreationTimestamp="2026-02-26 17:29:43 +0000 UTC" firstStartedPulling="2026-02-26 17:29:45.236147074 +0000 UTC m=+6512.448745724" lastFinishedPulling="2026-02-26 17:29:55.320012877 +0000 UTC m=+6522.532611527" observedRunningTime="2026-02-26 17:29:55.775701737 +0000 UTC m=+6522.988300427" watchObservedRunningTime="2026-02-26 17:29:55.78356075 +0000 UTC m=+6522.996159400" Feb 26 17:29:56 crc kubenswrapper[5033]: I0226 17:29:56.629411 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:29:56 crc kubenswrapper[5033]: E0226 17:29:56.629955 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.139647 5033 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29535450-wl4nv"] Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.142080 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535450-wl4nv" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.169500 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.169836 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.170661 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.197205 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk"] Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.201087 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.206815 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.267718 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535450-wl4nv"] Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.269007 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.269829 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk"] Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.292179 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmr64\" (UniqueName: \"kubernetes.io/projected/dba320fa-8dad-45ae-a794-964bbd3201c2-kube-api-access-rmr64\") pod \"auto-csr-approver-29535450-wl4nv\" (UID: \"dba320fa-8dad-45ae-a794-964bbd3201c2\") " pod="openshift-infra/auto-csr-approver-29535450-wl4nv" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.393602 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-config-volume\") pod \"collect-profiles-29535450-rtxgk\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.393713 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-secret-volume\") pod \"collect-profiles-29535450-rtxgk\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.393765 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwhsw\" (UniqueName: \"kubernetes.io/projected/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-kube-api-access-gwhsw\") pod \"collect-profiles-29535450-rtxgk\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.393840 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmr64\" (UniqueName: \"kubernetes.io/projected/dba320fa-8dad-45ae-a794-964bbd3201c2-kube-api-access-rmr64\") pod \"auto-csr-approver-29535450-wl4nv\" (UID: \"dba320fa-8dad-45ae-a794-964bbd3201c2\") " pod="openshift-infra/auto-csr-approver-29535450-wl4nv" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.416064 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmr64\" (UniqueName: \"kubernetes.io/projected/dba320fa-8dad-45ae-a794-964bbd3201c2-kube-api-access-rmr64\") pod \"auto-csr-approver-29535450-wl4nv\" (UID: \"dba320fa-8dad-45ae-a794-964bbd3201c2\") " pod="openshift-infra/auto-csr-approver-29535450-wl4nv" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.463568 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535450-wl4nv" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.495697 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-secret-volume\") pod \"collect-profiles-29535450-rtxgk\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.495786 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwhsw\" (UniqueName: \"kubernetes.io/projected/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-kube-api-access-gwhsw\") pod \"collect-profiles-29535450-rtxgk\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.495920 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-config-volume\") pod \"collect-profiles-29535450-rtxgk\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.496944 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-config-volume\") pod \"collect-profiles-29535450-rtxgk\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.502116 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-secret-volume\") pod \"collect-profiles-29535450-rtxgk\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.517147 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwhsw\" (UniqueName: \"kubernetes.io/projected/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-kube-api-access-gwhsw\") pod \"collect-profiles-29535450-rtxgk\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.526573 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:00 crc kubenswrapper[5033]: I0226 17:30:00.960472 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535450-wl4nv"] Feb 26 17:30:01 crc kubenswrapper[5033]: I0226 17:30:01.067842 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk"] Feb 26 17:30:01 crc kubenswrapper[5033]: W0226 17:30:01.073246 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d6a980a_6455_42f0_ad1a_8e1f75b1f0dc.slice/crio-11fee80739bd507e0e2ace1d43e0854e1b1910a1c14f3660585196ec30ef4312 WatchSource:0}: Error finding container 11fee80739bd507e0e2ace1d43e0854e1b1910a1c14f3660585196ec30ef4312: Status 404 returned error can't find the container with id 11fee80739bd507e0e2ace1d43e0854e1b1910a1c14f3660585196ec30ef4312 Feb 26 17:30:01 crc kubenswrapper[5033]: I0226 17:30:01.825365 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535450-wl4nv" 
event={"ID":"dba320fa-8dad-45ae-a794-964bbd3201c2","Type":"ContainerStarted","Data":"e56a480cb943ab8870eba79a8697725cf21e3cdcc15568eca22db3fd601356a2"} Feb 26 17:30:01 crc kubenswrapper[5033]: I0226 17:30:01.827628 5033 generic.go:334] "Generic (PLEG): container finished" podID="4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc" containerID="2b33d8ab4f61a15acb7884e0ff3a4169ad51e6f4d76821d24773e5345820b1ae" exitCode=0 Feb 26 17:30:01 crc kubenswrapper[5033]: I0226 17:30:01.827661 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" event={"ID":"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc","Type":"ContainerDied","Data":"2b33d8ab4f61a15acb7884e0ff3a4169ad51e6f4d76821d24773e5345820b1ae"} Feb 26 17:30:01 crc kubenswrapper[5033]: I0226 17:30:01.827679 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" event={"ID":"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc","Type":"ContainerStarted","Data":"11fee80739bd507e0e2ace1d43e0854e1b1910a1c14f3660585196ec30ef4312"} Feb 26 17:30:02 crc kubenswrapper[5033]: I0226 17:30:02.847674 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535450-wl4nv" event={"ID":"dba320fa-8dad-45ae-a794-964bbd3201c2","Type":"ContainerStarted","Data":"70c6a315deab0b265b007900946a556a7bdb91d1dab1bb1b8c4d73a48af3f24b"} Feb 26 17:30:02 crc kubenswrapper[5033]: I0226 17:30:02.877785 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535450-wl4nv" podStartSLOduration=1.5553560690000001 podStartE2EDuration="2.877758349s" podCreationTimestamp="2026-02-26 17:30:00 +0000 UTC" firstStartedPulling="2026-02-26 17:30:00.957061305 +0000 UTC m=+6528.169659955" lastFinishedPulling="2026-02-26 17:30:02.279463575 +0000 UTC m=+6529.492062235" observedRunningTime="2026-02-26 17:30:02.865313616 +0000 UTC m=+6530.077912276" 
watchObservedRunningTime="2026-02-26 17:30:02.877758349 +0000 UTC m=+6530.090357009" Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.270474 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.366517 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-secret-volume\") pod \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.366905 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-config-volume\") pod \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.367157 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwhsw\" (UniqueName: \"kubernetes.io/projected/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-kube-api-access-gwhsw\") pod \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\" (UID: \"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc\") " Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.367548 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-config-volume" (OuterVolumeSpecName: "config-volume") pod "4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc" (UID: "4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.367778 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.373972 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-kube-api-access-gwhsw" (OuterVolumeSpecName: "kube-api-access-gwhsw") pod "4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc" (UID: "4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc"). InnerVolumeSpecName "kube-api-access-gwhsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.375514 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc" (UID: "4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.470472 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.470526 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwhsw\" (UniqueName: \"kubernetes.io/projected/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc-kube-api-access-gwhsw\") on node \"crc\" DevicePath \"\"" Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.860408 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" event={"ID":"4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc","Type":"ContainerDied","Data":"11fee80739bd507e0e2ace1d43e0854e1b1910a1c14f3660585196ec30ef4312"} Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.860445 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk" Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.860459 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11fee80739bd507e0e2ace1d43e0854e1b1910a1c14f3660585196ec30ef4312" Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.864704 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535450-wl4nv" event={"ID":"dba320fa-8dad-45ae-a794-964bbd3201c2","Type":"ContainerDied","Data":"70c6a315deab0b265b007900946a556a7bdb91d1dab1bb1b8c4d73a48af3f24b"} Feb 26 17:30:03 crc kubenswrapper[5033]: I0226 17:30:03.864387 5033 generic.go:334] "Generic (PLEG): container finished" podID="dba320fa-8dad-45ae-a794-964bbd3201c2" containerID="70c6a315deab0b265b007900946a556a7bdb91d1dab1bb1b8c4d73a48af3f24b" exitCode=0 Feb 26 17:30:04 crc kubenswrapper[5033]: I0226 17:30:04.359010 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"] Feb 26 17:30:04 crc kubenswrapper[5033]: I0226 17:30:04.367699 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535405-spslh"] Feb 26 17:30:05 crc kubenswrapper[5033]: I0226 17:30:05.330749 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535450-wl4nv" Feb 26 17:30:05 crc kubenswrapper[5033]: I0226 17:30:05.433133 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmr64\" (UniqueName: \"kubernetes.io/projected/dba320fa-8dad-45ae-a794-964bbd3201c2-kube-api-access-rmr64\") pod \"dba320fa-8dad-45ae-a794-964bbd3201c2\" (UID: \"dba320fa-8dad-45ae-a794-964bbd3201c2\") " Feb 26 17:30:05 crc kubenswrapper[5033]: I0226 17:30:05.439495 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dba320fa-8dad-45ae-a794-964bbd3201c2-kube-api-access-rmr64" (OuterVolumeSpecName: "kube-api-access-rmr64") pod "dba320fa-8dad-45ae-a794-964bbd3201c2" (UID: "dba320fa-8dad-45ae-a794-964bbd3201c2"). InnerVolumeSpecName "kube-api-access-rmr64". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:30:05 crc kubenswrapper[5033]: I0226 17:30:05.535811 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmr64\" (UniqueName: \"kubernetes.io/projected/dba320fa-8dad-45ae-a794-964bbd3201c2-kube-api-access-rmr64\") on node \"crc\" DevicePath \"\"" Feb 26 17:30:05 crc kubenswrapper[5033]: I0226 17:30:05.645364 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24ce89a2-cf67-4dee-8c2e-a4015b437452" path="/var/lib/kubelet/pods/24ce89a2-cf67-4dee-8c2e-a4015b437452/volumes" Feb 26 17:30:05 crc kubenswrapper[5033]: I0226 17:30:05.906137 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535450-wl4nv" event={"ID":"dba320fa-8dad-45ae-a794-964bbd3201c2","Type":"ContainerDied","Data":"e56a480cb943ab8870eba79a8697725cf21e3cdcc15568eca22db3fd601356a2"} Feb 26 17:30:05 crc kubenswrapper[5033]: I0226 17:30:05.906203 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e56a480cb943ab8870eba79a8697725cf21e3cdcc15568eca22db3fd601356a2" Feb 26 17:30:05 
crc kubenswrapper[5033]: I0226 17:30:05.906326 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535450-wl4nv" Feb 26 17:30:05 crc kubenswrapper[5033]: I0226 17:30:05.938036 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535444-bvrf5"] Feb 26 17:30:05 crc kubenswrapper[5033]: I0226 17:30:05.955199 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535444-bvrf5"] Feb 26 17:30:07 crc kubenswrapper[5033]: I0226 17:30:07.644909 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="975decb9-6350-490b-96d1-f239d388f7f2" path="/var/lib/kubelet/pods/975decb9-6350-490b-96d1-f239d388f7f2/volumes" Feb 26 17:30:08 crc kubenswrapper[5033]: I0226 17:30:08.961653 5033 generic.go:334] "Generic (PLEG): container finished" podID="cf05cfa2-5c69-440f-8d78-40000ee0b13a" containerID="024449f68a036e58a33291fff9f118b1def7bc55fa2a6b94a67ba53cac0acbfe" exitCode=0 Feb 26 17:30:08 crc kubenswrapper[5033]: I0226 17:30:08.961792 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" event={"ID":"cf05cfa2-5c69-440f-8d78-40000ee0b13a","Type":"ContainerDied","Data":"024449f68a036e58a33291fff9f118b1def7bc55fa2a6b94a67ba53cac0acbfe"} Feb 26 17:30:09 crc kubenswrapper[5033]: I0226 17:30:09.629738 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:30:09 crc kubenswrapper[5033]: E0226 17:30:09.630702 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.583483 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.753829 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-inventory\") pod \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.753916 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-pre-adoption-validation-combined-ca-bundle\") pod \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.754060 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ceph\") pod \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.754166 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ssh-key-openstack-cell1\") pod \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.754187 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm2cc\" (UniqueName: 
\"kubernetes.io/projected/cf05cfa2-5c69-440f-8d78-40000ee0b13a-kube-api-access-tm2cc\") pod \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\" (UID: \"cf05cfa2-5c69-440f-8d78-40000ee0b13a\") " Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.760846 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ceph" (OuterVolumeSpecName: "ceph") pod "cf05cfa2-5c69-440f-8d78-40000ee0b13a" (UID: "cf05cfa2-5c69-440f-8d78-40000ee0b13a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.761565 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf05cfa2-5c69-440f-8d78-40000ee0b13a-kube-api-access-tm2cc" (OuterVolumeSpecName: "kube-api-access-tm2cc") pod "cf05cfa2-5c69-440f-8d78-40000ee0b13a" (UID: "cf05cfa2-5c69-440f-8d78-40000ee0b13a"). InnerVolumeSpecName "kube-api-access-tm2cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.785418 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "cf05cfa2-5c69-440f-8d78-40000ee0b13a" (UID: "cf05cfa2-5c69-440f-8d78-40000ee0b13a"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.805367 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-inventory" (OuterVolumeSpecName: "inventory") pod "cf05cfa2-5c69-440f-8d78-40000ee0b13a" (UID: "cf05cfa2-5c69-440f-8d78-40000ee0b13a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.807277 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cf05cfa2-5c69-440f-8d78-40000ee0b13a" (UID: "cf05cfa2-5c69-440f-8d78-40000ee0b13a"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.856667 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.856704 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.856719 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm2cc\" (UniqueName: \"kubernetes.io/projected/cf05cfa2-5c69-440f-8d78-40000ee0b13a-kube-api-access-tm2cc\") on node \"crc\" DevicePath \"\"" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.856730 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.856740 5033 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf05cfa2-5c69-440f-8d78-40000ee0b13a-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.983578 5033 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" event={"ID":"cf05cfa2-5c69-440f-8d78-40000ee0b13a","Type":"ContainerDied","Data":"68b9be42d21a4eb2ac6cf53591c2070ba26cfecba79bc4421938670494ae64c7"} Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.983654 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb" Feb 26 17:30:10 crc kubenswrapper[5033]: I0226 17:30:10.983660 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68b9be42d21a4eb2ac6cf53591c2070ba26cfecba79bc4421938670494ae64c7" Feb 26 17:30:16 crc kubenswrapper[5033]: I0226 17:30:16.989953 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8"] Feb 26 17:30:16 crc kubenswrapper[5033]: E0226 17:30:16.991081 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc" containerName="collect-profiles" Feb 26 17:30:16 crc kubenswrapper[5033]: I0226 17:30:16.991097 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc" containerName="collect-profiles" Feb 26 17:30:16 crc kubenswrapper[5033]: E0226 17:30:16.991116 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf05cfa2-5c69-440f-8d78-40000ee0b13a" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 26 17:30:16 crc kubenswrapper[5033]: I0226 17:30:16.991126 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf05cfa2-5c69-440f-8d78-40000ee0b13a" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 26 17:30:16 crc kubenswrapper[5033]: E0226 17:30:16.991178 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dba320fa-8dad-45ae-a794-964bbd3201c2" containerName="oc" Feb 26 17:30:16 crc 
kubenswrapper[5033]: I0226 17:30:16.991188 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba320fa-8dad-45ae-a794-964bbd3201c2" containerName="oc" Feb 26 17:30:16 crc kubenswrapper[5033]: I0226 17:30:16.991484 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc" containerName="collect-profiles" Feb 26 17:30:16 crc kubenswrapper[5033]: I0226 17:30:16.991516 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dba320fa-8dad-45ae-a794-964bbd3201c2" containerName="oc" Feb 26 17:30:16 crc kubenswrapper[5033]: I0226 17:30:16.991538 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf05cfa2-5c69-440f-8d78-40000ee0b13a" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 26 17:30:16 crc kubenswrapper[5033]: I0226 17:30:16.992658 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:16 crc kubenswrapper[5033]: I0226 17:30:16.996696 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 17:30:16 crc kubenswrapper[5033]: I0226 17:30:16.996971 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.001445 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.008448 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.009209 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8"] Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.112016 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.112129 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.112170 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.112209 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkmkd\" (UniqueName: \"kubernetes.io/projected/0418ee62-c3ed-4b01-9801-705d116841ae-kube-api-access-hkmkd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.112276 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.214626 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.214731 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.214815 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkmkd\" (UniqueName: \"kubernetes.io/projected/0418ee62-c3ed-4b01-9801-705d116841ae-kube-api-access-hkmkd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.214920 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" 
Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.215066 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.222633 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.223242 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.224359 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.239804 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.251805 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkmkd\" (UniqueName: \"kubernetes.io/projected/0418ee62-c3ed-4b01-9801-705d116841ae-kube-api-access-hkmkd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.319544 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:30:17 crc kubenswrapper[5033]: I0226 17:30:17.971615 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8"] Feb 26 17:30:18 crc kubenswrapper[5033]: I0226 17:30:18.080581 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" event={"ID":"0418ee62-c3ed-4b01-9801-705d116841ae","Type":"ContainerStarted","Data":"b496e35075bde4a1c7110038a49a45dba6de67413a29fd5158c0d91d8389d5ec"} Feb 26 17:30:19 crc kubenswrapper[5033]: I0226 17:30:19.091020 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" event={"ID":"0418ee62-c3ed-4b01-9801-705d116841ae","Type":"ContainerStarted","Data":"48c4839b84bb0e9e53e9278f0513519ea08df9944a3a2fe9cd3af6e6370c5fb0"} Feb 26 17:30:19 crc kubenswrapper[5033]: I0226 17:30:19.115984 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" 
podStartSLOduration=2.652164917 podStartE2EDuration="3.115967445s" podCreationTimestamp="2026-02-26 17:30:16 +0000 UTC" firstStartedPulling="2026-02-26 17:30:17.986043207 +0000 UTC m=+6545.198641867" lastFinishedPulling="2026-02-26 17:30:18.449845735 +0000 UTC m=+6545.662444395" observedRunningTime="2026-02-26 17:30:19.107609211 +0000 UTC m=+6546.320207861" watchObservedRunningTime="2026-02-26 17:30:19.115967445 +0000 UTC m=+6546.328566095" Feb 26 17:30:21 crc kubenswrapper[5033]: I0226 17:30:21.628444 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:30:21 crc kubenswrapper[5033]: E0226 17:30:21.628971 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:30:33 crc kubenswrapper[5033]: I0226 17:30:33.637157 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:30:33 crc kubenswrapper[5033]: E0226 17:30:33.638379 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:30:39 crc kubenswrapper[5033]: I0226 17:30:39.593587 5033 scope.go:117] "RemoveContainer" containerID="f4a06bbf7a03ee5e396b5129a026ebd0effb30f02ba853e5174e29e06dfcdf75" Feb 26 17:30:39 crc 
kubenswrapper[5033]: I0226 17:30:39.622304 5033 scope.go:117] "RemoveContainer" containerID="d8e3eb717cc3b82058101de525cb5fcee9d71f91285b8d7a5b4582e414ff105f" Feb 26 17:30:47 crc kubenswrapper[5033]: I0226 17:30:47.629193 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:30:47 crc kubenswrapper[5033]: E0226 17:30:47.629824 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:30:59 crc kubenswrapper[5033]: I0226 17:30:59.628368 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:30:59 crc kubenswrapper[5033]: E0226 17:30:59.629631 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:31:10 crc kubenswrapper[5033]: I0226 17:31:10.628595 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:31:10 crc kubenswrapper[5033]: E0226 17:31:10.629337 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:31:17 crc kubenswrapper[5033]: I0226 17:31:17.890826 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ppgvs"] Feb 26 17:31:17 crc kubenswrapper[5033]: I0226 17:31:17.893774 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:17 crc kubenswrapper[5033]: I0226 17:31:17.913030 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppgvs"] Feb 26 17:31:18 crc kubenswrapper[5033]: I0226 17:31:18.015033 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-catalog-content\") pod \"redhat-operators-ppgvs\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:18 crc kubenswrapper[5033]: I0226 17:31:18.015110 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-utilities\") pod \"redhat-operators-ppgvs\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:18 crc kubenswrapper[5033]: I0226 17:31:18.015215 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbl8s\" (UniqueName: \"kubernetes.io/projected/739f8561-8333-4d54-b008-f7d70bb7ed99-kube-api-access-jbl8s\") pod \"redhat-operators-ppgvs\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:18 crc 
kubenswrapper[5033]: I0226 17:31:18.117777 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbl8s\" (UniqueName: \"kubernetes.io/projected/739f8561-8333-4d54-b008-f7d70bb7ed99-kube-api-access-jbl8s\") pod \"redhat-operators-ppgvs\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:18 crc kubenswrapper[5033]: I0226 17:31:18.117942 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-catalog-content\") pod \"redhat-operators-ppgvs\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:18 crc kubenswrapper[5033]: I0226 17:31:18.117984 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-utilities\") pod \"redhat-operators-ppgvs\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:18 crc kubenswrapper[5033]: I0226 17:31:18.120071 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-utilities\") pod \"redhat-operators-ppgvs\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:18 crc kubenswrapper[5033]: I0226 17:31:18.120196 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-catalog-content\") pod \"redhat-operators-ppgvs\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:18 crc kubenswrapper[5033]: I0226 17:31:18.140680 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbl8s\" (UniqueName: \"kubernetes.io/projected/739f8561-8333-4d54-b008-f7d70bb7ed99-kube-api-access-jbl8s\") pod \"redhat-operators-ppgvs\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:18 crc kubenswrapper[5033]: I0226 17:31:18.220234 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:18 crc kubenswrapper[5033]: I0226 17:31:18.672047 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppgvs"] Feb 26 17:31:18 crc kubenswrapper[5033]: I0226 17:31:18.707763 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppgvs" event={"ID":"739f8561-8333-4d54-b008-f7d70bb7ed99","Type":"ContainerStarted","Data":"c4171ff3c7034b333588eb27c38e3ac65805695d23266c50e7ff88caa8420959"} Feb 26 17:31:19 crc kubenswrapper[5033]: I0226 17:31:19.719182 5033 generic.go:334] "Generic (PLEG): container finished" podID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerID="25c7a760c51de73ca27f16c4de34e7602787e57dbf3abd3c1563955b982b698b" exitCode=0 Feb 26 17:31:19 crc kubenswrapper[5033]: I0226 17:31:19.719289 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppgvs" event={"ID":"739f8561-8333-4d54-b008-f7d70bb7ed99","Type":"ContainerDied","Data":"25c7a760c51de73ca27f16c4de34e7602787e57dbf3abd3c1563955b982b698b"} Feb 26 17:31:21 crc kubenswrapper[5033]: I0226 17:31:21.740652 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppgvs" event={"ID":"739f8561-8333-4d54-b008-f7d70bb7ed99","Type":"ContainerStarted","Data":"7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15"} Feb 26 17:31:22 crc kubenswrapper[5033]: I0226 17:31:22.628358 5033 scope.go:117] 
"RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:31:22 crc kubenswrapper[5033]: E0226 17:31:22.628922 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:31:30 crc kubenswrapper[5033]: I0226 17:31:30.062011 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-k6nsk"] Feb 26 17:31:30 crc kubenswrapper[5033]: I0226 17:31:30.071499 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-k6nsk"] Feb 26 17:31:30 crc kubenswrapper[5033]: I0226 17:31:30.851067 5033 generic.go:334] "Generic (PLEG): container finished" podID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerID="7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15" exitCode=0 Feb 26 17:31:30 crc kubenswrapper[5033]: I0226 17:31:30.851148 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppgvs" event={"ID":"739f8561-8333-4d54-b008-f7d70bb7ed99","Type":"ContainerDied","Data":"7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15"} Feb 26 17:31:32 crc kubenswrapper[5033]: I0226 17:31:32.067606 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2239b58b-057e-49c2-8af4-1831f198a780" path="/var/lib/kubelet/pods/2239b58b-057e-49c2-8af4-1831f198a780/volumes" Feb 26 17:31:32 crc kubenswrapper[5033]: I0226 17:31:32.070923 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-3b95-account-create-update-25g8v"] Feb 26 17:31:32 crc kubenswrapper[5033]: I0226 17:31:32.070971 5033 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/octavia-3b95-account-create-update-25g8v"] Feb 26 17:31:32 crc kubenswrapper[5033]: I0226 17:31:32.878877 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppgvs" event={"ID":"739f8561-8333-4d54-b008-f7d70bb7ed99","Type":"ContainerStarted","Data":"f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e"} Feb 26 17:31:32 crc kubenswrapper[5033]: I0226 17:31:32.900729 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ppgvs" podStartSLOduration=4.181749285 podStartE2EDuration="15.900708907s" podCreationTimestamp="2026-02-26 17:31:17 +0000 UTC" firstStartedPulling="2026-02-26 17:31:19.721706833 +0000 UTC m=+6606.934305483" lastFinishedPulling="2026-02-26 17:31:31.440666445 +0000 UTC m=+6618.653265105" observedRunningTime="2026-02-26 17:31:32.895617593 +0000 UTC m=+6620.108216253" watchObservedRunningTime="2026-02-26 17:31:32.900708907 +0000 UTC m=+6620.113307557" Feb 26 17:31:33 crc kubenswrapper[5033]: I0226 17:31:33.656067 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8545ace8-57f4-47fa-9512-ce48b7998050" path="/var/lib/kubelet/pods/8545ace8-57f4-47fa-9512-ce48b7998050/volumes" Feb 26 17:31:35 crc kubenswrapper[5033]: I0226 17:31:35.628035 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:31:35 crc kubenswrapper[5033]: E0226 17:31:35.628603 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:31:37 crc 
kubenswrapper[5033]: I0226 17:31:37.060086 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-qmq7p"] Feb 26 17:31:37 crc kubenswrapper[5033]: I0226 17:31:37.072156 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-qmq7p"] Feb 26 17:31:37 crc kubenswrapper[5033]: I0226 17:31:37.638620 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2" path="/var/lib/kubelet/pods/ac5d4f5e-e12e-46a1-bac1-cbc8c0aae9d2/volumes" Feb 26 17:31:38 crc kubenswrapper[5033]: I0226 17:31:38.220908 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:38 crc kubenswrapper[5033]: I0226 17:31:38.221003 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:39 crc kubenswrapper[5033]: I0226 17:31:39.033790 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-65e6-account-create-update-fd8v5"] Feb 26 17:31:39 crc kubenswrapper[5033]: I0226 17:31:39.048363 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-65e6-account-create-update-fd8v5"] Feb 26 17:31:39 crc kubenswrapper[5033]: I0226 17:31:39.268512 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ppgvs" podUID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerName="registry-server" probeResult="failure" output=< Feb 26 17:31:39 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Feb 26 17:31:39 crc kubenswrapper[5033]: > Feb 26 17:31:39 crc kubenswrapper[5033]: I0226 17:31:39.687535 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaedf48b-90d4-4bb6-bf73-51c442ea1a9b" path="/var/lib/kubelet/pods/aaedf48b-90d4-4bb6-bf73-51c442ea1a9b/volumes" Feb 26 17:31:39 crc 
kubenswrapper[5033]: I0226 17:31:39.763321 5033 scope.go:117] "RemoveContainer" containerID="e000ca1d86c14205abcae6a92631b771520e3ed0773a2cb4d0ac17dd0504d69e" Feb 26 17:31:39 crc kubenswrapper[5033]: I0226 17:31:39.825052 5033 scope.go:117] "RemoveContainer" containerID="c15f8ca1196db9ebb1f324d1e18c4ae728bae44225bcca4c0f647687b8a973c7" Feb 26 17:31:39 crc kubenswrapper[5033]: I0226 17:31:39.863442 5033 scope.go:117] "RemoveContainer" containerID="fa683db8ff4cb1baad9ad3fa63b1f0c099d8dc9559ebce37cb20e43c039e2470" Feb 26 17:31:39 crc kubenswrapper[5033]: I0226 17:31:39.931417 5033 scope.go:117] "RemoveContainer" containerID="2cba4e78489b19417a4be2c8cde5f88743f7ab972c93d7958b2a1238dee33927" Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.707569 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f8g4k"] Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.710515 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.724958 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8g4k"] Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.820553 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-catalog-content\") pod \"community-operators-f8g4k\" (UID: \"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.821079 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wrzh\" (UniqueName: \"kubernetes.io/projected/30cfc89a-8612-4d37-91e1-67c60f85d15d-kube-api-access-2wrzh\") pod \"community-operators-f8g4k\" (UID: 
\"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.821609 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-utilities\") pod \"community-operators-f8g4k\" (UID: \"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.923301 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-catalog-content\") pod \"community-operators-f8g4k\" (UID: \"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.923458 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wrzh\" (UniqueName: \"kubernetes.io/projected/30cfc89a-8612-4d37-91e1-67c60f85d15d-kube-api-access-2wrzh\") pod \"community-operators-f8g4k\" (UID: \"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.923533 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-utilities\") pod \"community-operators-f8g4k\" (UID: \"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.923875 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-catalog-content\") pod \"community-operators-f8g4k\" (UID: 
\"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.923930 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-utilities\") pod \"community-operators-f8g4k\" (UID: \"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:43 crc kubenswrapper[5033]: I0226 17:31:43.957773 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wrzh\" (UniqueName: \"kubernetes.io/projected/30cfc89a-8612-4d37-91e1-67c60f85d15d-kube-api-access-2wrzh\") pod \"community-operators-f8g4k\" (UID: \"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:44 crc kubenswrapper[5033]: I0226 17:31:44.044835 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:44 crc kubenswrapper[5033]: I0226 17:31:44.526016 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8g4k"] Feb 26 17:31:45 crc kubenswrapper[5033]: I0226 17:31:45.017088 5033 generic.go:334] "Generic (PLEG): container finished" podID="30cfc89a-8612-4d37-91e1-67c60f85d15d" containerID="d2dc60b08ec661d941a16dea1bb5dcc39b18f38077bf1e2282a051cf38595a94" exitCode=0 Feb 26 17:31:45 crc kubenswrapper[5033]: I0226 17:31:45.017321 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8g4k" event={"ID":"30cfc89a-8612-4d37-91e1-67c60f85d15d","Type":"ContainerDied","Data":"d2dc60b08ec661d941a16dea1bb5dcc39b18f38077bf1e2282a051cf38595a94"} Feb 26 17:31:45 crc kubenswrapper[5033]: I0226 17:31:45.017443 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8g4k" 
event={"ID":"30cfc89a-8612-4d37-91e1-67c60f85d15d","Type":"ContainerStarted","Data":"861963e80caa2e1e146714878056c6df2c8a8aedd73dd9443a1e26dcfb533444"} Feb 26 17:31:46 crc kubenswrapper[5033]: I0226 17:31:46.031829 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8g4k" event={"ID":"30cfc89a-8612-4d37-91e1-67c60f85d15d","Type":"ContainerStarted","Data":"b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764"} Feb 26 17:31:46 crc kubenswrapper[5033]: I0226 17:31:46.627980 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:31:46 crc kubenswrapper[5033]: E0226 17:31:46.629205 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:31:48 crc kubenswrapper[5033]: I0226 17:31:48.051052 5033 generic.go:334] "Generic (PLEG): container finished" podID="30cfc89a-8612-4d37-91e1-67c60f85d15d" containerID="b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764" exitCode=0 Feb 26 17:31:48 crc kubenswrapper[5033]: I0226 17:31:48.051063 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8g4k" event={"ID":"30cfc89a-8612-4d37-91e1-67c60f85d15d","Type":"ContainerDied","Data":"b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764"} Feb 26 17:31:48 crc kubenswrapper[5033]: I0226 17:31:48.054684 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 17:31:49 crc kubenswrapper[5033]: I0226 17:31:49.080426 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-f8g4k" event={"ID":"30cfc89a-8612-4d37-91e1-67c60f85d15d","Type":"ContainerStarted","Data":"b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673"} Feb 26 17:31:49 crc kubenswrapper[5033]: I0226 17:31:49.108677 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f8g4k" podStartSLOduration=2.525267539 podStartE2EDuration="6.108649165s" podCreationTimestamp="2026-02-26 17:31:43 +0000 UTC" firstStartedPulling="2026-02-26 17:31:45.020603762 +0000 UTC m=+6632.233202432" lastFinishedPulling="2026-02-26 17:31:48.603985408 +0000 UTC m=+6635.816584058" observedRunningTime="2026-02-26 17:31:49.098067286 +0000 UTC m=+6636.310665936" watchObservedRunningTime="2026-02-26 17:31:49.108649165 +0000 UTC m=+6636.321247815" Feb 26 17:31:49 crc kubenswrapper[5033]: I0226 17:31:49.273256 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ppgvs" podUID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerName="registry-server" probeResult="failure" output=< Feb 26 17:31:49 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Feb 26 17:31:49 crc kubenswrapper[5033]: > Feb 26 17:31:54 crc kubenswrapper[5033]: I0226 17:31:54.045663 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:54 crc kubenswrapper[5033]: I0226 17:31:54.046300 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:54 crc kubenswrapper[5033]: I0226 17:31:54.099216 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:54 crc kubenswrapper[5033]: I0226 17:31:54.189899 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:54 crc kubenswrapper[5033]: I0226 17:31:54.342667 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8g4k"] Feb 26 17:31:56 crc kubenswrapper[5033]: I0226 17:31:56.154299 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f8g4k" podUID="30cfc89a-8612-4d37-91e1-67c60f85d15d" containerName="registry-server" containerID="cri-o://b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673" gracePeriod=2 Feb 26 17:31:56 crc kubenswrapper[5033]: I0226 17:31:56.703988 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:56 crc kubenswrapper[5033]: I0226 17:31:56.814931 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-catalog-content\") pod \"30cfc89a-8612-4d37-91e1-67c60f85d15d\" (UID: \"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " Feb 26 17:31:56 crc kubenswrapper[5033]: I0226 17:31:56.815133 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wrzh\" (UniqueName: \"kubernetes.io/projected/30cfc89a-8612-4d37-91e1-67c60f85d15d-kube-api-access-2wrzh\") pod \"30cfc89a-8612-4d37-91e1-67c60f85d15d\" (UID: \"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " Feb 26 17:31:56 crc kubenswrapper[5033]: I0226 17:31:56.815248 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-utilities\") pod \"30cfc89a-8612-4d37-91e1-67c60f85d15d\" (UID: \"30cfc89a-8612-4d37-91e1-67c60f85d15d\") " Feb 26 17:31:56 crc kubenswrapper[5033]: I0226 17:31:56.816516 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-utilities" (OuterVolumeSpecName: "utilities") pod "30cfc89a-8612-4d37-91e1-67c60f85d15d" (UID: "30cfc89a-8612-4d37-91e1-67c60f85d15d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:31:56 crc kubenswrapper[5033]: I0226 17:31:56.831521 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30cfc89a-8612-4d37-91e1-67c60f85d15d-kube-api-access-2wrzh" (OuterVolumeSpecName: "kube-api-access-2wrzh") pod "30cfc89a-8612-4d37-91e1-67c60f85d15d" (UID: "30cfc89a-8612-4d37-91e1-67c60f85d15d"). InnerVolumeSpecName "kube-api-access-2wrzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:31:56 crc kubenswrapper[5033]: I0226 17:31:56.875690 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30cfc89a-8612-4d37-91e1-67c60f85d15d" (UID: "30cfc89a-8612-4d37-91e1-67c60f85d15d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:31:56 crc kubenswrapper[5033]: I0226 17:31:56.917833 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wrzh\" (UniqueName: \"kubernetes.io/projected/30cfc89a-8612-4d37-91e1-67c60f85d15d-kube-api-access-2wrzh\") on node \"crc\" DevicePath \"\"" Feb 26 17:31:56 crc kubenswrapper[5033]: I0226 17:31:56.917869 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:31:56 crc kubenswrapper[5033]: I0226 17:31:56.917888 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30cfc89a-8612-4d37-91e1-67c60f85d15d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.166692 5033 generic.go:334] "Generic (PLEG): container finished" podID="30cfc89a-8612-4d37-91e1-67c60f85d15d" containerID="b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673" exitCode=0 Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.166876 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8g4k" event={"ID":"30cfc89a-8612-4d37-91e1-67c60f85d15d","Type":"ContainerDied","Data":"b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673"} Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.167005 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8g4k" event={"ID":"30cfc89a-8612-4d37-91e1-67c60f85d15d","Type":"ContainerDied","Data":"861963e80caa2e1e146714878056c6df2c8a8aedd73dd9443a1e26dcfb533444"} Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.167030 5033 scope.go:117] "RemoveContainer" containerID="b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673" Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 
17:31:57.166930 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8g4k" Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.195185 5033 scope.go:117] "RemoveContainer" containerID="b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764" Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.216526 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8g4k"] Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.231691 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f8g4k"] Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.237059 5033 scope.go:117] "RemoveContainer" containerID="d2dc60b08ec661d941a16dea1bb5dcc39b18f38077bf1e2282a051cf38595a94" Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.291489 5033 scope.go:117] "RemoveContainer" containerID="b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673" Feb 26 17:31:57 crc kubenswrapper[5033]: E0226 17:31:57.292106 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673\": container with ID starting with b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673 not found: ID does not exist" containerID="b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673" Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.292150 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673"} err="failed to get container status \"b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673\": rpc error: code = NotFound desc = could not find container \"b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673\": container with ID starting with 
b2e2e8a3c8fcb5f22b05dfd236d8df0f892fbb01d7cd32bb7060fb8698cc4673 not found: ID does not exist" Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.292177 5033 scope.go:117] "RemoveContainer" containerID="b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764" Feb 26 17:31:57 crc kubenswrapper[5033]: E0226 17:31:57.292712 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764\": container with ID starting with b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764 not found: ID does not exist" containerID="b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764" Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.292735 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764"} err="failed to get container status \"b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764\": rpc error: code = NotFound desc = could not find container \"b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764\": container with ID starting with b4e5c0652de21fc8746ce5d7e9aed9334e1d9a01b3b784683e6fdedb946ee764 not found: ID does not exist" Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.292750 5033 scope.go:117] "RemoveContainer" containerID="d2dc60b08ec661d941a16dea1bb5dcc39b18f38077bf1e2282a051cf38595a94" Feb 26 17:31:57 crc kubenswrapper[5033]: E0226 17:31:57.293118 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2dc60b08ec661d941a16dea1bb5dcc39b18f38077bf1e2282a051cf38595a94\": container with ID starting with d2dc60b08ec661d941a16dea1bb5dcc39b18f38077bf1e2282a051cf38595a94 not found: ID does not exist" containerID="d2dc60b08ec661d941a16dea1bb5dcc39b18f38077bf1e2282a051cf38595a94" Feb 26 17:31:57 crc 
kubenswrapper[5033]: I0226 17:31:57.293139 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2dc60b08ec661d941a16dea1bb5dcc39b18f38077bf1e2282a051cf38595a94"} err="failed to get container status \"d2dc60b08ec661d941a16dea1bb5dcc39b18f38077bf1e2282a051cf38595a94\": rpc error: code = NotFound desc = could not find container \"d2dc60b08ec661d941a16dea1bb5dcc39b18f38077bf1e2282a051cf38595a94\": container with ID starting with d2dc60b08ec661d941a16dea1bb5dcc39b18f38077bf1e2282a051cf38595a94 not found: ID does not exist" Feb 26 17:31:57 crc kubenswrapper[5033]: I0226 17:31:57.640200 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30cfc89a-8612-4d37-91e1-67c60f85d15d" path="/var/lib/kubelet/pods/30cfc89a-8612-4d37-91e1-67c60f85d15d/volumes" Feb 26 17:31:58 crc kubenswrapper[5033]: I0226 17:31:58.272755 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:58 crc kubenswrapper[5033]: I0226 17:31:58.334389 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:31:58 crc kubenswrapper[5033]: I0226 17:31:58.628790 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:31:58 crc kubenswrapper[5033]: E0226 17:31:58.629056 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:31:59 crc kubenswrapper[5033]: I0226 17:31:59.743189 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-ppgvs"] Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.145074 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535452-g5x6d"] Feb 26 17:32:00 crc kubenswrapper[5033]: E0226 17:32:00.145660 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30cfc89a-8612-4d37-91e1-67c60f85d15d" containerName="registry-server" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.145684 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="30cfc89a-8612-4d37-91e1-67c60f85d15d" containerName="registry-server" Feb 26 17:32:00 crc kubenswrapper[5033]: E0226 17:32:00.145698 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30cfc89a-8612-4d37-91e1-67c60f85d15d" containerName="extract-content" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.145706 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="30cfc89a-8612-4d37-91e1-67c60f85d15d" containerName="extract-content" Feb 26 17:32:00 crc kubenswrapper[5033]: E0226 17:32:00.145727 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30cfc89a-8612-4d37-91e1-67c60f85d15d" containerName="extract-utilities" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.145736 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="30cfc89a-8612-4d37-91e1-67c60f85d15d" containerName="extract-utilities" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.146319 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="30cfc89a-8612-4d37-91e1-67c60f85d15d" containerName="registry-server" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.147241 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535452-g5x6d" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.149783 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.149841 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.150123 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.166575 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535452-g5x6d"] Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.196089 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ppgvs" podUID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerName="registry-server" containerID="cri-o://f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e" gracePeriod=2 Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.286902 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzqbj\" (UniqueName: \"kubernetes.io/projected/3228144b-cd97-4d8f-af54-67b0dfe1cfbe-kube-api-access-fzqbj\") pod \"auto-csr-approver-29535452-g5x6d\" (UID: \"3228144b-cd97-4d8f-af54-67b0dfe1cfbe\") " pod="openshift-infra/auto-csr-approver-29535452-g5x6d" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.389064 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzqbj\" (UniqueName: \"kubernetes.io/projected/3228144b-cd97-4d8f-af54-67b0dfe1cfbe-kube-api-access-fzqbj\") pod \"auto-csr-approver-29535452-g5x6d\" (UID: \"3228144b-cd97-4d8f-af54-67b0dfe1cfbe\") " 
pod="openshift-infra/auto-csr-approver-29535452-g5x6d" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.413657 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzqbj\" (UniqueName: \"kubernetes.io/projected/3228144b-cd97-4d8f-af54-67b0dfe1cfbe-kube-api-access-fzqbj\") pod \"auto-csr-approver-29535452-g5x6d\" (UID: \"3228144b-cd97-4d8f-af54-67b0dfe1cfbe\") " pod="openshift-infra/auto-csr-approver-29535452-g5x6d" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.474163 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535452-g5x6d" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.715460 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.812966 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbl8s\" (UniqueName: \"kubernetes.io/projected/739f8561-8333-4d54-b008-f7d70bb7ed99-kube-api-access-jbl8s\") pod \"739f8561-8333-4d54-b008-f7d70bb7ed99\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.813087 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-catalog-content\") pod \"739f8561-8333-4d54-b008-f7d70bb7ed99\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.813120 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-utilities\") pod \"739f8561-8333-4d54-b008-f7d70bb7ed99\" (UID: \"739f8561-8333-4d54-b008-f7d70bb7ed99\") " Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.814554 5033 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-utilities" (OuterVolumeSpecName: "utilities") pod "739f8561-8333-4d54-b008-f7d70bb7ed99" (UID: "739f8561-8333-4d54-b008-f7d70bb7ed99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.821002 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739f8561-8333-4d54-b008-f7d70bb7ed99-kube-api-access-jbl8s" (OuterVolumeSpecName: "kube-api-access-jbl8s") pod "739f8561-8333-4d54-b008-f7d70bb7ed99" (UID: "739f8561-8333-4d54-b008-f7d70bb7ed99"). InnerVolumeSpecName "kube-api-access-jbl8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.915879 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbl8s\" (UniqueName: \"kubernetes.io/projected/739f8561-8333-4d54-b008-f7d70bb7ed99-kube-api-access-jbl8s\") on node \"crc\" DevicePath \"\"" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.915923 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:32:00 crc kubenswrapper[5033]: I0226 17:32:00.941369 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "739f8561-8333-4d54-b008-f7d70bb7ed99" (UID: "739f8561-8333-4d54-b008-f7d70bb7ed99"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.007681 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535452-g5x6d"] Feb 26 17:32:01 crc kubenswrapper[5033]: W0226 17:32:01.008229 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3228144b_cd97_4d8f_af54_67b0dfe1cfbe.slice/crio-f90c155020a374be855cfe6d66e995e80315adbf861fb4cae91f1e38f786694d WatchSource:0}: Error finding container f90c155020a374be855cfe6d66e995e80315adbf861fb4cae91f1e38f786694d: Status 404 returned error can't find the container with id f90c155020a374be855cfe6d66e995e80315adbf861fb4cae91f1e38f786694d Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.017788 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/739f8561-8333-4d54-b008-f7d70bb7ed99-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.209278 5033 generic.go:334] "Generic (PLEG): container finished" podID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerID="f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e" exitCode=0 Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.209426 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppgvs" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.209409 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppgvs" event={"ID":"739f8561-8333-4d54-b008-f7d70bb7ed99","Type":"ContainerDied","Data":"f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e"} Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.209579 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppgvs" event={"ID":"739f8561-8333-4d54-b008-f7d70bb7ed99","Type":"ContainerDied","Data":"c4171ff3c7034b333588eb27c38e3ac65805695d23266c50e7ff88caa8420959"} Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.209603 5033 scope.go:117] "RemoveContainer" containerID="f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.211868 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535452-g5x6d" event={"ID":"3228144b-cd97-4d8f-af54-67b0dfe1cfbe","Type":"ContainerStarted","Data":"f90c155020a374be855cfe6d66e995e80315adbf861fb4cae91f1e38f786694d"} Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.232798 5033 scope.go:117] "RemoveContainer" containerID="7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.250383 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppgvs"] Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.262003 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ppgvs"] Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.275183 5033 scope.go:117] "RemoveContainer" containerID="25c7a760c51de73ca27f16c4de34e7602787e57dbf3abd3c1563955b982b698b" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.296128 5033 scope.go:117] 
"RemoveContainer" containerID="f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e" Feb 26 17:32:01 crc kubenswrapper[5033]: E0226 17:32:01.296563 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e\": container with ID starting with f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e not found: ID does not exist" containerID="f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.296598 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e"} err="failed to get container status \"f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e\": rpc error: code = NotFound desc = could not find container \"f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e\": container with ID starting with f45fef73d5d48b24d8cc99b42270aab90604bf005c9e59b83a9ac133ff90637e not found: ID does not exist" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.296619 5033 scope.go:117] "RemoveContainer" containerID="7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15" Feb 26 17:32:01 crc kubenswrapper[5033]: E0226 17:32:01.296935 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15\": container with ID starting with 7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15 not found: ID does not exist" containerID="7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.296959 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15"} err="failed to get container status \"7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15\": rpc error: code = NotFound desc = could not find container \"7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15\": container with ID starting with 7d70b8bdf93a70a78c7ce302e2fbd3064f3fc5abfddf8f04a7e8652041ecbc15 not found: ID does not exist" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.296972 5033 scope.go:117] "RemoveContainer" containerID="25c7a760c51de73ca27f16c4de34e7602787e57dbf3abd3c1563955b982b698b" Feb 26 17:32:01 crc kubenswrapper[5033]: E0226 17:32:01.297214 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25c7a760c51de73ca27f16c4de34e7602787e57dbf3abd3c1563955b982b698b\": container with ID starting with 25c7a760c51de73ca27f16c4de34e7602787e57dbf3abd3c1563955b982b698b not found: ID does not exist" containerID="25c7a760c51de73ca27f16c4de34e7602787e57dbf3abd3c1563955b982b698b" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.297238 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c7a760c51de73ca27f16c4de34e7602787e57dbf3abd3c1563955b982b698b"} err="failed to get container status \"25c7a760c51de73ca27f16c4de34e7602787e57dbf3abd3c1563955b982b698b\": rpc error: code = NotFound desc = could not find container \"25c7a760c51de73ca27f16c4de34e7602787e57dbf3abd3c1563955b982b698b\": container with ID starting with 25c7a760c51de73ca27f16c4de34e7602787e57dbf3abd3c1563955b982b698b not found: ID does not exist" Feb 26 17:32:01 crc kubenswrapper[5033]: I0226 17:32:01.647573 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="739f8561-8333-4d54-b008-f7d70bb7ed99" path="/var/lib/kubelet/pods/739f8561-8333-4d54-b008-f7d70bb7ed99/volumes" Feb 26 17:32:03 crc kubenswrapper[5033]: I0226 
17:32:03.235295 5033 generic.go:334] "Generic (PLEG): container finished" podID="3228144b-cd97-4d8f-af54-67b0dfe1cfbe" containerID="d232dc164fc76e2d0352f7976508e1dffd5dc08072585ad7b8651331f247354e" exitCode=0 Feb 26 17:32:03 crc kubenswrapper[5033]: I0226 17:32:03.235401 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535452-g5x6d" event={"ID":"3228144b-cd97-4d8f-af54-67b0dfe1cfbe","Type":"ContainerDied","Data":"d232dc164fc76e2d0352f7976508e1dffd5dc08072585ad7b8651331f247354e"} Feb 26 17:32:04 crc kubenswrapper[5033]: I0226 17:32:04.654461 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535452-g5x6d" Feb 26 17:32:04 crc kubenswrapper[5033]: I0226 17:32:04.818809 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzqbj\" (UniqueName: \"kubernetes.io/projected/3228144b-cd97-4d8f-af54-67b0dfe1cfbe-kube-api-access-fzqbj\") pod \"3228144b-cd97-4d8f-af54-67b0dfe1cfbe\" (UID: \"3228144b-cd97-4d8f-af54-67b0dfe1cfbe\") " Feb 26 17:32:04 crc kubenswrapper[5033]: I0226 17:32:04.827878 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3228144b-cd97-4d8f-af54-67b0dfe1cfbe-kube-api-access-fzqbj" (OuterVolumeSpecName: "kube-api-access-fzqbj") pod "3228144b-cd97-4d8f-af54-67b0dfe1cfbe" (UID: "3228144b-cd97-4d8f-af54-67b0dfe1cfbe"). InnerVolumeSpecName "kube-api-access-fzqbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:32:04 crc kubenswrapper[5033]: I0226 17:32:04.922106 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzqbj\" (UniqueName: \"kubernetes.io/projected/3228144b-cd97-4d8f-af54-67b0dfe1cfbe-kube-api-access-fzqbj\") on node \"crc\" DevicePath \"\"" Feb 26 17:32:05 crc kubenswrapper[5033]: I0226 17:32:05.258666 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535452-g5x6d" event={"ID":"3228144b-cd97-4d8f-af54-67b0dfe1cfbe","Type":"ContainerDied","Data":"f90c155020a374be855cfe6d66e995e80315adbf861fb4cae91f1e38f786694d"} Feb 26 17:32:05 crc kubenswrapper[5033]: I0226 17:32:05.258713 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535452-g5x6d" Feb 26 17:32:05 crc kubenswrapper[5033]: I0226 17:32:05.258716 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f90c155020a374be855cfe6d66e995e80315adbf861fb4cae91f1e38f786694d" Feb 26 17:32:05 crc kubenswrapper[5033]: I0226 17:32:05.725693 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535446-ldhxw"] Feb 26 17:32:05 crc kubenswrapper[5033]: I0226 17:32:05.736181 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535446-ldhxw"] Feb 26 17:32:07 crc kubenswrapper[5033]: I0226 17:32:07.658227 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="595a35be-ffc3-4329-a937-c9f2c09e4612" path="/var/lib/kubelet/pods/595a35be-ffc3-4329-a937-c9f2c09e4612/volumes" Feb 26 17:32:11 crc kubenswrapper[5033]: I0226 17:32:11.629164 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:32:11 crc kubenswrapper[5033]: E0226 17:32:11.630336 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:32:23 crc kubenswrapper[5033]: I0226 17:32:23.638953 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:32:23 crc kubenswrapper[5033]: E0226 17:32:23.639991 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:32:25 crc kubenswrapper[5033]: I0226 17:32:25.056657 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-qhr4l"] Feb 26 17:32:25 crc kubenswrapper[5033]: I0226 17:32:25.069173 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-qhr4l"] Feb 26 17:32:25 crc kubenswrapper[5033]: I0226 17:32:25.646607 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f54be881-caca-4fe5-ba60-88b0c159d47e" path="/var/lib/kubelet/pods/f54be881-caca-4fe5-ba60-88b0c159d47e/volumes" Feb 26 17:32:36 crc kubenswrapper[5033]: I0226 17:32:36.628110 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:32:36 crc kubenswrapper[5033]: E0226 17:32:36.629137 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:32:40 crc kubenswrapper[5033]: I0226 17:32:40.110251 5033 scope.go:117] "RemoveContainer" containerID="10ca6f5a093084dcced6dcf596d7369be40ac7b32d8c944f3cbc327ca04bc3bd" Feb 26 17:32:40 crc kubenswrapper[5033]: I0226 17:32:40.160080 5033 scope.go:117] "RemoveContainer" containerID="f5e6ceeeb50b83c8b94a8f75de6be0aa7e5bcbc14494dc37e4d10ee8d97cefff" Feb 26 17:32:40 crc kubenswrapper[5033]: I0226 17:32:40.256281 5033 scope.go:117] "RemoveContainer" containerID="84c35f456f4001417f80fbd7e7b9e7cd21437e985ce94aee13b5762476da2342" Feb 26 17:32:51 crc kubenswrapper[5033]: I0226 17:32:51.628985 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:32:51 crc kubenswrapper[5033]: E0226 17:32:51.632138 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:33:04 crc kubenswrapper[5033]: I0226 17:33:04.628550 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:33:04 crc kubenswrapper[5033]: E0226 17:33:04.629378 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:33:19 crc kubenswrapper[5033]: I0226 17:33:19.627935 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:33:19 crc kubenswrapper[5033]: E0226 17:33:19.628689 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:33:31 crc kubenswrapper[5033]: I0226 17:33:31.629795 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:33:32 crc kubenswrapper[5033]: I0226 17:33:32.325597 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"5c869df2310547cca4ee2ef5bde1af9e7b98c69624bcb7a27c989f4e0ced38f4"} Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.164745 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535454-cv4fn"] Feb 26 17:34:00 crc kubenswrapper[5033]: E0226 17:34:00.165752 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerName="registry-server" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.165767 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerName="registry-server" Feb 26 17:34:00 crc 
kubenswrapper[5033]: E0226 17:34:00.165790 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerName="extract-content" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.165799 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerName="extract-content" Feb 26 17:34:00 crc kubenswrapper[5033]: E0226 17:34:00.165829 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerName="extract-utilities" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.165837 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerName="extract-utilities" Feb 26 17:34:00 crc kubenswrapper[5033]: E0226 17:34:00.165852 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3228144b-cd97-4d8f-af54-67b0dfe1cfbe" containerName="oc" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.165857 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3228144b-cd97-4d8f-af54-67b0dfe1cfbe" containerName="oc" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.166062 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3228144b-cd97-4d8f-af54-67b0dfe1cfbe" containerName="oc" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.166075 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="739f8561-8333-4d54-b008-f7d70bb7ed99" containerName="registry-server" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.166821 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535454-cv4fn" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.169865 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.170043 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.170200 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.178738 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535454-cv4fn"] Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.256889 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n25m\" (UniqueName: \"kubernetes.io/projected/ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853-kube-api-access-4n25m\") pod \"auto-csr-approver-29535454-cv4fn\" (UID: \"ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853\") " pod="openshift-infra/auto-csr-approver-29535454-cv4fn" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.359447 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n25m\" (UniqueName: \"kubernetes.io/projected/ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853-kube-api-access-4n25m\") pod \"auto-csr-approver-29535454-cv4fn\" (UID: \"ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853\") " pod="openshift-infra/auto-csr-approver-29535454-cv4fn" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.383126 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n25m\" (UniqueName: \"kubernetes.io/projected/ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853-kube-api-access-4n25m\") pod \"auto-csr-approver-29535454-cv4fn\" (UID: \"ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853\") " 
pod="openshift-infra/auto-csr-approver-29535454-cv4fn" Feb 26 17:34:00 crc kubenswrapper[5033]: I0226 17:34:00.507577 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535454-cv4fn" Feb 26 17:34:01 crc kubenswrapper[5033]: I0226 17:34:01.078275 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535454-cv4fn"] Feb 26 17:34:01 crc kubenswrapper[5033]: I0226 17:34:01.645175 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535454-cv4fn" event={"ID":"ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853","Type":"ContainerStarted","Data":"37243835f4316bf3afae52de5d4d34cfcde7fe56b0949caab30e6b2052ae71ad"} Feb 26 17:34:02 crc kubenswrapper[5033]: I0226 17:34:02.656194 5033 generic.go:334] "Generic (PLEG): container finished" podID="ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853" containerID="3d82343c61dffb25d61a7636e31eed4a397e1b53880881aff0c03e5950a24796" exitCode=0 Feb 26 17:34:02 crc kubenswrapper[5033]: I0226 17:34:02.656235 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535454-cv4fn" event={"ID":"ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853","Type":"ContainerDied","Data":"3d82343c61dffb25d61a7636e31eed4a397e1b53880881aff0c03e5950a24796"} Feb 26 17:34:04 crc kubenswrapper[5033]: I0226 17:34:04.023870 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535454-cv4fn" Feb 26 17:34:04 crc kubenswrapper[5033]: I0226 17:34:04.144082 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n25m\" (UniqueName: \"kubernetes.io/projected/ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853-kube-api-access-4n25m\") pod \"ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853\" (UID: \"ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853\") " Feb 26 17:34:04 crc kubenswrapper[5033]: I0226 17:34:04.150808 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853-kube-api-access-4n25m" (OuterVolumeSpecName: "kube-api-access-4n25m") pod "ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853" (UID: "ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853"). InnerVolumeSpecName "kube-api-access-4n25m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:34:04 crc kubenswrapper[5033]: I0226 17:34:04.247733 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n25m\" (UniqueName: \"kubernetes.io/projected/ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853-kube-api-access-4n25m\") on node \"crc\" DevicePath \"\"" Feb 26 17:34:04 crc kubenswrapper[5033]: I0226 17:34:04.677544 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535454-cv4fn" event={"ID":"ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853","Type":"ContainerDied","Data":"37243835f4316bf3afae52de5d4d34cfcde7fe56b0949caab30e6b2052ae71ad"} Feb 26 17:34:04 crc kubenswrapper[5033]: I0226 17:34:04.677593 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37243835f4316bf3afae52de5d4d34cfcde7fe56b0949caab30e6b2052ae71ad" Feb 26 17:34:04 crc kubenswrapper[5033]: I0226 17:34:04.677632 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535454-cv4fn" Feb 26 17:34:05 crc kubenswrapper[5033]: I0226 17:34:05.108056 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535448-4nc4j"] Feb 26 17:34:05 crc kubenswrapper[5033]: I0226 17:34:05.116113 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535448-4nc4j"] Feb 26 17:34:05 crc kubenswrapper[5033]: I0226 17:34:05.665128 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82997903-13a2-41d6-91f5-3719bac90196" path="/var/lib/kubelet/pods/82997903-13a2-41d6-91f5-3719bac90196/volumes" Feb 26 17:34:40 crc kubenswrapper[5033]: I0226 17:34:40.411650 5033 scope.go:117] "RemoveContainer" containerID="e6d3d2638c9162311d6e2812fde55c34c3acd6e97c12ca0d8c686dacccc427e4" Feb 26 17:34:56 crc kubenswrapper[5033]: I0226 17:34:56.053210 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-dc7a-account-create-update-4c7gz"] Feb 26 17:34:56 crc kubenswrapper[5033]: I0226 17:34:56.065418 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-drxqc"] Feb 26 17:34:56 crc kubenswrapper[5033]: I0226 17:34:56.074894 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-drxqc"] Feb 26 17:34:56 crc kubenswrapper[5033]: I0226 17:34:56.085894 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-dc7a-account-create-update-4c7gz"] Feb 26 17:34:57 crc kubenswrapper[5033]: I0226 17:34:57.651157 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c9c0b9-1c57-4874-ba89-996a3658c80c" path="/var/lib/kubelet/pods/98c9c0b9-1c57-4874-ba89-996a3658c80c/volumes" Feb 26 17:34:57 crc kubenswrapper[5033]: I0226 17:34:57.654034 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f351bf98-a88f-4ff7-a97b-b86356235a46" 
path="/var/lib/kubelet/pods/f351bf98-a88f-4ff7-a97b-b86356235a46/volumes" Feb 26 17:35:12 crc kubenswrapper[5033]: I0226 17:35:12.066380 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-9htgj"] Feb 26 17:35:12 crc kubenswrapper[5033]: I0226 17:35:12.078876 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-9htgj"] Feb 26 17:35:13 crc kubenswrapper[5033]: I0226 17:35:13.645554 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8417b29d-8e5f-4f31-888c-668d457abeb3" path="/var/lib/kubelet/pods/8417b29d-8e5f-4f31-888c-668d457abeb3/volumes" Feb 26 17:35:40 crc kubenswrapper[5033]: I0226 17:35:40.508089 5033 scope.go:117] "RemoveContainer" containerID="c8598d83ee14863f97512851fb3659ad563715e39421b2d9750209b7c042d408" Feb 26 17:35:40 crc kubenswrapper[5033]: I0226 17:35:40.546392 5033 scope.go:117] "RemoveContainer" containerID="bfdb44787466711c09ce1c8137e64ccf7c6c391fb7a693e6157ba80b43dbd2f0" Feb 26 17:35:40 crc kubenswrapper[5033]: I0226 17:35:40.607970 5033 scope.go:117] "RemoveContainer" containerID="ee51e15235be0713b31ed96af90b25441df3ccde8551dc6b5d364bc445733518" Feb 26 17:35:50 crc kubenswrapper[5033]: I0226 17:35:50.189376 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:35:50 crc kubenswrapper[5033]: I0226 17:35:50.189845 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.148928 5033 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535456-p2rrp"] Feb 26 17:36:00 crc kubenswrapper[5033]: E0226 17:36:00.149927 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853" containerName="oc" Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.149941 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853" containerName="oc" Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.150200 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853" containerName="oc" Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.151087 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535456-p2rrp" Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.153332 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.153518 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.153641 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.165885 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535456-p2rrp"] Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.250345 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bshtx\" (UniqueName: \"kubernetes.io/projected/da172082-e5f5-4c27-bcee-a84478a084b0-kube-api-access-bshtx\") pod \"auto-csr-approver-29535456-p2rrp\" (UID: \"da172082-e5f5-4c27-bcee-a84478a084b0\") " pod="openshift-infra/auto-csr-approver-29535456-p2rrp" 
Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.352511 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bshtx\" (UniqueName: \"kubernetes.io/projected/da172082-e5f5-4c27-bcee-a84478a084b0-kube-api-access-bshtx\") pod \"auto-csr-approver-29535456-p2rrp\" (UID: \"da172082-e5f5-4c27-bcee-a84478a084b0\") " pod="openshift-infra/auto-csr-approver-29535456-p2rrp" Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.374305 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bshtx\" (UniqueName: \"kubernetes.io/projected/da172082-e5f5-4c27-bcee-a84478a084b0-kube-api-access-bshtx\") pod \"auto-csr-approver-29535456-p2rrp\" (UID: \"da172082-e5f5-4c27-bcee-a84478a084b0\") " pod="openshift-infra/auto-csr-approver-29535456-p2rrp" Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.492590 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535456-p2rrp" Feb 26 17:36:00 crc kubenswrapper[5033]: I0226 17:36:00.971715 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535456-p2rrp"] Feb 26 17:36:01 crc kubenswrapper[5033]: I0226 17:36:01.072588 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535456-p2rrp" event={"ID":"da172082-e5f5-4c27-bcee-a84478a084b0","Type":"ContainerStarted","Data":"54416199bf8bcccf665011c9398eae0b65950afe77a5378b6fd28c410d9ab5fd"} Feb 26 17:36:03 crc kubenswrapper[5033]: I0226 17:36:03.091856 5033 generic.go:334] "Generic (PLEG): container finished" podID="da172082-e5f5-4c27-bcee-a84478a084b0" containerID="92aa1ac73e90fbc4de46224b24cbbcc92757a6ee196bc76a2c473f6b03335a12" exitCode=0 Feb 26 17:36:03 crc kubenswrapper[5033]: I0226 17:36:03.091908 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535456-p2rrp" 
event={"ID":"da172082-e5f5-4c27-bcee-a84478a084b0","Type":"ContainerDied","Data":"92aa1ac73e90fbc4de46224b24cbbcc92757a6ee196bc76a2c473f6b03335a12"} Feb 26 17:36:04 crc kubenswrapper[5033]: I0226 17:36:04.525131 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535456-p2rrp" Feb 26 17:36:04 crc kubenswrapper[5033]: I0226 17:36:04.553633 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bshtx\" (UniqueName: \"kubernetes.io/projected/da172082-e5f5-4c27-bcee-a84478a084b0-kube-api-access-bshtx\") pod \"da172082-e5f5-4c27-bcee-a84478a084b0\" (UID: \"da172082-e5f5-4c27-bcee-a84478a084b0\") " Feb 26 17:36:04 crc kubenswrapper[5033]: I0226 17:36:04.563961 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da172082-e5f5-4c27-bcee-a84478a084b0-kube-api-access-bshtx" (OuterVolumeSpecName: "kube-api-access-bshtx") pod "da172082-e5f5-4c27-bcee-a84478a084b0" (UID: "da172082-e5f5-4c27-bcee-a84478a084b0"). InnerVolumeSpecName "kube-api-access-bshtx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:36:04 crc kubenswrapper[5033]: I0226 17:36:04.656189 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bshtx\" (UniqueName: \"kubernetes.io/projected/da172082-e5f5-4c27-bcee-a84478a084b0-kube-api-access-bshtx\") on node \"crc\" DevicePath \"\"" Feb 26 17:36:05 crc kubenswrapper[5033]: I0226 17:36:05.116065 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535456-p2rrp" event={"ID":"da172082-e5f5-4c27-bcee-a84478a084b0","Type":"ContainerDied","Data":"54416199bf8bcccf665011c9398eae0b65950afe77a5378b6fd28c410d9ab5fd"} Feb 26 17:36:05 crc kubenswrapper[5033]: I0226 17:36:05.116103 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54416199bf8bcccf665011c9398eae0b65950afe77a5378b6fd28c410d9ab5fd" Feb 26 17:36:05 crc kubenswrapper[5033]: I0226 17:36:05.116137 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535456-p2rrp" Feb 26 17:36:05 crc kubenswrapper[5033]: I0226 17:36:05.597624 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535450-wl4nv"] Feb 26 17:36:05 crc kubenswrapper[5033]: I0226 17:36:05.610871 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535450-wl4nv"] Feb 26 17:36:05 crc kubenswrapper[5033]: I0226 17:36:05.640521 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dba320fa-8dad-45ae-a794-964bbd3201c2" path="/var/lib/kubelet/pods/dba320fa-8dad-45ae-a794-964bbd3201c2/volumes" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.673604 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2qfdc"] Feb 26 17:36:10 crc kubenswrapper[5033]: E0226 17:36:10.674664 5033 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="da172082-e5f5-4c27-bcee-a84478a084b0" containerName="oc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.674681 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="da172082-e5f5-4c27-bcee-a84478a084b0" containerName="oc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.674962 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="da172082-e5f5-4c27-bcee-a84478a084b0" containerName="oc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.676869 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.684360 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-utilities\") pod \"redhat-marketplace-2qfdc\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.684458 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-catalog-content\") pod \"redhat-marketplace-2qfdc\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.684699 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72zf\" (UniqueName: \"kubernetes.io/projected/f01a4c64-7231-4168-bd0c-dfb38b894cff-kube-api-access-v72zf\") pod \"redhat-marketplace-2qfdc\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.705188 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-2qfdc"] Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.786816 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-catalog-content\") pod \"redhat-marketplace-2qfdc\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.786973 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72zf\" (UniqueName: \"kubernetes.io/projected/f01a4c64-7231-4168-bd0c-dfb38b894cff-kube-api-access-v72zf\") pod \"redhat-marketplace-2qfdc\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.787061 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-utilities\") pod \"redhat-marketplace-2qfdc\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.787446 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-catalog-content\") pod \"redhat-marketplace-2qfdc\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.787514 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-utilities\") pod \"redhat-marketplace-2qfdc\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " 
pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:10 crc kubenswrapper[5033]: I0226 17:36:10.812143 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72zf\" (UniqueName: \"kubernetes.io/projected/f01a4c64-7231-4168-bd0c-dfb38b894cff-kube-api-access-v72zf\") pod \"redhat-marketplace-2qfdc\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:11 crc kubenswrapper[5033]: I0226 17:36:11.004977 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:11 crc kubenswrapper[5033]: I0226 17:36:11.487228 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qfdc"] Feb 26 17:36:12 crc kubenswrapper[5033]: I0226 17:36:12.187042 5033 generic.go:334] "Generic (PLEG): container finished" podID="f01a4c64-7231-4168-bd0c-dfb38b894cff" containerID="1c7184c5ad6e375d438b1e7de7c8d56c4d0a3f4028b9d24b2c70aa3d0e5b4e7c" exitCode=0 Feb 26 17:36:12 crc kubenswrapper[5033]: I0226 17:36:12.187089 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qfdc" event={"ID":"f01a4c64-7231-4168-bd0c-dfb38b894cff","Type":"ContainerDied","Data":"1c7184c5ad6e375d438b1e7de7c8d56c4d0a3f4028b9d24b2c70aa3d0e5b4e7c"} Feb 26 17:36:12 crc kubenswrapper[5033]: I0226 17:36:12.187116 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qfdc" event={"ID":"f01a4c64-7231-4168-bd0c-dfb38b894cff","Type":"ContainerStarted","Data":"3c50df69adda09b568e4f318e2b5d2b6ea1be8f43fdece1ad6386e8f0180ab15"} Feb 26 17:36:13 crc kubenswrapper[5033]: I0226 17:36:13.197162 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qfdc" 
event={"ID":"f01a4c64-7231-4168-bd0c-dfb38b894cff","Type":"ContainerStarted","Data":"e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0"} Feb 26 17:36:14 crc kubenswrapper[5033]: I0226 17:36:14.209363 5033 generic.go:334] "Generic (PLEG): container finished" podID="f01a4c64-7231-4168-bd0c-dfb38b894cff" containerID="e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0" exitCode=0 Feb 26 17:36:14 crc kubenswrapper[5033]: I0226 17:36:14.209440 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qfdc" event={"ID":"f01a4c64-7231-4168-bd0c-dfb38b894cff","Type":"ContainerDied","Data":"e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0"} Feb 26 17:36:15 crc kubenswrapper[5033]: I0226 17:36:15.229133 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qfdc" event={"ID":"f01a4c64-7231-4168-bd0c-dfb38b894cff","Type":"ContainerStarted","Data":"629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5"} Feb 26 17:36:15 crc kubenswrapper[5033]: I0226 17:36:15.255775 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2qfdc" podStartSLOduration=2.523073395 podStartE2EDuration="5.255757401s" podCreationTimestamp="2026-02-26 17:36:10 +0000 UTC" firstStartedPulling="2026-02-26 17:36:12.189111997 +0000 UTC m=+6899.401710647" lastFinishedPulling="2026-02-26 17:36:14.921796003 +0000 UTC m=+6902.134394653" observedRunningTime="2026-02-26 17:36:15.249418116 +0000 UTC m=+6902.462016776" watchObservedRunningTime="2026-02-26 17:36:15.255757401 +0000 UTC m=+6902.468356051" Feb 26 17:36:20 crc kubenswrapper[5033]: I0226 17:36:20.189471 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 26 17:36:20 crc kubenswrapper[5033]: I0226 17:36:20.190030 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:36:21 crc kubenswrapper[5033]: I0226 17:36:21.006297 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:21 crc kubenswrapper[5033]: I0226 17:36:21.006641 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:21 crc kubenswrapper[5033]: I0226 17:36:21.080749 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:21 crc kubenswrapper[5033]: I0226 17:36:21.335128 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:21 crc kubenswrapper[5033]: I0226 17:36:21.385546 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qfdc"] Feb 26 17:36:23 crc kubenswrapper[5033]: I0226 17:36:23.302471 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2qfdc" podUID="f01a4c64-7231-4168-bd0c-dfb38b894cff" containerName="registry-server" containerID="cri-o://629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5" gracePeriod=2 Feb 26 17:36:23 crc kubenswrapper[5033]: I0226 17:36:23.812344 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:23 crc kubenswrapper[5033]: I0226 17:36:23.906559 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-utilities\") pod \"f01a4c64-7231-4168-bd0c-dfb38b894cff\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " Feb 26 17:36:23 crc kubenswrapper[5033]: I0226 17:36:23.906611 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v72zf\" (UniqueName: \"kubernetes.io/projected/f01a4c64-7231-4168-bd0c-dfb38b894cff-kube-api-access-v72zf\") pod \"f01a4c64-7231-4168-bd0c-dfb38b894cff\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " Feb 26 17:36:23 crc kubenswrapper[5033]: I0226 17:36:23.906850 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-catalog-content\") pod \"f01a4c64-7231-4168-bd0c-dfb38b894cff\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " Feb 26 17:36:23 crc kubenswrapper[5033]: I0226 17:36:23.907485 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-utilities" (OuterVolumeSpecName: "utilities") pod "f01a4c64-7231-4168-bd0c-dfb38b894cff" (UID: "f01a4c64-7231-4168-bd0c-dfb38b894cff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:36:23 crc kubenswrapper[5033]: I0226 17:36:23.907861 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:36:23 crc kubenswrapper[5033]: I0226 17:36:23.914600 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01a4c64-7231-4168-bd0c-dfb38b894cff-kube-api-access-v72zf" (OuterVolumeSpecName: "kube-api-access-v72zf") pod "f01a4c64-7231-4168-bd0c-dfb38b894cff" (UID: "f01a4c64-7231-4168-bd0c-dfb38b894cff"). InnerVolumeSpecName "kube-api-access-v72zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.010339 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v72zf\" (UniqueName: \"kubernetes.io/projected/f01a4c64-7231-4168-bd0c-dfb38b894cff-kube-api-access-v72zf\") on node \"crc\" DevicePath \"\"" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.111138 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f01a4c64-7231-4168-bd0c-dfb38b894cff" (UID: "f01a4c64-7231-4168-bd0c-dfb38b894cff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.111637 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-catalog-content\") pod \"f01a4c64-7231-4168-bd0c-dfb38b894cff\" (UID: \"f01a4c64-7231-4168-bd0c-dfb38b894cff\") " Feb 26 17:36:24 crc kubenswrapper[5033]: W0226 17:36:24.111797 5033 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f01a4c64-7231-4168-bd0c-dfb38b894cff/volumes/kubernetes.io~empty-dir/catalog-content Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.111872 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f01a4c64-7231-4168-bd0c-dfb38b894cff" (UID: "f01a4c64-7231-4168-bd0c-dfb38b894cff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.112545 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01a4c64-7231-4168-bd0c-dfb38b894cff-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.312947 5033 generic.go:334] "Generic (PLEG): container finished" podID="f01a4c64-7231-4168-bd0c-dfb38b894cff" containerID="629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5" exitCode=0 Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.313009 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qfdc" event={"ID":"f01a4c64-7231-4168-bd0c-dfb38b894cff","Type":"ContainerDied","Data":"629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5"} Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.313084 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qfdc" event={"ID":"f01a4c64-7231-4168-bd0c-dfb38b894cff","Type":"ContainerDied","Data":"3c50df69adda09b568e4f318e2b5d2b6ea1be8f43fdece1ad6386e8f0180ab15"} Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.313115 5033 scope.go:117] "RemoveContainer" containerID="629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.314393 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qfdc" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.336610 5033 scope.go:117] "RemoveContainer" containerID="e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.359562 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qfdc"] Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.366999 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qfdc"] Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.385578 5033 scope.go:117] "RemoveContainer" containerID="1c7184c5ad6e375d438b1e7de7c8d56c4d0a3f4028b9d24b2c70aa3d0e5b4e7c" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.424161 5033 scope.go:117] "RemoveContainer" containerID="629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5" Feb 26 17:36:24 crc kubenswrapper[5033]: E0226 17:36:24.424936 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5\": container with ID starting with 629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5 not found: ID does not exist" containerID="629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.424987 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5"} err="failed to get container status \"629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5\": rpc error: code = NotFound desc = could not find container \"629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5\": container with ID starting with 629bf9427e2b0ebafb615df5865e0cf4e9ed3ce108913a36a2695526596d0ad5 not found: 
ID does not exist" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.425021 5033 scope.go:117] "RemoveContainer" containerID="e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0" Feb 26 17:36:24 crc kubenswrapper[5033]: E0226 17:36:24.425745 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0\": container with ID starting with e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0 not found: ID does not exist" containerID="e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.425807 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0"} err="failed to get container status \"e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0\": rpc error: code = NotFound desc = could not find container \"e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0\": container with ID starting with e045a72554f7634695d821cf58561cc822b906dd2cffaa2e8289063c4fb3bcc0 not found: ID does not exist" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.425843 5033 scope.go:117] "RemoveContainer" containerID="1c7184c5ad6e375d438b1e7de7c8d56c4d0a3f4028b9d24b2c70aa3d0e5b4e7c" Feb 26 17:36:24 crc kubenswrapper[5033]: E0226 17:36:24.426280 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c7184c5ad6e375d438b1e7de7c8d56c4d0a3f4028b9d24b2c70aa3d0e5b4e7c\": container with ID starting with 1c7184c5ad6e375d438b1e7de7c8d56c4d0a3f4028b9d24b2c70aa3d0e5b4e7c not found: ID does not exist" containerID="1c7184c5ad6e375d438b1e7de7c8d56c4d0a3f4028b9d24b2c70aa3d0e5b4e7c" Feb 26 17:36:24 crc kubenswrapper[5033]: I0226 17:36:24.426333 5033 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c7184c5ad6e375d438b1e7de7c8d56c4d0a3f4028b9d24b2c70aa3d0e5b4e7c"} err="failed to get container status \"1c7184c5ad6e375d438b1e7de7c8d56c4d0a3f4028b9d24b2c70aa3d0e5b4e7c\": rpc error: code = NotFound desc = could not find container \"1c7184c5ad6e375d438b1e7de7c8d56c4d0a3f4028b9d24b2c70aa3d0e5b4e7c\": container with ID starting with 1c7184c5ad6e375d438b1e7de7c8d56c4d0a3f4028b9d24b2c70aa3d0e5b4e7c not found: ID does not exist" Feb 26 17:36:25 crc kubenswrapper[5033]: I0226 17:36:25.670096 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f01a4c64-7231-4168-bd0c-dfb38b894cff" path="/var/lib/kubelet/pods/f01a4c64-7231-4168-bd0c-dfb38b894cff/volumes" Feb 26 17:36:40 crc kubenswrapper[5033]: I0226 17:36:40.748060 5033 scope.go:117] "RemoveContainer" containerID="70c6a315deab0b265b007900946a556a7bdb91d1dab1bb1b8c4d73a48af3f24b" Feb 26 17:36:50 crc kubenswrapper[5033]: I0226 17:36:50.189284 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:36:50 crc kubenswrapper[5033]: I0226 17:36:50.189784 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:36:50 crc kubenswrapper[5033]: I0226 17:36:50.189829 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 17:36:50 crc kubenswrapper[5033]: I0226 17:36:50.190738 5033 kuberuntime_manager.go:1027] "Message for Container 
of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c869df2310547cca4ee2ef5bde1af9e7b98c69624bcb7a27c989f4e0ced38f4"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 17:36:50 crc kubenswrapper[5033]: I0226 17:36:50.190805 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://5c869df2310547cca4ee2ef5bde1af9e7b98c69624bcb7a27c989f4e0ced38f4" gracePeriod=600 Feb 26 17:36:50 crc kubenswrapper[5033]: I0226 17:36:50.620025 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="5c869df2310547cca4ee2ef5bde1af9e7b98c69624bcb7a27c989f4e0ced38f4" exitCode=0 Feb 26 17:36:50 crc kubenswrapper[5033]: I0226 17:36:50.620110 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"5c869df2310547cca4ee2ef5bde1af9e7b98c69624bcb7a27c989f4e0ced38f4"} Feb 26 17:36:50 crc kubenswrapper[5033]: I0226 17:36:50.620376 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b"} Feb 26 17:36:50 crc kubenswrapper[5033]: I0226 17:36:50.620399 5033 scope.go:117] "RemoveContainer" containerID="14586cf277c9886082cb3c5a21d9a4f1f492f7855664812631e9ac34724048fb" Feb 26 17:37:22 crc kubenswrapper[5033]: I0226 17:37:22.061136 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-ffj5s"] Feb 26 17:37:22 crc 
kubenswrapper[5033]: I0226 17:37:22.073875 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-ffj5s"] Feb 26 17:37:23 crc kubenswrapper[5033]: I0226 17:37:23.044454 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-8dc9-account-create-update-vhg8q"] Feb 26 17:37:23 crc kubenswrapper[5033]: I0226 17:37:23.062310 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-8dc9-account-create-update-vhg8q"] Feb 26 17:37:23 crc kubenswrapper[5033]: I0226 17:37:23.642088 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3976cdda-4518-4d6a-b33e-13b88d1d74fc" path="/var/lib/kubelet/pods/3976cdda-4518-4d6a-b33e-13b88d1d74fc/volumes" Feb 26 17:37:23 crc kubenswrapper[5033]: I0226 17:37:23.642723 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c609b584-a871-4af1-bdc9-123c4b497ed9" path="/var/lib/kubelet/pods/c609b584-a871-4af1-bdc9-123c4b497ed9/volumes" Feb 26 17:37:34 crc kubenswrapper[5033]: I0226 17:37:34.076272 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-wzprc"] Feb 26 17:37:34 crc kubenswrapper[5033]: I0226 17:37:34.090156 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-wzprc"] Feb 26 17:37:35 crc kubenswrapper[5033]: I0226 17:37:35.643301 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce55d9fc-6d50-4c54-8a55-3eb6e238b177" path="/var/lib/kubelet/pods/ce55d9fc-6d50-4c54-8a55-3eb6e238b177/volumes" Feb 26 17:37:40 crc kubenswrapper[5033]: I0226 17:37:40.841239 5033 scope.go:117] "RemoveContainer" containerID="00ffcabc969d84b462a2994235efea576bd93b546d346ba94bd98a9b3fea0a8e" Feb 26 17:37:40 crc kubenswrapper[5033]: I0226 17:37:40.881068 5033 scope.go:117] "RemoveContainer" containerID="5ce1fed72de0d02a065df1ec3e6a489ec80c006575a2cec1d674238007203885" Feb 26 17:37:40 crc kubenswrapper[5033]: I0226 17:37:40.947315 5033 scope.go:117] "RemoveContainer" 
containerID="566c866c992097aa590ec283ad79dd20bb6644916bd792ce18bc5b2c232d2ec4" Feb 26 17:37:57 crc kubenswrapper[5033]: I0226 17:37:57.041165 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-wv4qk"] Feb 26 17:37:57 crc kubenswrapper[5033]: I0226 17:37:57.053586 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-6808-account-create-update-69xc5"] Feb 26 17:37:57 crc kubenswrapper[5033]: I0226 17:37:57.061433 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-wv4qk"] Feb 26 17:37:57 crc kubenswrapper[5033]: I0226 17:37:57.069518 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-6808-account-create-update-69xc5"] Feb 26 17:37:57 crc kubenswrapper[5033]: I0226 17:37:57.652005 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02b7ac8c-763d-4ee5-b3d8-3c07504dd494" path="/var/lib/kubelet/pods/02b7ac8c-763d-4ee5-b3d8-3c07504dd494/volumes" Feb 26 17:37:57 crc kubenswrapper[5033]: I0226 17:37:57.653028 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7976ad2e-ed8e-4f4d-9832-655824619d62" path="/var/lib/kubelet/pods/7976ad2e-ed8e-4f4d-9832-655824619d62/volumes" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.156035 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535458-m65tm"] Feb 26 17:38:00 crc kubenswrapper[5033]: E0226 17:38:00.157074 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01a4c64-7231-4168-bd0c-dfb38b894cff" containerName="registry-server" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.157086 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01a4c64-7231-4168-bd0c-dfb38b894cff" containerName="registry-server" Feb 26 17:38:00 crc kubenswrapper[5033]: E0226 17:38:00.157101 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01a4c64-7231-4168-bd0c-dfb38b894cff" 
containerName="extract-content" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.157106 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01a4c64-7231-4168-bd0c-dfb38b894cff" containerName="extract-content" Feb 26 17:38:00 crc kubenswrapper[5033]: E0226 17:38:00.157122 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01a4c64-7231-4168-bd0c-dfb38b894cff" containerName="extract-utilities" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.157128 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01a4c64-7231-4168-bd0c-dfb38b894cff" containerName="extract-utilities" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.157361 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01a4c64-7231-4168-bd0c-dfb38b894cff" containerName="registry-server" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.158383 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535458-m65tm" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.160913 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.161112 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.161407 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.166904 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsn9v\" (UniqueName: \"kubernetes.io/projected/4272dacd-15d6-4e3b-80ea-b66dcbe48f2f-kube-api-access-fsn9v\") pod \"auto-csr-approver-29535458-m65tm\" (UID: \"4272dacd-15d6-4e3b-80ea-b66dcbe48f2f\") " pod="openshift-infra/auto-csr-approver-29535458-m65tm" Feb 26 
17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.168054 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535458-m65tm"] Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.270439 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsn9v\" (UniqueName: \"kubernetes.io/projected/4272dacd-15d6-4e3b-80ea-b66dcbe48f2f-kube-api-access-fsn9v\") pod \"auto-csr-approver-29535458-m65tm\" (UID: \"4272dacd-15d6-4e3b-80ea-b66dcbe48f2f\") " pod="openshift-infra/auto-csr-approver-29535458-m65tm" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.290984 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsn9v\" (UniqueName: \"kubernetes.io/projected/4272dacd-15d6-4e3b-80ea-b66dcbe48f2f-kube-api-access-fsn9v\") pod \"auto-csr-approver-29535458-m65tm\" (UID: \"4272dacd-15d6-4e3b-80ea-b66dcbe48f2f\") " pod="openshift-infra/auto-csr-approver-29535458-m65tm" Feb 26 17:38:00 crc kubenswrapper[5033]: I0226 17:38:00.481717 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535458-m65tm" Feb 26 17:38:01 crc kubenswrapper[5033]: I0226 17:38:01.018210 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 17:38:01 crc kubenswrapper[5033]: I0226 17:38:01.023780 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535458-m65tm"] Feb 26 17:38:01 crc kubenswrapper[5033]: I0226 17:38:01.439764 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535458-m65tm" event={"ID":"4272dacd-15d6-4e3b-80ea-b66dcbe48f2f","Type":"ContainerStarted","Data":"5ba485deaa8e1caaf7b4643c346e1293c7271a8997aeed3805397d4395221d61"} Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.270923 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9hdq7"] Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.274631 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.303013 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9hdq7"] Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.315806 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-catalog-content\") pod \"certified-operators-9hdq7\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.316018 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-utilities\") pod \"certified-operators-9hdq7\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.316199 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw7c5\" (UniqueName: \"kubernetes.io/projected/a1fb3703-4006-451b-9823-04df659a6d2e-kube-api-access-lw7c5\") pod \"certified-operators-9hdq7\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.419002 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-catalog-content\") pod \"certified-operators-9hdq7\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.419089 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-utilities\") pod \"certified-operators-9hdq7\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.419162 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw7c5\" (UniqueName: \"kubernetes.io/projected/a1fb3703-4006-451b-9823-04df659a6d2e-kube-api-access-lw7c5\") pod \"certified-operators-9hdq7\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.420112 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-catalog-content\") pod \"certified-operators-9hdq7\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.420425 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-utilities\") pod \"certified-operators-9hdq7\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.440582 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw7c5\" (UniqueName: \"kubernetes.io/projected/a1fb3703-4006-451b-9823-04df659a6d2e-kube-api-access-lw7c5\") pod \"certified-operators-9hdq7\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:02 crc kubenswrapper[5033]: I0226 17:38:02.612823 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:03 crc kubenswrapper[5033]: I0226 17:38:03.121031 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9hdq7"] Feb 26 17:38:03 crc kubenswrapper[5033]: W0226 17:38:03.122209 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1fb3703_4006_451b_9823_04df659a6d2e.slice/crio-66c94d092a8ad9185d171b7ede2170675cab3555d5bd19d9cee2a8b28ddfd00c WatchSource:0}: Error finding container 66c94d092a8ad9185d171b7ede2170675cab3555d5bd19d9cee2a8b28ddfd00c: Status 404 returned error can't find the container with id 66c94d092a8ad9185d171b7ede2170675cab3555d5bd19d9cee2a8b28ddfd00c Feb 26 17:38:03 crc kubenswrapper[5033]: I0226 17:38:03.487617 5033 generic.go:334] "Generic (PLEG): container finished" podID="a1fb3703-4006-451b-9823-04df659a6d2e" containerID="6710d7b801942ec77996ae53c7e4a8054540ef6ca1a03f109331cb11fe9d911a" exitCode=0 Feb 26 17:38:03 crc kubenswrapper[5033]: I0226 17:38:03.487823 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hdq7" event={"ID":"a1fb3703-4006-451b-9823-04df659a6d2e","Type":"ContainerDied","Data":"6710d7b801942ec77996ae53c7e4a8054540ef6ca1a03f109331cb11fe9d911a"} Feb 26 17:38:03 crc kubenswrapper[5033]: I0226 17:38:03.494571 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hdq7" event={"ID":"a1fb3703-4006-451b-9823-04df659a6d2e","Type":"ContainerStarted","Data":"66c94d092a8ad9185d171b7ede2170675cab3555d5bd19d9cee2a8b28ddfd00c"} Feb 26 17:38:03 crc kubenswrapper[5033]: I0226 17:38:03.497597 5033 generic.go:334] "Generic (PLEG): container finished" podID="4272dacd-15d6-4e3b-80ea-b66dcbe48f2f" containerID="811adef759b03aa89440e2449dfbceb88e0d57f239df90e8256e046de203cc9d" exitCode=0 Feb 26 17:38:03 crc kubenswrapper[5033]: I0226 
17:38:03.497705 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535458-m65tm" event={"ID":"4272dacd-15d6-4e3b-80ea-b66dcbe48f2f","Type":"ContainerDied","Data":"811adef759b03aa89440e2449dfbceb88e0d57f239df90e8256e046de203cc9d"} Feb 26 17:38:04 crc kubenswrapper[5033]: I0226 17:38:04.510488 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hdq7" event={"ID":"a1fb3703-4006-451b-9823-04df659a6d2e","Type":"ContainerStarted","Data":"747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0"} Feb 26 17:38:04 crc kubenswrapper[5033]: I0226 17:38:04.924327 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535458-m65tm" Feb 26 17:38:05 crc kubenswrapper[5033]: I0226 17:38:05.082540 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsn9v\" (UniqueName: \"kubernetes.io/projected/4272dacd-15d6-4e3b-80ea-b66dcbe48f2f-kube-api-access-fsn9v\") pod \"4272dacd-15d6-4e3b-80ea-b66dcbe48f2f\" (UID: \"4272dacd-15d6-4e3b-80ea-b66dcbe48f2f\") " Feb 26 17:38:05 crc kubenswrapper[5033]: I0226 17:38:05.089229 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4272dacd-15d6-4e3b-80ea-b66dcbe48f2f-kube-api-access-fsn9v" (OuterVolumeSpecName: "kube-api-access-fsn9v") pod "4272dacd-15d6-4e3b-80ea-b66dcbe48f2f" (UID: "4272dacd-15d6-4e3b-80ea-b66dcbe48f2f"). InnerVolumeSpecName "kube-api-access-fsn9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:38:05 crc kubenswrapper[5033]: I0226 17:38:05.185013 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsn9v\" (UniqueName: \"kubernetes.io/projected/4272dacd-15d6-4e3b-80ea-b66dcbe48f2f-kube-api-access-fsn9v\") on node \"crc\" DevicePath \"\"" Feb 26 17:38:05 crc kubenswrapper[5033]: I0226 17:38:05.547107 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535458-m65tm" Feb 26 17:38:05 crc kubenswrapper[5033]: I0226 17:38:05.547301 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535458-m65tm" event={"ID":"4272dacd-15d6-4e3b-80ea-b66dcbe48f2f","Type":"ContainerDied","Data":"5ba485deaa8e1caaf7b4643c346e1293c7271a8997aeed3805397d4395221d61"} Feb 26 17:38:05 crc kubenswrapper[5033]: I0226 17:38:05.548544 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ba485deaa8e1caaf7b4643c346e1293c7271a8997aeed3805397d4395221d61" Feb 26 17:38:06 crc kubenswrapper[5033]: I0226 17:38:06.035337 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535452-g5x6d"] Feb 26 17:38:06 crc kubenswrapper[5033]: I0226 17:38:06.045243 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535452-g5x6d"] Feb 26 17:38:07 crc kubenswrapper[5033]: I0226 17:38:07.575522 5033 generic.go:334] "Generic (PLEG): container finished" podID="a1fb3703-4006-451b-9823-04df659a6d2e" containerID="747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0" exitCode=0 Feb 26 17:38:07 crc kubenswrapper[5033]: I0226 17:38:07.575614 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hdq7" 
event={"ID":"a1fb3703-4006-451b-9823-04df659a6d2e","Type":"ContainerDied","Data":"747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0"} Feb 26 17:38:07 crc kubenswrapper[5033]: I0226 17:38:07.643521 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3228144b-cd97-4d8f-af54-67b0dfe1cfbe" path="/var/lib/kubelet/pods/3228144b-cd97-4d8f-af54-67b0dfe1cfbe/volumes" Feb 26 17:38:08 crc kubenswrapper[5033]: I0226 17:38:08.042096 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-wcsql"] Feb 26 17:38:08 crc kubenswrapper[5033]: I0226 17:38:08.054113 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-wcsql"] Feb 26 17:38:09 crc kubenswrapper[5033]: I0226 17:38:09.597223 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hdq7" event={"ID":"a1fb3703-4006-451b-9823-04df659a6d2e","Type":"ContainerStarted","Data":"997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888"} Feb 26 17:38:09 crc kubenswrapper[5033]: I0226 17:38:09.624693 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9hdq7" podStartSLOduration=2.584577004 podStartE2EDuration="7.624676791s" podCreationTimestamp="2026-02-26 17:38:02 +0000 UTC" firstStartedPulling="2026-02-26 17:38:03.489994768 +0000 UTC m=+7010.702593428" lastFinishedPulling="2026-02-26 17:38:08.530094565 +0000 UTC m=+7015.742693215" observedRunningTime="2026-02-26 17:38:09.61891057 +0000 UTC m=+7016.831509220" watchObservedRunningTime="2026-02-26 17:38:09.624676791 +0000 UTC m=+7016.837275441" Feb 26 17:38:09 crc kubenswrapper[5033]: I0226 17:38:09.641448 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9" path="/var/lib/kubelet/pods/9556b055-cd2e-4dd2-a4c7-fedb56d1f3b9/volumes" Feb 26 17:38:12 crc kubenswrapper[5033]: I0226 17:38:12.613382 5033 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:12 crc kubenswrapper[5033]: I0226 17:38:12.614009 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:13 crc kubenswrapper[5033]: I0226 17:38:13.675733 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9hdq7" podUID="a1fb3703-4006-451b-9823-04df659a6d2e" containerName="registry-server" probeResult="failure" output=< Feb 26 17:38:13 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Feb 26 17:38:13 crc kubenswrapper[5033]: > Feb 26 17:38:22 crc kubenswrapper[5033]: I0226 17:38:22.664016 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:22 crc kubenswrapper[5033]: I0226 17:38:22.730721 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:22 crc kubenswrapper[5033]: I0226 17:38:22.908153 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9hdq7"] Feb 26 17:38:23 crc kubenswrapper[5033]: I0226 17:38:23.751987 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9hdq7" podUID="a1fb3703-4006-451b-9823-04df659a6d2e" containerName="registry-server" containerID="cri-o://997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888" gracePeriod=2 Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.617157 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.681202 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-utilities\") pod \"a1fb3703-4006-451b-9823-04df659a6d2e\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.681448 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-catalog-content\") pod \"a1fb3703-4006-451b-9823-04df659a6d2e\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.681532 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw7c5\" (UniqueName: \"kubernetes.io/projected/a1fb3703-4006-451b-9823-04df659a6d2e-kube-api-access-lw7c5\") pod \"a1fb3703-4006-451b-9823-04df659a6d2e\" (UID: \"a1fb3703-4006-451b-9823-04df659a6d2e\") " Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.682391 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-utilities" (OuterVolumeSpecName: "utilities") pod "a1fb3703-4006-451b-9823-04df659a6d2e" (UID: "a1fb3703-4006-451b-9823-04df659a6d2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.690545 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1fb3703-4006-451b-9823-04df659a6d2e-kube-api-access-lw7c5" (OuterVolumeSpecName: "kube-api-access-lw7c5") pod "a1fb3703-4006-451b-9823-04df659a6d2e" (UID: "a1fb3703-4006-451b-9823-04df659a6d2e"). InnerVolumeSpecName "kube-api-access-lw7c5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.741790 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1fb3703-4006-451b-9823-04df659a6d2e" (UID: "a1fb3703-4006-451b-9823-04df659a6d2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.763290 5033 generic.go:334] "Generic (PLEG): container finished" podID="a1fb3703-4006-451b-9823-04df659a6d2e" containerID="997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888" exitCode=0 Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.763343 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hdq7" event={"ID":"a1fb3703-4006-451b-9823-04df659a6d2e","Type":"ContainerDied","Data":"997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888"} Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.763381 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hdq7" event={"ID":"a1fb3703-4006-451b-9823-04df659a6d2e","Type":"ContainerDied","Data":"66c94d092a8ad9185d171b7ede2170675cab3555d5bd19d9cee2a8b28ddfd00c"} Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.763401 5033 scope.go:117] "RemoveContainer" containerID="997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.763398 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9hdq7" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.783209 5033 scope.go:117] "RemoveContainer" containerID="747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.784167 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.784197 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw7c5\" (UniqueName: \"kubernetes.io/projected/a1fb3703-4006-451b-9823-04df659a6d2e-kube-api-access-lw7c5\") on node \"crc\" DevicePath \"\"" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.784212 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fb3703-4006-451b-9823-04df659a6d2e-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.810042 5033 scope.go:117] "RemoveContainer" containerID="6710d7b801942ec77996ae53c7e4a8054540ef6ca1a03f109331cb11fe9d911a" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.815361 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9hdq7"] Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.828638 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9hdq7"] Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.855918 5033 scope.go:117] "RemoveContainer" containerID="997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888" Feb 26 17:38:24 crc kubenswrapper[5033]: E0226 17:38:24.856539 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888\": container with ID starting with 997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888 not found: ID does not exist" containerID="997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.856607 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888"} err="failed to get container status \"997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888\": rpc error: code = NotFound desc = could not find container \"997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888\": container with ID starting with 997bda1d22f60ccd881fb7eddd6a47925da72244b3e36b3015fb1975a8677888 not found: ID does not exist" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.856655 5033 scope.go:117] "RemoveContainer" containerID="747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0" Feb 26 17:38:24 crc kubenswrapper[5033]: E0226 17:38:24.857163 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0\": container with ID starting with 747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0 not found: ID does not exist" containerID="747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.857195 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0"} err="failed to get container status \"747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0\": rpc error: code = NotFound desc = could not find container \"747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0\": container with ID 
starting with 747e377711e1a5037ff48550090e4c288cc60f03a7065f5e26a91325acbe56f0 not found: ID does not exist" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.857215 5033 scope.go:117] "RemoveContainer" containerID="6710d7b801942ec77996ae53c7e4a8054540ef6ca1a03f109331cb11fe9d911a" Feb 26 17:38:24 crc kubenswrapper[5033]: E0226 17:38:24.857538 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6710d7b801942ec77996ae53c7e4a8054540ef6ca1a03f109331cb11fe9d911a\": container with ID starting with 6710d7b801942ec77996ae53c7e4a8054540ef6ca1a03f109331cb11fe9d911a not found: ID does not exist" containerID="6710d7b801942ec77996ae53c7e4a8054540ef6ca1a03f109331cb11fe9d911a" Feb 26 17:38:24 crc kubenswrapper[5033]: I0226 17:38:24.857584 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6710d7b801942ec77996ae53c7e4a8054540ef6ca1a03f109331cb11fe9d911a"} err="failed to get container status \"6710d7b801942ec77996ae53c7e4a8054540ef6ca1a03f109331cb11fe9d911a\": rpc error: code = NotFound desc = could not find container \"6710d7b801942ec77996ae53c7e4a8054540ef6ca1a03f109331cb11fe9d911a\": container with ID starting with 6710d7b801942ec77996ae53c7e4a8054540ef6ca1a03f109331cb11fe9d911a not found: ID does not exist" Feb 26 17:38:25 crc kubenswrapper[5033]: I0226 17:38:25.650659 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1fb3703-4006-451b-9823-04df659a6d2e" path="/var/lib/kubelet/pods/a1fb3703-4006-451b-9823-04df659a6d2e/volumes" Feb 26 17:38:41 crc kubenswrapper[5033]: I0226 17:38:41.076705 5033 scope.go:117] "RemoveContainer" containerID="1876c936a0e912dfcc133e7918d64b41ca1b00c2b046134731ff4ab5aa10dd7b" Feb 26 17:38:41 crc kubenswrapper[5033]: I0226 17:38:41.135078 5033 scope.go:117] "RemoveContainer" containerID="605988be20a0afc88fc23ff0d6799751e057fa0b49deb246cee8c771e6623fb9" Feb 26 17:38:41 crc kubenswrapper[5033]: 
I0226 17:38:41.184035 5033 scope.go:117] "RemoveContainer" containerID="d232dc164fc76e2d0352f7976508e1dffd5dc08072585ad7b8651331f247354e" Feb 26 17:38:41 crc kubenswrapper[5033]: I0226 17:38:41.274127 5033 scope.go:117] "RemoveContainer" containerID="9163fdac3e58ff8672675378c62b3c326c527eaa5ae675bfa3beea7908506825" Feb 26 17:38:50 crc kubenswrapper[5033]: I0226 17:38:50.188543 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:38:50 crc kubenswrapper[5033]: I0226 17:38:50.189140 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:39:20 crc kubenswrapper[5033]: I0226 17:39:20.191185 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:39:20 crc kubenswrapper[5033]: I0226 17:39:20.191859 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:39:50 crc kubenswrapper[5033]: I0226 17:39:50.188908 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:39:50 crc kubenswrapper[5033]: I0226 17:39:50.189416 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:39:50 crc kubenswrapper[5033]: I0226 17:39:50.189477 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 17:39:50 crc kubenswrapper[5033]: I0226 17:39:50.190282 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 17:39:50 crc kubenswrapper[5033]: I0226 17:39:50.190329 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" gracePeriod=600 Feb 26 17:39:50 crc kubenswrapper[5033]: E0226 17:39:50.319892 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:39:50 crc kubenswrapper[5033]: I0226 17:39:50.683922 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" exitCode=0 Feb 26 17:39:50 crc kubenswrapper[5033]: I0226 17:39:50.683973 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b"} Feb 26 17:39:50 crc kubenswrapper[5033]: I0226 17:39:50.684017 5033 scope.go:117] "RemoveContainer" containerID="5c869df2310547cca4ee2ef5bde1af9e7b98c69624bcb7a27c989f4e0ced38f4" Feb 26 17:39:50 crc kubenswrapper[5033]: I0226 17:39:50.684938 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:39:50 crc kubenswrapper[5033]: E0226 17:39:50.685338 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.155331 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535460-jwx9c"] Feb 26 17:40:00 crc kubenswrapper[5033]: E0226 17:40:00.156727 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1fb3703-4006-451b-9823-04df659a6d2e" containerName="extract-utilities" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.156746 
5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1fb3703-4006-451b-9823-04df659a6d2e" containerName="extract-utilities" Feb 26 17:40:00 crc kubenswrapper[5033]: E0226 17:40:00.156778 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1fb3703-4006-451b-9823-04df659a6d2e" containerName="extract-content" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.156786 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1fb3703-4006-451b-9823-04df659a6d2e" containerName="extract-content" Feb 26 17:40:00 crc kubenswrapper[5033]: E0226 17:40:00.156804 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1fb3703-4006-451b-9823-04df659a6d2e" containerName="registry-server" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.156814 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1fb3703-4006-451b-9823-04df659a6d2e" containerName="registry-server" Feb 26 17:40:00 crc kubenswrapper[5033]: E0226 17:40:00.156839 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4272dacd-15d6-4e3b-80ea-b66dcbe48f2f" containerName="oc" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.156847 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4272dacd-15d6-4e3b-80ea-b66dcbe48f2f" containerName="oc" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.157108 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1fb3703-4006-451b-9823-04df659a6d2e" containerName="registry-server" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.157150 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4272dacd-15d6-4e3b-80ea-b66dcbe48f2f" containerName="oc" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.158220 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535460-jwx9c" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.160898 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.161437 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.161719 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.171972 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535460-jwx9c"] Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.198011 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2blw\" (UniqueName: \"kubernetes.io/projected/f1738236-8698-4059-9d96-1c010907fdd7-kube-api-access-v2blw\") pod \"auto-csr-approver-29535460-jwx9c\" (UID: \"f1738236-8698-4059-9d96-1c010907fdd7\") " pod="openshift-infra/auto-csr-approver-29535460-jwx9c" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.300406 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2blw\" (UniqueName: \"kubernetes.io/projected/f1738236-8698-4059-9d96-1c010907fdd7-kube-api-access-v2blw\") pod \"auto-csr-approver-29535460-jwx9c\" (UID: \"f1738236-8698-4059-9d96-1c010907fdd7\") " pod="openshift-infra/auto-csr-approver-29535460-jwx9c" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.342130 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2blw\" (UniqueName: \"kubernetes.io/projected/f1738236-8698-4059-9d96-1c010907fdd7-kube-api-access-v2blw\") pod \"auto-csr-approver-29535460-jwx9c\" (UID: \"f1738236-8698-4059-9d96-1c010907fdd7\") " 
pod="openshift-infra/auto-csr-approver-29535460-jwx9c" Feb 26 17:40:00 crc kubenswrapper[5033]: I0226 17:40:00.492566 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535460-jwx9c" Feb 26 17:40:01 crc kubenswrapper[5033]: I0226 17:40:01.024849 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535460-jwx9c"] Feb 26 17:40:01 crc kubenswrapper[5033]: W0226 17:40:01.031131 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1738236_8698_4059_9d96_1c010907fdd7.slice/crio-8b78347d3835a9a5434baf9eef9e318476a7313a0189d13022d4b82b2dc59eb9 WatchSource:0}: Error finding container 8b78347d3835a9a5434baf9eef9e318476a7313a0189d13022d4b82b2dc59eb9: Status 404 returned error can't find the container with id 8b78347d3835a9a5434baf9eef9e318476a7313a0189d13022d4b82b2dc59eb9 Feb 26 17:40:01 crc kubenswrapper[5033]: I0226 17:40:01.628971 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:40:01 crc kubenswrapper[5033]: E0226 17:40:01.629749 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:40:01 crc kubenswrapper[5033]: I0226 17:40:01.816239 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535460-jwx9c" event={"ID":"f1738236-8698-4059-9d96-1c010907fdd7","Type":"ContainerStarted","Data":"8b78347d3835a9a5434baf9eef9e318476a7313a0189d13022d4b82b2dc59eb9"} Feb 26 17:40:03 crc 
kubenswrapper[5033]: I0226 17:40:03.838499 5033 generic.go:334] "Generic (PLEG): container finished" podID="f1738236-8698-4059-9d96-1c010907fdd7" containerID="5a2aa3b91f3d0956001063cdf2811b5388d0d59ace18f241a2b0a0c57d8b7bad" exitCode=0 Feb 26 17:40:03 crc kubenswrapper[5033]: I0226 17:40:03.838719 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535460-jwx9c" event={"ID":"f1738236-8698-4059-9d96-1c010907fdd7","Type":"ContainerDied","Data":"5a2aa3b91f3d0956001063cdf2811b5388d0d59ace18f241a2b0a0c57d8b7bad"} Feb 26 17:40:05 crc kubenswrapper[5033]: I0226 17:40:05.277360 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535460-jwx9c" Feb 26 17:40:05 crc kubenswrapper[5033]: I0226 17:40:05.322348 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2blw\" (UniqueName: \"kubernetes.io/projected/f1738236-8698-4059-9d96-1c010907fdd7-kube-api-access-v2blw\") pod \"f1738236-8698-4059-9d96-1c010907fdd7\" (UID: \"f1738236-8698-4059-9d96-1c010907fdd7\") " Feb 26 17:40:05 crc kubenswrapper[5033]: I0226 17:40:05.329776 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1738236-8698-4059-9d96-1c010907fdd7-kube-api-access-v2blw" (OuterVolumeSpecName: "kube-api-access-v2blw") pod "f1738236-8698-4059-9d96-1c010907fdd7" (UID: "f1738236-8698-4059-9d96-1c010907fdd7"). InnerVolumeSpecName "kube-api-access-v2blw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:40:05 crc kubenswrapper[5033]: I0226 17:40:05.426464 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2blw\" (UniqueName: \"kubernetes.io/projected/f1738236-8698-4059-9d96-1c010907fdd7-kube-api-access-v2blw\") on node \"crc\" DevicePath \"\"" Feb 26 17:40:05 crc kubenswrapper[5033]: I0226 17:40:05.873496 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535460-jwx9c" event={"ID":"f1738236-8698-4059-9d96-1c010907fdd7","Type":"ContainerDied","Data":"8b78347d3835a9a5434baf9eef9e318476a7313a0189d13022d4b82b2dc59eb9"} Feb 26 17:40:05 crc kubenswrapper[5033]: I0226 17:40:05.873539 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b78347d3835a9a5434baf9eef9e318476a7313a0189d13022d4b82b2dc59eb9" Feb 26 17:40:05 crc kubenswrapper[5033]: I0226 17:40:05.873584 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535460-jwx9c" Feb 26 17:40:06 crc kubenswrapper[5033]: I0226 17:40:06.345389 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535454-cv4fn"] Feb 26 17:40:06 crc kubenswrapper[5033]: I0226 17:40:06.354082 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535454-cv4fn"] Feb 26 17:40:07 crc kubenswrapper[5033]: I0226 17:40:07.641423 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853" path="/var/lib/kubelet/pods/ec94c6c7-bdb0-4b2c-998f-38bfe1d2c853/volumes" Feb 26 17:40:15 crc kubenswrapper[5033]: I0226 17:40:15.629375 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:40:15 crc kubenswrapper[5033]: E0226 17:40:15.630923 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:40:27 crc kubenswrapper[5033]: I0226 17:40:27.629096 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:40:27 crc kubenswrapper[5033]: E0226 17:40:27.630240 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:40:40 crc kubenswrapper[5033]: I0226 17:40:40.628354 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:40:40 crc kubenswrapper[5033]: E0226 17:40:40.629142 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:40:41 crc kubenswrapper[5033]: I0226 17:40:41.501791 5033 scope.go:117] "RemoveContainer" containerID="3d82343c61dffb25d61a7636e31eed4a397e1b53880881aff0c03e5950a24796" Feb 26 17:40:51 crc kubenswrapper[5033]: I0226 17:40:51.628557 5033 scope.go:117] "RemoveContainer" 
containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:40:51 crc kubenswrapper[5033]: E0226 17:40:51.629828 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:41:04 crc kubenswrapper[5033]: I0226 17:41:04.492385 5033 generic.go:334] "Generic (PLEG): container finished" podID="0418ee62-c3ed-4b01-9801-705d116841ae" containerID="48c4839b84bb0e9e53e9278f0513519ea08df9944a3a2fe9cd3af6e6370c5fb0" exitCode=0 Feb 26 17:41:04 crc kubenswrapper[5033]: I0226 17:41:04.492565 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" event={"ID":"0418ee62-c3ed-4b01-9801-705d116841ae","Type":"ContainerDied","Data":"48c4839b84bb0e9e53e9278f0513519ea08df9944a3a2fe9cd3af6e6370c5fb0"} Feb 26 17:41:05 crc kubenswrapper[5033]: I0226 17:41:05.628517 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:41:05 crc kubenswrapper[5033]: E0226 17:41:05.628887 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.014378 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.060927 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-inventory\") pod \"0418ee62-c3ed-4b01-9801-705d116841ae\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.061067 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ceph\") pod \"0418ee62-c3ed-4b01-9801-705d116841ae\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.061135 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkmkd\" (UniqueName: \"kubernetes.io/projected/0418ee62-c3ed-4b01-9801-705d116841ae-kube-api-access-hkmkd\") pod \"0418ee62-c3ed-4b01-9801-705d116841ae\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.061220 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ssh-key-openstack-cell1\") pod \"0418ee62-c3ed-4b01-9801-705d116841ae\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.062172 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-tripleo-cleanup-combined-ca-bundle\") pod \"0418ee62-c3ed-4b01-9801-705d116841ae\" (UID: \"0418ee62-c3ed-4b01-9801-705d116841ae\") " Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.066519 5033 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ceph" (OuterVolumeSpecName: "ceph") pod "0418ee62-c3ed-4b01-9801-705d116841ae" (UID: "0418ee62-c3ed-4b01-9801-705d116841ae"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.071177 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0418ee62-c3ed-4b01-9801-705d116841ae-kube-api-access-hkmkd" (OuterVolumeSpecName: "kube-api-access-hkmkd") pod "0418ee62-c3ed-4b01-9801-705d116841ae" (UID: "0418ee62-c3ed-4b01-9801-705d116841ae"). InnerVolumeSpecName "kube-api-access-hkmkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.076786 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "0418ee62-c3ed-4b01-9801-705d116841ae" (UID: "0418ee62-c3ed-4b01-9801-705d116841ae"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.094906 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-inventory" (OuterVolumeSpecName: "inventory") pod "0418ee62-c3ed-4b01-9801-705d116841ae" (UID: "0418ee62-c3ed-4b01-9801-705d116841ae"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.105456 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0418ee62-c3ed-4b01-9801-705d116841ae" (UID: "0418ee62-c3ed-4b01-9801-705d116841ae"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.165617 5033 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.165939 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.165972 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.167148 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkmkd\" (UniqueName: \"kubernetes.io/projected/0418ee62-c3ed-4b01-9801-705d116841ae-kube-api-access-hkmkd\") on node \"crc\" DevicePath \"\"" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.167169 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0418ee62-c3ed-4b01-9801-705d116841ae-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.519072 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" event={"ID":"0418ee62-c3ed-4b01-9801-705d116841ae","Type":"ContainerDied","Data":"b496e35075bde4a1c7110038a49a45dba6de67413a29fd5158c0d91d8389d5ec"} Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.519120 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b496e35075bde4a1c7110038a49a45dba6de67413a29fd5158c0d91d8389d5ec" Feb 26 17:41:06 crc kubenswrapper[5033]: I0226 17:41:06.519145 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.689968 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-hqjgv"] Feb 26 17:41:15 crc kubenswrapper[5033]: E0226 17:41:15.691309 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0418ee62-c3ed-4b01-9801-705d116841ae" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.691332 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0418ee62-c3ed-4b01-9801-705d116841ae" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 26 17:41:15 crc kubenswrapper[5033]: E0226 17:41:15.691374 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1738236-8698-4059-9d96-1c010907fdd7" containerName="oc" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.691383 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1738236-8698-4059-9d96-1c010907fdd7" containerName="oc" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.691649 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0418ee62-c3ed-4b01-9801-705d116841ae" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.691671 5033 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f1738236-8698-4059-9d96-1c010907fdd7" containerName="oc" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.692977 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.700798 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-hqjgv"] Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.717222 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.717292 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.717321 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.717834 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.786717 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.786789 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ceph\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.786893 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6nmk\" (UniqueName: \"kubernetes.io/projected/cdd70c36-4e9c-4608-816c-7fd46b747577-kube-api-access-b6nmk\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.787043 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-inventory\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.787077 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.889615 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.889829 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ceph\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.890901 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6nmk\" (UniqueName: \"kubernetes.io/projected/cdd70c36-4e9c-4608-816c-7fd46b747577-kube-api-access-b6nmk\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.891204 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-inventory\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.891330 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.896189 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.897062 5033 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ceph\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.898143 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-inventory\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.898300 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:15 crc kubenswrapper[5033]: I0226 17:41:15.908829 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6nmk\" (UniqueName: \"kubernetes.io/projected/cdd70c36-4e9c-4608-816c-7fd46b747577-kube-api-access-b6nmk\") pod \"bootstrap-openstack-openstack-cell1-hqjgv\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") " pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:16 crc kubenswrapper[5033]: I0226 17:41:16.051223 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" Feb 26 17:41:16 crc kubenswrapper[5033]: I0226 17:41:16.629373 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:41:16 crc kubenswrapper[5033]: I0226 17:41:16.631225 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-hqjgv"] Feb 26 17:41:16 crc kubenswrapper[5033]: E0226 17:41:16.631584 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:41:16 crc kubenswrapper[5033]: I0226 17:41:16.652104 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" event={"ID":"cdd70c36-4e9c-4608-816c-7fd46b747577","Type":"ContainerStarted","Data":"1bb3bc3c75741d4e57641eda61351788402ad887c383ad57d9924099cc755968"} Feb 26 17:41:17 crc kubenswrapper[5033]: I0226 17:41:17.672021 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" event={"ID":"cdd70c36-4e9c-4608-816c-7fd46b747577","Type":"ContainerStarted","Data":"07988d3ba0998f4971c92699e61bf052f8642ebf3973213d2219d08ae3c415f4"} Feb 26 17:41:30 crc kubenswrapper[5033]: I0226 17:41:30.628671 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:41:30 crc kubenswrapper[5033]: E0226 17:41:30.629522 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:41:42 crc kubenswrapper[5033]: I0226 17:41:42.628849 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:41:42 crc kubenswrapper[5033]: E0226 17:41:42.630059 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.628583 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:41:54 crc kubenswrapper[5033]: E0226 17:41:54.629400 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.707312 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" podStartSLOduration=39.25814931 podStartE2EDuration="39.707293309s" podCreationTimestamp="2026-02-26 17:41:15 +0000 UTC" firstStartedPulling="2026-02-26 17:41:16.635431829 +0000 UTC m=+7203.848030479" 
lastFinishedPulling="2026-02-26 17:41:17.084575828 +0000 UTC m=+7204.297174478" observedRunningTime="2026-02-26 17:41:17.700402647 +0000 UTC m=+7204.913001317" watchObservedRunningTime="2026-02-26 17:41:54.707293309 +0000 UTC m=+7241.919891959" Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.712162 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pscms"] Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.714247 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.726371 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pscms"] Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.862998 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ac064e0-a007-41dd-9783-f7be886e19a0-utilities\") pod \"redhat-operators-pscms\" (UID: \"4ac064e0-a007-41dd-9783-f7be886e19a0\") " pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.863625 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k47k\" (UniqueName: \"kubernetes.io/projected/4ac064e0-a007-41dd-9783-f7be886e19a0-kube-api-access-8k47k\") pod \"redhat-operators-pscms\" (UID: \"4ac064e0-a007-41dd-9783-f7be886e19a0\") " pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.863693 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ac064e0-a007-41dd-9783-f7be886e19a0-catalog-content\") pod \"redhat-operators-pscms\" (UID: \"4ac064e0-a007-41dd-9783-f7be886e19a0\") " 
pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.966014 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k47k\" (UniqueName: \"kubernetes.io/projected/4ac064e0-a007-41dd-9783-f7be886e19a0-kube-api-access-8k47k\") pod \"redhat-operators-pscms\" (UID: \"4ac064e0-a007-41dd-9783-f7be886e19a0\") " pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.966140 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ac064e0-a007-41dd-9783-f7be886e19a0-catalog-content\") pod \"redhat-operators-pscms\" (UID: \"4ac064e0-a007-41dd-9783-f7be886e19a0\") " pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.966386 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ac064e0-a007-41dd-9783-f7be886e19a0-utilities\") pod \"redhat-operators-pscms\" (UID: \"4ac064e0-a007-41dd-9783-f7be886e19a0\") " pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.966860 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ac064e0-a007-41dd-9783-f7be886e19a0-utilities\") pod \"redhat-operators-pscms\" (UID: \"4ac064e0-a007-41dd-9783-f7be886e19a0\") " pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:41:54 crc kubenswrapper[5033]: I0226 17:41:54.966863 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ac064e0-a007-41dd-9783-f7be886e19a0-catalog-content\") pod \"redhat-operators-pscms\" (UID: \"4ac064e0-a007-41dd-9783-f7be886e19a0\") " pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:41:54 crc 
kubenswrapper[5033]: I0226 17:41:54.991159 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k47k\" (UniqueName: \"kubernetes.io/projected/4ac064e0-a007-41dd-9783-f7be886e19a0-kube-api-access-8k47k\") pod \"redhat-operators-pscms\" (UID: \"4ac064e0-a007-41dd-9783-f7be886e19a0\") " pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:41:55 crc kubenswrapper[5033]: I0226 17:41:55.051210 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:41:55 crc kubenswrapper[5033]: I0226 17:41:55.595626 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pscms"] Feb 26 17:41:56 crc kubenswrapper[5033]: I0226 17:41:56.091086 5033 generic.go:334] "Generic (PLEG): container finished" podID="4ac064e0-a007-41dd-9783-f7be886e19a0" containerID="a527839f0c053b6a371c7825e3b3ed52d3cc73e09852787f0e222bedce7577f1" exitCode=0 Feb 26 17:41:56 crc kubenswrapper[5033]: I0226 17:41:56.091131 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pscms" event={"ID":"4ac064e0-a007-41dd-9783-f7be886e19a0","Type":"ContainerDied","Data":"a527839f0c053b6a371c7825e3b3ed52d3cc73e09852787f0e222bedce7577f1"} Feb 26 17:41:56 crc kubenswrapper[5033]: I0226 17:41:56.091182 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pscms" event={"ID":"4ac064e0-a007-41dd-9783-f7be886e19a0","Type":"ContainerStarted","Data":"efeef3fe7370d25867c355e03ecac2613194621d90ede209183ccc2b99ffb030"} Feb 26 17:42:00 crc kubenswrapper[5033]: I0226 17:42:00.175660 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535462-k6jww"] Feb 26 17:42:00 crc kubenswrapper[5033]: I0226 17:42:00.177632 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535462-k6jww" Feb 26 17:42:00 crc kubenswrapper[5033]: I0226 17:42:00.179776 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:42:00 crc kubenswrapper[5033]: I0226 17:42:00.179939 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:42:00 crc kubenswrapper[5033]: I0226 17:42:00.179952 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:42:00 crc kubenswrapper[5033]: I0226 17:42:00.201833 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535462-k6jww"] Feb 26 17:42:00 crc kubenswrapper[5033]: I0226 17:42:00.304975 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74hln\" (UniqueName: \"kubernetes.io/projected/bd60cc72-b742-4acc-ac35-b8aee208b9fb-kube-api-access-74hln\") pod \"auto-csr-approver-29535462-k6jww\" (UID: \"bd60cc72-b742-4acc-ac35-b8aee208b9fb\") " pod="openshift-infra/auto-csr-approver-29535462-k6jww" Feb 26 17:42:00 crc kubenswrapper[5033]: I0226 17:42:00.407432 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74hln\" (UniqueName: \"kubernetes.io/projected/bd60cc72-b742-4acc-ac35-b8aee208b9fb-kube-api-access-74hln\") pod \"auto-csr-approver-29535462-k6jww\" (UID: \"bd60cc72-b742-4acc-ac35-b8aee208b9fb\") " pod="openshift-infra/auto-csr-approver-29535462-k6jww" Feb 26 17:42:00 crc kubenswrapper[5033]: I0226 17:42:00.425658 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74hln\" (UniqueName: \"kubernetes.io/projected/bd60cc72-b742-4acc-ac35-b8aee208b9fb-kube-api-access-74hln\") pod \"auto-csr-approver-29535462-k6jww\" (UID: \"bd60cc72-b742-4acc-ac35-b8aee208b9fb\") " 
pod="openshift-infra/auto-csr-approver-29535462-k6jww" Feb 26 17:42:00 crc kubenswrapper[5033]: I0226 17:42:00.497696 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535462-k6jww" Feb 26 17:42:00 crc kubenswrapper[5033]: I0226 17:42:00.997229 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535462-k6jww"] Feb 26 17:42:01 crc kubenswrapper[5033]: I0226 17:42:01.150939 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535462-k6jww" event={"ID":"bd60cc72-b742-4acc-ac35-b8aee208b9fb","Type":"ContainerStarted","Data":"897f97952d35a9381952a846ba545f0fd0e572a8e63542f95808201e556e1a0f"} Feb 26 17:42:07 crc kubenswrapper[5033]: I0226 17:42:07.628953 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:42:07 crc kubenswrapper[5033]: E0226 17:42:07.629773 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:42:10 crc kubenswrapper[5033]: I0226 17:42:10.252832 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535462-k6jww" event={"ID":"bd60cc72-b742-4acc-ac35-b8aee208b9fb","Type":"ContainerStarted","Data":"a77786ff81d9c85ac1f76dbc6b708498a28501206de1d52ebe35a1dbf1fef858"} Feb 26 17:42:10 crc kubenswrapper[5033]: I0226 17:42:10.256492 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pscms" 
event={"ID":"4ac064e0-a007-41dd-9783-f7be886e19a0","Type":"ContainerStarted","Data":"4d3d31faf7f3c5adeb5a96b6d8c4b81125216b3dda45083c461f0735766933b8"} Feb 26 17:42:11 crc kubenswrapper[5033]: I0226 17:42:11.283549 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535462-k6jww" podStartSLOduration=2.406654496 podStartE2EDuration="11.283528872s" podCreationTimestamp="2026-02-26 17:42:00 +0000 UTC" firstStartedPulling="2026-02-26 17:42:01.006737921 +0000 UTC m=+7248.219336581" lastFinishedPulling="2026-02-26 17:42:09.883612307 +0000 UTC m=+7257.096210957" observedRunningTime="2026-02-26 17:42:11.276490161 +0000 UTC m=+7258.489088821" watchObservedRunningTime="2026-02-26 17:42:11.283528872 +0000 UTC m=+7258.496127522" Feb 26 17:42:13 crc kubenswrapper[5033]: I0226 17:42:13.284299 5033 generic.go:334] "Generic (PLEG): container finished" podID="4ac064e0-a007-41dd-9783-f7be886e19a0" containerID="4d3d31faf7f3c5adeb5a96b6d8c4b81125216b3dda45083c461f0735766933b8" exitCode=0 Feb 26 17:42:13 crc kubenswrapper[5033]: I0226 17:42:13.284369 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pscms" event={"ID":"4ac064e0-a007-41dd-9783-f7be886e19a0","Type":"ContainerDied","Data":"4d3d31faf7f3c5adeb5a96b6d8c4b81125216b3dda45083c461f0735766933b8"} Feb 26 17:42:15 crc kubenswrapper[5033]: I0226 17:42:15.308959 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pscms" event={"ID":"4ac064e0-a007-41dd-9783-f7be886e19a0","Type":"ContainerStarted","Data":"2686061555ff1c3855ee88633b11845d59820abbd274b5dea50b2796cc2613a8"} Feb 26 17:42:15 crc kubenswrapper[5033]: I0226 17:42:15.355034 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pscms" podStartSLOduration=2.520019107 podStartE2EDuration="21.355012813s" podCreationTimestamp="2026-02-26 17:41:54 +0000 UTC" 
firstStartedPulling="2026-02-26 17:41:56.094220227 +0000 UTC m=+7243.306818887" lastFinishedPulling="2026-02-26 17:42:14.929213943 +0000 UTC m=+7262.141812593" observedRunningTime="2026-02-26 17:42:15.342629891 +0000 UTC m=+7262.555228581" watchObservedRunningTime="2026-02-26 17:42:15.355012813 +0000 UTC m=+7262.567611463" Feb 26 17:42:15 crc kubenswrapper[5033]: E0226 17:42:15.524738 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd60cc72_b742_4acc_ac35_b8aee208b9fb.slice/crio-a77786ff81d9c85ac1f76dbc6b708498a28501206de1d52ebe35a1dbf1fef858.scope\": RecentStats: unable to find data in memory cache]" Feb 26 17:42:16 crc kubenswrapper[5033]: I0226 17:42:16.320339 5033 generic.go:334] "Generic (PLEG): container finished" podID="bd60cc72-b742-4acc-ac35-b8aee208b9fb" containerID="a77786ff81d9c85ac1f76dbc6b708498a28501206de1d52ebe35a1dbf1fef858" exitCode=0 Feb 26 17:42:16 crc kubenswrapper[5033]: I0226 17:42:16.320387 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535462-k6jww" event={"ID":"bd60cc72-b742-4acc-ac35-b8aee208b9fb","Type":"ContainerDied","Data":"a77786ff81d9c85ac1f76dbc6b708498a28501206de1d52ebe35a1dbf1fef858"} Feb 26 17:42:17 crc kubenswrapper[5033]: I0226 17:42:17.739010 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535462-k6jww" Feb 26 17:42:17 crc kubenswrapper[5033]: I0226 17:42:17.910937 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74hln\" (UniqueName: \"kubernetes.io/projected/bd60cc72-b742-4acc-ac35-b8aee208b9fb-kube-api-access-74hln\") pod \"bd60cc72-b742-4acc-ac35-b8aee208b9fb\" (UID: \"bd60cc72-b742-4acc-ac35-b8aee208b9fb\") " Feb 26 17:42:17 crc kubenswrapper[5033]: I0226 17:42:17.917383 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd60cc72-b742-4acc-ac35-b8aee208b9fb-kube-api-access-74hln" (OuterVolumeSpecName: "kube-api-access-74hln") pod "bd60cc72-b742-4acc-ac35-b8aee208b9fb" (UID: "bd60cc72-b742-4acc-ac35-b8aee208b9fb"). InnerVolumeSpecName "kube-api-access-74hln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:42:18 crc kubenswrapper[5033]: I0226 17:42:18.013257 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74hln\" (UniqueName: \"kubernetes.io/projected/bd60cc72-b742-4acc-ac35-b8aee208b9fb-kube-api-access-74hln\") on node \"crc\" DevicePath \"\"" Feb 26 17:42:18 crc kubenswrapper[5033]: I0226 17:42:18.341023 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535462-k6jww" event={"ID":"bd60cc72-b742-4acc-ac35-b8aee208b9fb","Type":"ContainerDied","Data":"897f97952d35a9381952a846ba545f0fd0e572a8e63542f95808201e556e1a0f"} Feb 26 17:42:18 crc kubenswrapper[5033]: I0226 17:42:18.341061 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="897f97952d35a9381952a846ba545f0fd0e572a8e63542f95808201e556e1a0f" Feb 26 17:42:18 crc kubenswrapper[5033]: I0226 17:42:18.341093 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535462-k6jww" Feb 26 17:42:18 crc kubenswrapper[5033]: I0226 17:42:18.442652 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535456-p2rrp"] Feb 26 17:42:18 crc kubenswrapper[5033]: I0226 17:42:18.451667 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535456-p2rrp"] Feb 26 17:42:18 crc kubenswrapper[5033]: I0226 17:42:18.627936 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:42:18 crc kubenswrapper[5033]: E0226 17:42:18.628368 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:42:19 crc kubenswrapper[5033]: I0226 17:42:19.642149 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da172082-e5f5-4c27-bcee-a84478a084b0" path="/var/lib/kubelet/pods/da172082-e5f5-4c27-bcee-a84478a084b0/volumes" Feb 26 17:42:25 crc kubenswrapper[5033]: I0226 17:42:25.052899 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:42:25 crc kubenswrapper[5033]: I0226 17:42:25.053606 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:42:25 crc kubenswrapper[5033]: I0226 17:42:25.213523 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:42:25 crc kubenswrapper[5033]: I0226 17:42:25.490078 5033 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pscms" Feb 26 17:42:25 crc kubenswrapper[5033]: I0226 17:42:25.762790 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pscms"] Feb 26 17:42:25 crc kubenswrapper[5033]: I0226 17:42:25.938084 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-svqb2"] Feb 26 17:42:25 crc kubenswrapper[5033]: I0226 17:42:25.938705 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-svqb2" podUID="1171e541-6384-4918-b83b-1eeff776308f" containerName="registry-server" containerID="cri-o://6c05275a297a57bd37b28881816a31e844d667470d8c5ebd534665d84c1ed373" gracePeriod=2 Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.428202 5033 generic.go:334] "Generic (PLEG): container finished" podID="1171e541-6384-4918-b83b-1eeff776308f" containerID="6c05275a297a57bd37b28881816a31e844d667470d8c5ebd534665d84c1ed373" exitCode=0 Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.428380 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svqb2" event={"ID":"1171e541-6384-4918-b83b-1eeff776308f","Type":"ContainerDied","Data":"6c05275a297a57bd37b28881816a31e844d667470d8c5ebd534665d84c1ed373"} Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.559241 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.731097 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmhn5\" (UniqueName: \"kubernetes.io/projected/1171e541-6384-4918-b83b-1eeff776308f-kube-api-access-nmhn5\") pod \"1171e541-6384-4918-b83b-1eeff776308f\" (UID: \"1171e541-6384-4918-b83b-1eeff776308f\") " Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.731155 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-catalog-content\") pod \"1171e541-6384-4918-b83b-1eeff776308f\" (UID: \"1171e541-6384-4918-b83b-1eeff776308f\") " Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.731228 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-utilities\") pod \"1171e541-6384-4918-b83b-1eeff776308f\" (UID: \"1171e541-6384-4918-b83b-1eeff776308f\") " Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.736055 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-utilities" (OuterVolumeSpecName: "utilities") pod "1171e541-6384-4918-b83b-1eeff776308f" (UID: "1171e541-6384-4918-b83b-1eeff776308f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.743433 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1171e541-6384-4918-b83b-1eeff776308f-kube-api-access-nmhn5" (OuterVolumeSpecName: "kube-api-access-nmhn5") pod "1171e541-6384-4918-b83b-1eeff776308f" (UID: "1171e541-6384-4918-b83b-1eeff776308f"). InnerVolumeSpecName "kube-api-access-nmhn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.835743 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmhn5\" (UniqueName: \"kubernetes.io/projected/1171e541-6384-4918-b83b-1eeff776308f-kube-api-access-nmhn5\") on node \"crc\" DevicePath \"\"" Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.835774 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.928146 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1171e541-6384-4918-b83b-1eeff776308f" (UID: "1171e541-6384-4918-b83b-1eeff776308f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:42:26 crc kubenswrapper[5033]: I0226 17:42:26.938804 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1171e541-6384-4918-b83b-1eeff776308f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:42:27 crc kubenswrapper[5033]: I0226 17:42:27.442856 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-svqb2" Feb 26 17:42:27 crc kubenswrapper[5033]: I0226 17:42:27.442846 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svqb2" event={"ID":"1171e541-6384-4918-b83b-1eeff776308f","Type":"ContainerDied","Data":"71514f8e574bafc481750be954b016bc88fb47b1189a681532c982e18306699a"} Feb 26 17:42:27 crc kubenswrapper[5033]: I0226 17:42:27.443349 5033 scope.go:117] "RemoveContainer" containerID="6c05275a297a57bd37b28881816a31e844d667470d8c5ebd534665d84c1ed373" Feb 26 17:42:27 crc kubenswrapper[5033]: I0226 17:42:27.479854 5033 scope.go:117] "RemoveContainer" containerID="43a08a482fe2817cc2b3bf99b789d9c3fc55e937a654787bc31b29b4d1db6b2e" Feb 26 17:42:27 crc kubenswrapper[5033]: I0226 17:42:27.484199 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-svqb2"] Feb 26 17:42:27 crc kubenswrapper[5033]: I0226 17:42:27.511212 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-svqb2"] Feb 26 17:42:27 crc kubenswrapper[5033]: I0226 17:42:27.518679 5033 scope.go:117] "RemoveContainer" containerID="a93dbb1f3d946e7230d163d231dfe67800e8f9a3815c1f0a62a274754e1af53c" Feb 26 17:42:27 crc kubenswrapper[5033]: I0226 17:42:27.647511 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1171e541-6384-4918-b83b-1eeff776308f" path="/var/lib/kubelet/pods/1171e541-6384-4918-b83b-1eeff776308f/volumes" Feb 26 17:42:33 crc kubenswrapper[5033]: I0226 17:42:33.640949 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:42:33 crc kubenswrapper[5033]: E0226 17:42:33.641834 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:42:41 crc kubenswrapper[5033]: I0226 17:42:41.617894 5033 scope.go:117] "RemoveContainer" containerID="92aa1ac73e90fbc4de46224b24cbbcc92757a6ee196bc76a2c473f6b03335a12" Feb 26 17:42:45 crc kubenswrapper[5033]: I0226 17:42:45.628328 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:42:45 crc kubenswrapper[5033]: E0226 17:42:45.629372 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:42:54 crc kubenswrapper[5033]: I0226 17:42:54.959872 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qwsvl"] Feb 26 17:42:54 crc kubenswrapper[5033]: E0226 17:42:54.961320 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd60cc72-b742-4acc-ac35-b8aee208b9fb" containerName="oc" Feb 26 17:42:54 crc kubenswrapper[5033]: I0226 17:42:54.961342 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd60cc72-b742-4acc-ac35-b8aee208b9fb" containerName="oc" Feb 26 17:42:54 crc kubenswrapper[5033]: E0226 17:42:54.961393 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1171e541-6384-4918-b83b-1eeff776308f" containerName="extract-content" Feb 26 17:42:54 crc kubenswrapper[5033]: I0226 17:42:54.961403 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1171e541-6384-4918-b83b-1eeff776308f" 
containerName="extract-content" Feb 26 17:42:54 crc kubenswrapper[5033]: E0226 17:42:54.961421 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1171e541-6384-4918-b83b-1eeff776308f" containerName="registry-server" Feb 26 17:42:54 crc kubenswrapper[5033]: I0226 17:42:54.961429 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1171e541-6384-4918-b83b-1eeff776308f" containerName="registry-server" Feb 26 17:42:54 crc kubenswrapper[5033]: E0226 17:42:54.961453 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1171e541-6384-4918-b83b-1eeff776308f" containerName="extract-utilities" Feb 26 17:42:54 crc kubenswrapper[5033]: I0226 17:42:54.961469 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1171e541-6384-4918-b83b-1eeff776308f" containerName="extract-utilities" Feb 26 17:42:54 crc kubenswrapper[5033]: I0226 17:42:54.961846 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd60cc72-b742-4acc-ac35-b8aee208b9fb" containerName="oc" Feb 26 17:42:54 crc kubenswrapper[5033]: I0226 17:42:54.961869 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1171e541-6384-4918-b83b-1eeff776308f" containerName="registry-server" Feb 26 17:42:54 crc kubenswrapper[5033]: I0226 17:42:54.963934 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:42:54 crc kubenswrapper[5033]: I0226 17:42:54.977755 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qwsvl"] Feb 26 17:42:55 crc kubenswrapper[5033]: I0226 17:42:55.068325 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-utilities\") pod \"community-operators-qwsvl\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") " pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:42:55 crc kubenswrapper[5033]: I0226 17:42:55.068568 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxp46\" (UniqueName: \"kubernetes.io/projected/de122c60-10d5-421a-b367-4732333789a6-kube-api-access-bxp46\") pod \"community-operators-qwsvl\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") " pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:42:55 crc kubenswrapper[5033]: I0226 17:42:55.068669 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-catalog-content\") pod \"community-operators-qwsvl\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") " pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:42:55 crc kubenswrapper[5033]: I0226 17:42:55.170247 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxp46\" (UniqueName: \"kubernetes.io/projected/de122c60-10d5-421a-b367-4732333789a6-kube-api-access-bxp46\") pod \"community-operators-qwsvl\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") " pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:42:55 crc kubenswrapper[5033]: I0226 17:42:55.170358 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-catalog-content\") pod \"community-operators-qwsvl\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") " pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:42:55 crc kubenswrapper[5033]: I0226 17:42:55.170444 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-utilities\") pod \"community-operators-qwsvl\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") " pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:42:55 crc kubenswrapper[5033]: I0226 17:42:55.170899 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-catalog-content\") pod \"community-operators-qwsvl\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") " pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:42:55 crc kubenswrapper[5033]: I0226 17:42:55.170955 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-utilities\") pod \"community-operators-qwsvl\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") " pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:42:55 crc kubenswrapper[5033]: I0226 17:42:55.193819 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxp46\" (UniqueName: \"kubernetes.io/projected/de122c60-10d5-421a-b367-4732333789a6-kube-api-access-bxp46\") pod \"community-operators-qwsvl\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") " pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:42:55 crc kubenswrapper[5033]: I0226 17:42:55.365092 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:42:56 crc kubenswrapper[5033]: I0226 17:42:56.079572 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qwsvl"] Feb 26 17:42:56 crc kubenswrapper[5033]: W0226 17:42:56.085742 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde122c60_10d5_421a_b367_4732333789a6.slice/crio-e4c7f60f8a62fdc7923ca3e70a5cbc0fbaccbc0787bcace1dde2b7b790a2cbd9 WatchSource:0}: Error finding container e4c7f60f8a62fdc7923ca3e70a5cbc0fbaccbc0787bcace1dde2b7b790a2cbd9: Status 404 returned error can't find the container with id e4c7f60f8a62fdc7923ca3e70a5cbc0fbaccbc0787bcace1dde2b7b790a2cbd9 Feb 26 17:42:56 crc kubenswrapper[5033]: I0226 17:42:56.628559 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:42:56 crc kubenswrapper[5033]: E0226 17:42:56.629089 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:42:56 crc kubenswrapper[5033]: I0226 17:42:56.819009 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwsvl" event={"ID":"de122c60-10d5-421a-b367-4732333789a6","Type":"ContainerStarted","Data":"1edde2a7a9a72b38e5616f5cc00d2d1a3f817ca2807324dd6068ecd17c3e3cae"} Feb 26 17:42:56 crc kubenswrapper[5033]: I0226 17:42:56.819417 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwsvl" 
event={"ID":"de122c60-10d5-421a-b367-4732333789a6","Type":"ContainerStarted","Data":"e4c7f60f8a62fdc7923ca3e70a5cbc0fbaccbc0787bcace1dde2b7b790a2cbd9"} Feb 26 17:42:57 crc kubenswrapper[5033]: I0226 17:42:57.837566 5033 generic.go:334] "Generic (PLEG): container finished" podID="de122c60-10d5-421a-b367-4732333789a6" containerID="1edde2a7a9a72b38e5616f5cc00d2d1a3f817ca2807324dd6068ecd17c3e3cae" exitCode=0 Feb 26 17:42:57 crc kubenswrapper[5033]: I0226 17:42:57.837628 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwsvl" event={"ID":"de122c60-10d5-421a-b367-4732333789a6","Type":"ContainerDied","Data":"1edde2a7a9a72b38e5616f5cc00d2d1a3f817ca2807324dd6068ecd17c3e3cae"} Feb 26 17:42:59 crc kubenswrapper[5033]: I0226 17:42:59.859113 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwsvl" event={"ID":"de122c60-10d5-421a-b367-4732333789a6","Type":"ContainerStarted","Data":"02a27cda7eae09f1901b0228f18072a405e18cd1b5a7608721b4cc2500620ff3"} Feb 26 17:43:00 crc kubenswrapper[5033]: I0226 17:43:00.870330 5033 generic.go:334] "Generic (PLEG): container finished" podID="de122c60-10d5-421a-b367-4732333789a6" containerID="02a27cda7eae09f1901b0228f18072a405e18cd1b5a7608721b4cc2500620ff3" exitCode=0 Feb 26 17:43:00 crc kubenswrapper[5033]: I0226 17:43:00.870393 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwsvl" event={"ID":"de122c60-10d5-421a-b367-4732333789a6","Type":"ContainerDied","Data":"02a27cda7eae09f1901b0228f18072a405e18cd1b5a7608721b4cc2500620ff3"} Feb 26 17:43:01 crc kubenswrapper[5033]: I0226 17:43:01.887090 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwsvl" event={"ID":"de122c60-10d5-421a-b367-4732333789a6","Type":"ContainerStarted","Data":"be931a750483d744871055d1b70378e16958b73898b99eee3831f74303ab04e2"} Feb 26 17:43:01 crc kubenswrapper[5033]: 
I0226 17:43:01.921202 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qwsvl" podStartSLOduration=4.502666844 podStartE2EDuration="7.921180071s" podCreationTimestamp="2026-02-26 17:42:54 +0000 UTC" firstStartedPulling="2026-02-26 17:42:57.84108992 +0000 UTC m=+7305.053688600" lastFinishedPulling="2026-02-26 17:43:01.259603157 +0000 UTC m=+7308.472201827" observedRunningTime="2026-02-26 17:43:01.912163591 +0000 UTC m=+7309.124762251" watchObservedRunningTime="2026-02-26 17:43:01.921180071 +0000 UTC m=+7309.133778741" Feb 26 17:43:05 crc kubenswrapper[5033]: I0226 17:43:05.366097 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:43:05 crc kubenswrapper[5033]: I0226 17:43:05.366831 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:43:05 crc kubenswrapper[5033]: I0226 17:43:05.417020 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:43:08 crc kubenswrapper[5033]: I0226 17:43:08.628572 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 17:43:08 crc kubenswrapper[5033]: E0226 17:43:08.629364 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:43:15 crc kubenswrapper[5033]: I0226 17:43:15.416598 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-qwsvl" Feb 26 17:43:17 crc kubenswrapper[5033]: I0226 17:43:17.591186 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qwsvl"] Feb 26 17:43:17 crc kubenswrapper[5033]: I0226 17:43:17.591987 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qwsvl" podUID="de122c60-10d5-421a-b367-4732333789a6" containerName="registry-server" containerID="cri-o://be931a750483d744871055d1b70378e16958b73898b99eee3831f74303ab04e2" gracePeriod=2 Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.075294 5033 generic.go:334] "Generic (PLEG): container finished" podID="de122c60-10d5-421a-b367-4732333789a6" containerID="be931a750483d744871055d1b70378e16958b73898b99eee3831f74303ab04e2" exitCode=0 Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.075400 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwsvl" event={"ID":"de122c60-10d5-421a-b367-4732333789a6","Type":"ContainerDied","Data":"be931a750483d744871055d1b70378e16958b73898b99eee3831f74303ab04e2"} Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.075731 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qwsvl" event={"ID":"de122c60-10d5-421a-b367-4732333789a6","Type":"ContainerDied","Data":"e4c7f60f8a62fdc7923ca3e70a5cbc0fbaccbc0787bcace1dde2b7b790a2cbd9"} Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.075756 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4c7f60f8a62fdc7923ca3e70a5cbc0fbaccbc0787bcace1dde2b7b790a2cbd9" Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.107172 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qwsvl"
Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.232723 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-utilities\") pod \"de122c60-10d5-421a-b367-4732333789a6\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") "
Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.232956 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxp46\" (UniqueName: \"kubernetes.io/projected/de122c60-10d5-421a-b367-4732333789a6-kube-api-access-bxp46\") pod \"de122c60-10d5-421a-b367-4732333789a6\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") "
Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.233013 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-catalog-content\") pod \"de122c60-10d5-421a-b367-4732333789a6\" (UID: \"de122c60-10d5-421a-b367-4732333789a6\") "
Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.233811 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-utilities" (OuterVolumeSpecName: "utilities") pod "de122c60-10d5-421a-b367-4732333789a6" (UID: "de122c60-10d5-421a-b367-4732333789a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.239121 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de122c60-10d5-421a-b367-4732333789a6-kube-api-access-bxp46" (OuterVolumeSpecName: "kube-api-access-bxp46") pod "de122c60-10d5-421a-b367-4732333789a6" (UID: "de122c60-10d5-421a-b367-4732333789a6"). InnerVolumeSpecName "kube-api-access-bxp46". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.310933 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de122c60-10d5-421a-b367-4732333789a6" (UID: "de122c60-10d5-421a-b367-4732333789a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.335066 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxp46\" (UniqueName: \"kubernetes.io/projected/de122c60-10d5-421a-b367-4732333789a6-kube-api-access-bxp46\") on node \"crc\" DevicePath \"\""
Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.335106 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 17:43:18 crc kubenswrapper[5033]: I0226 17:43:18.335116 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de122c60-10d5-421a-b367-4732333789a6-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 17:43:19 crc kubenswrapper[5033]: I0226 17:43:19.084608 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qwsvl"
Feb 26 17:43:19 crc kubenswrapper[5033]: I0226 17:43:19.129754 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qwsvl"]
Feb 26 17:43:19 crc kubenswrapper[5033]: I0226 17:43:19.139694 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qwsvl"]
Feb 26 17:43:19 crc kubenswrapper[5033]: I0226 17:43:19.642138 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de122c60-10d5-421a-b367-4732333789a6" path="/var/lib/kubelet/pods/de122c60-10d5-421a-b367-4732333789a6/volumes"
Feb 26 17:43:21 crc kubenswrapper[5033]: I0226 17:43:21.628233 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b"
Feb 26 17:43:21 crc kubenswrapper[5033]: E0226 17:43:21.629038 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:43:35 crc kubenswrapper[5033]: I0226 17:43:35.628002 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b"
Feb 26 17:43:35 crc kubenswrapper[5033]: E0226 17:43:35.629614 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:43:46 crc kubenswrapper[5033]: I0226 17:43:46.628420 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b"
Feb 26 17:43:46 crc kubenswrapper[5033]: E0226 17:43:46.629516 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.139127 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535464-vdf2j"]
Feb 26 17:44:00 crc kubenswrapper[5033]: E0226 17:44:00.140228 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de122c60-10d5-421a-b367-4732333789a6" containerName="extract-content"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.140248 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="de122c60-10d5-421a-b367-4732333789a6" containerName="extract-content"
Feb 26 17:44:00 crc kubenswrapper[5033]: E0226 17:44:00.140310 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de122c60-10d5-421a-b367-4732333789a6" containerName="extract-utilities"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.140319 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="de122c60-10d5-421a-b367-4732333789a6" containerName="extract-utilities"
Feb 26 17:44:00 crc kubenswrapper[5033]: E0226 17:44:00.140341 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de122c60-10d5-421a-b367-4732333789a6" containerName="registry-server"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.140351 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="de122c60-10d5-421a-b367-4732333789a6" containerName="registry-server"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.140599 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="de122c60-10d5-421a-b367-4732333789a6" containerName="registry-server"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.141446 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535464-vdf2j"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.144010 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.144326 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.144608 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.154213 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535464-vdf2j"]
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.290184 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpcbk\" (UniqueName: \"kubernetes.io/projected/2a993462-42a4-46ed-98e7-738df18efe2a-kube-api-access-cpcbk\") pod \"auto-csr-approver-29535464-vdf2j\" (UID: \"2a993462-42a4-46ed-98e7-738df18efe2a\") " pod="openshift-infra/auto-csr-approver-29535464-vdf2j"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.392606 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpcbk\" (UniqueName: \"kubernetes.io/projected/2a993462-42a4-46ed-98e7-738df18efe2a-kube-api-access-cpcbk\") pod \"auto-csr-approver-29535464-vdf2j\" (UID: \"2a993462-42a4-46ed-98e7-738df18efe2a\") " pod="openshift-infra/auto-csr-approver-29535464-vdf2j"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.429359 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpcbk\" (UniqueName: \"kubernetes.io/projected/2a993462-42a4-46ed-98e7-738df18efe2a-kube-api-access-cpcbk\") pod \"auto-csr-approver-29535464-vdf2j\" (UID: \"2a993462-42a4-46ed-98e7-738df18efe2a\") " pod="openshift-infra/auto-csr-approver-29535464-vdf2j"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.462103 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535464-vdf2j"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.628455 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b"
Feb 26 17:44:00 crc kubenswrapper[5033]: E0226 17:44:00.629080 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.945386 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535464-vdf2j"]
Feb 26 17:44:00 crc kubenswrapper[5033]: I0226 17:44:00.949530 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 17:44:01 crc kubenswrapper[5033]: I0226 17:44:01.070156 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535464-vdf2j" event={"ID":"2a993462-42a4-46ed-98e7-738df18efe2a","Type":"ContainerStarted","Data":"2be59aff06d6425525f30732544244141efe49b9f7bf70ed3b612f92d8da7dda"}
Feb 26 17:44:03 crc kubenswrapper[5033]: I0226 17:44:03.090368 5033 generic.go:334] "Generic (PLEG): container finished" podID="2a993462-42a4-46ed-98e7-738df18efe2a" containerID="f76f8fac10a0e1be3bde42c40c97ef72ca96b140176bf28063a314a968a2c3df" exitCode=0
Feb 26 17:44:03 crc kubenswrapper[5033]: I0226 17:44:03.090502 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535464-vdf2j" event={"ID":"2a993462-42a4-46ed-98e7-738df18efe2a","Type":"ContainerDied","Data":"f76f8fac10a0e1be3bde42c40c97ef72ca96b140176bf28063a314a968a2c3df"}
Feb 26 17:44:04 crc kubenswrapper[5033]: I0226 17:44:04.522960 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535464-vdf2j"
Feb 26 17:44:04 crc kubenswrapper[5033]: I0226 17:44:04.692444 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpcbk\" (UniqueName: \"kubernetes.io/projected/2a993462-42a4-46ed-98e7-738df18efe2a-kube-api-access-cpcbk\") pod \"2a993462-42a4-46ed-98e7-738df18efe2a\" (UID: \"2a993462-42a4-46ed-98e7-738df18efe2a\") "
Feb 26 17:44:04 crc kubenswrapper[5033]: I0226 17:44:04.701682 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a993462-42a4-46ed-98e7-738df18efe2a-kube-api-access-cpcbk" (OuterVolumeSpecName: "kube-api-access-cpcbk") pod "2a993462-42a4-46ed-98e7-738df18efe2a" (UID: "2a993462-42a4-46ed-98e7-738df18efe2a"). InnerVolumeSpecName "kube-api-access-cpcbk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:44:04 crc kubenswrapper[5033]: I0226 17:44:04.795580 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpcbk\" (UniqueName: \"kubernetes.io/projected/2a993462-42a4-46ed-98e7-738df18efe2a-kube-api-access-cpcbk\") on node \"crc\" DevicePath \"\""
Feb 26 17:44:05 crc kubenswrapper[5033]: I0226 17:44:05.114187 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535464-vdf2j" event={"ID":"2a993462-42a4-46ed-98e7-738df18efe2a","Type":"ContainerDied","Data":"2be59aff06d6425525f30732544244141efe49b9f7bf70ed3b612f92d8da7dda"}
Feb 26 17:44:05 crc kubenswrapper[5033]: I0226 17:44:05.114749 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2be59aff06d6425525f30732544244141efe49b9f7bf70ed3b612f92d8da7dda"
Feb 26 17:44:05 crc kubenswrapper[5033]: I0226 17:44:05.114252 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535464-vdf2j"
Feb 26 17:44:05 crc kubenswrapper[5033]: I0226 17:44:05.620029 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535458-m65tm"]
Feb 26 17:44:05 crc kubenswrapper[5033]: I0226 17:44:05.638979 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535458-m65tm"]
Feb 26 17:44:07 crc kubenswrapper[5033]: I0226 17:44:07.642762 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4272dacd-15d6-4e3b-80ea-b66dcbe48f2f" path="/var/lib/kubelet/pods/4272dacd-15d6-4e3b-80ea-b66dcbe48f2f/volumes"
Feb 26 17:44:15 crc kubenswrapper[5033]: I0226 17:44:15.635581 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b"
Feb 26 17:44:15 crc kubenswrapper[5033]: E0226 17:44:15.636496 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:44:25 crc kubenswrapper[5033]: I0226 17:44:25.356594 5033 generic.go:334] "Generic (PLEG): container finished" podID="cdd70c36-4e9c-4608-816c-7fd46b747577" containerID="07988d3ba0998f4971c92699e61bf052f8642ebf3973213d2219d08ae3c415f4" exitCode=0
Feb 26 17:44:25 crc kubenswrapper[5033]: I0226 17:44:25.356694 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" event={"ID":"cdd70c36-4e9c-4608-816c-7fd46b747577","Type":"ContainerDied","Data":"07988d3ba0998f4971c92699e61bf052f8642ebf3973213d2219d08ae3c415f4"}
Feb 26 17:44:26 crc kubenswrapper[5033]: I0226 17:44:26.862334 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv"
Feb 26 17:44:26 crc kubenswrapper[5033]: I0226 17:44:26.973486 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ssh-key-openstack-cell1\") pod \"cdd70c36-4e9c-4608-816c-7fd46b747577\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") "
Feb 26 17:44:26 crc kubenswrapper[5033]: I0226 17:44:26.973720 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ceph\") pod \"cdd70c36-4e9c-4608-816c-7fd46b747577\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") "
Feb 26 17:44:26 crc kubenswrapper[5033]: I0226 17:44:26.973831 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6nmk\" (UniqueName: \"kubernetes.io/projected/cdd70c36-4e9c-4608-816c-7fd46b747577-kube-api-access-b6nmk\") pod \"cdd70c36-4e9c-4608-816c-7fd46b747577\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") "
Feb 26 17:44:26 crc kubenswrapper[5033]: I0226 17:44:26.973950 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-inventory\") pod \"cdd70c36-4e9c-4608-816c-7fd46b747577\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") "
Feb 26 17:44:26 crc kubenswrapper[5033]: I0226 17:44:26.974242 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-bootstrap-combined-ca-bundle\") pod \"cdd70c36-4e9c-4608-816c-7fd46b747577\" (UID: \"cdd70c36-4e9c-4608-816c-7fd46b747577\") "
Feb 26 17:44:26 crc kubenswrapper[5033]: I0226 17:44:26.984169 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "cdd70c36-4e9c-4608-816c-7fd46b747577" (UID: "cdd70c36-4e9c-4608-816c-7fd46b747577"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:44:26 crc kubenswrapper[5033]: I0226 17:44:26.984199 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdd70c36-4e9c-4608-816c-7fd46b747577-kube-api-access-b6nmk" (OuterVolumeSpecName: "kube-api-access-b6nmk") pod "cdd70c36-4e9c-4608-816c-7fd46b747577" (UID: "cdd70c36-4e9c-4608-816c-7fd46b747577"). InnerVolumeSpecName "kube-api-access-b6nmk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:44:26 crc kubenswrapper[5033]: I0226 17:44:26.984266 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ceph" (OuterVolumeSpecName: "ceph") pod "cdd70c36-4e9c-4608-816c-7fd46b747577" (UID: "cdd70c36-4e9c-4608-816c-7fd46b747577"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.005060 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cdd70c36-4e9c-4608-816c-7fd46b747577" (UID: "cdd70c36-4e9c-4608-816c-7fd46b747577"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.008435 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-inventory" (OuterVolumeSpecName: "inventory") pod "cdd70c36-4e9c-4608-816c-7fd46b747577" (UID: "cdd70c36-4e9c-4608-816c-7fd46b747577"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.077100 5033 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.077133 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.077148 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-ceph\") on node \"crc\" DevicePath \"\""
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.077159 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6nmk\" (UniqueName: \"kubernetes.io/projected/cdd70c36-4e9c-4608-816c-7fd46b747577-kube-api-access-b6nmk\") on node \"crc\" DevicePath \"\""
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.077169 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdd70c36-4e9c-4608-816c-7fd46b747577-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.382609 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv" event={"ID":"cdd70c36-4e9c-4608-816c-7fd46b747577","Type":"ContainerDied","Data":"1bb3bc3c75741d4e57641eda61351788402ad887c383ad57d9924099cc755968"}
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.382675 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bb3bc3c75741d4e57641eda61351788402ad887c383ad57d9924099cc755968"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.382754 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-hqjgv"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.491168 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-xndgp"]
Feb 26 17:44:27 crc kubenswrapper[5033]: E0226 17:44:27.491838 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd70c36-4e9c-4608-816c-7fd46b747577" containerName="bootstrap-openstack-openstack-cell1"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.491872 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd70c36-4e9c-4608-816c-7fd46b747577" containerName="bootstrap-openstack-openstack-cell1"
Feb 26 17:44:27 crc kubenswrapper[5033]: E0226 17:44:27.491898 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a993462-42a4-46ed-98e7-738df18efe2a" containerName="oc"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.491907 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a993462-42a4-46ed-98e7-738df18efe2a" containerName="oc"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.492203 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a993462-42a4-46ed-98e7-738df18efe2a" containerName="oc"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.492228 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd70c36-4e9c-4608-816c-7fd46b747577" containerName="bootstrap-openstack-openstack-cell1"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.493331 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.501003 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.503187 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.503382 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.503386 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.505692 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-xndgp"]
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.587818 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.587918 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ceph\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.588001 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-inventory\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.588522 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4r2x\" (UniqueName: \"kubernetes.io/projected/64d8cd2d-5dbe-45e7-b844-339aa50577b6-kube-api-access-p4r2x\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.691122 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.691310 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ceph\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.691464 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-inventory\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.691643 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4r2x\" (UniqueName: \"kubernetes.io/projected/64d8cd2d-5dbe-45e7-b844-339aa50577b6-kube-api-access-p4r2x\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.694519 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.695614 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-inventory\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.699197 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ceph\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.713835 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4r2x\" (UniqueName: \"kubernetes.io/projected/64d8cd2d-5dbe-45e7-b844-339aa50577b6-kube-api-access-p4r2x\") pod \"download-cache-openstack-openstack-cell1-xndgp\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:27 crc kubenswrapper[5033]: I0226 17:44:27.830109 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-xndgp"
Feb 26 17:44:28 crc kubenswrapper[5033]: I0226 17:44:28.354089 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-xndgp"]
Feb 26 17:44:28 crc kubenswrapper[5033]: I0226 17:44:28.392585 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-xndgp" event={"ID":"64d8cd2d-5dbe-45e7-b844-339aa50577b6","Type":"ContainerStarted","Data":"b4dba22dad8a596533fdaa1606294ff7f6be7d158f3136bf243abdfb297ec25e"}
Feb 26 17:44:29 crc kubenswrapper[5033]: I0226 17:44:29.628647 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b"
Feb 26 17:44:29 crc kubenswrapper[5033]: E0226 17:44:29.629867 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:44:31 crc kubenswrapper[5033]: I0226 17:44:31.423630 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-xndgp" event={"ID":"64d8cd2d-5dbe-45e7-b844-339aa50577b6","Type":"ContainerStarted","Data":"b91272d74a0b94d77c2e07a3d5763080469824c744ae777fa8a406602ed77002"}
Feb 26 17:44:31 crc kubenswrapper[5033]: I0226 17:44:31.453576 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-xndgp" podStartSLOduration=1.838233752 podStartE2EDuration="4.453547357s" podCreationTimestamp="2026-02-26 17:44:27 +0000 UTC" firstStartedPulling="2026-02-26 17:44:28.362443874 +0000 UTC m=+7395.575042524" lastFinishedPulling="2026-02-26 17:44:30.977757439 +0000 UTC m=+7398.190356129" observedRunningTime="2026-02-26 17:44:31.44095993 +0000 UTC m=+7398.653558600" watchObservedRunningTime="2026-02-26 17:44:31.453547357 +0000 UTC m=+7398.666146037"
Feb 26 17:44:41 crc kubenswrapper[5033]: I0226 17:44:41.781698 5033 scope.go:117] "RemoveContainer" containerID="811adef759b03aa89440e2449dfbceb88e0d57f239df90e8256e046de203cc9d"
Feb 26 17:44:42 crc kubenswrapper[5033]: I0226 17:44:42.627733 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b"
Feb 26 17:44:42 crc kubenswrapper[5033]: E0226 17:44:42.628117 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:44:53 crc kubenswrapper[5033]: I0226 17:44:53.637102 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b"
Feb 26 17:44:54 crc kubenswrapper[5033]: I0226 17:44:54.730837 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"81832189e9ae9c1cde20ad800ba615a86bf39dd0d605e4f89a7bd4c6db781306"}
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.150663 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"]
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.153025 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.160053 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.160986 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.164834 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"]
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.180458 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-secret-volume\") pod \"collect-profiles-29535465-kxmgz\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.180661 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-config-volume\") pod \"collect-profiles-29535465-kxmgz\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.180951 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28gtm\" (UniqueName: \"kubernetes.io/projected/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-kube-api-access-28gtm\") pod \"collect-profiles-29535465-kxmgz\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.282920 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-secret-volume\") pod \"collect-profiles-29535465-kxmgz\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.283018 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-config-volume\") pod \"collect-profiles-29535465-kxmgz\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.283128 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28gtm\" (UniqueName: \"kubernetes.io/projected/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-kube-api-access-28gtm\") pod \"collect-profiles-29535465-kxmgz\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.284335 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-config-volume\") pod \"collect-profiles-29535465-kxmgz\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.289889 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-secret-volume\") pod \"collect-profiles-29535465-kxmgz\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.302339 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28gtm\" (UniqueName: \"kubernetes.io/projected/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-kube-api-access-28gtm\") pod \"collect-profiles-29535465-kxmgz\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.493200 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"
Feb 26 17:45:00 crc kubenswrapper[5033]: I0226 17:45:00.987333 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"]
Feb 26 17:45:01 crc kubenswrapper[5033]: I0226 17:45:01.819331 5033 generic.go:334] "Generic (PLEG): container finished" podID="fc187ed7-3e79-423b-82e8-74d5c0bbfaf7" containerID="df9d1653d33e93662ec55fe9c6cbdebf148ca2fd8a3c96da4a75f84683f45b3c" exitCode=0
Feb 26 17:45:01 crc kubenswrapper[5033]: I0226 17:45:01.819442 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz" event={"ID":"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7","Type":"ContainerDied","Data":"df9d1653d33e93662ec55fe9c6cbdebf148ca2fd8a3c96da4a75f84683f45b3c"}
Feb 26 17:45:01 crc kubenswrapper[5033]: I0226 17:45:01.819602 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz" event={"ID":"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7","Type":"ContainerStarted","Data":"b3f184b47b7830a1a3316a238be1f2b5c15f3fcbc2107906182369b2fb229214"}
Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.311810 5033 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz" Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.355277 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-config-volume\") pod \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.355626 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-secret-volume\") pod \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.355739 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28gtm\" (UniqueName: \"kubernetes.io/projected/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-kube-api-access-28gtm\") pod \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\" (UID: \"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7\") " Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.356365 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-config-volume" (OuterVolumeSpecName: "config-volume") pod "fc187ed7-3e79-423b-82e8-74d5c0bbfaf7" (UID: "fc187ed7-3e79-423b-82e8-74d5c0bbfaf7"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.356618 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.361608 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fc187ed7-3e79-423b-82e8-74d5c0bbfaf7" (UID: "fc187ed7-3e79-423b-82e8-74d5c0bbfaf7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.361795 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-kube-api-access-28gtm" (OuterVolumeSpecName: "kube-api-access-28gtm") pod "fc187ed7-3e79-423b-82e8-74d5c0bbfaf7" (UID: "fc187ed7-3e79-423b-82e8-74d5c0bbfaf7"). InnerVolumeSpecName "kube-api-access-28gtm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.459094 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.459154 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28gtm\" (UniqueName: \"kubernetes.io/projected/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7-kube-api-access-28gtm\") on node \"crc\" DevicePath \"\"" Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.842953 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz" event={"ID":"fc187ed7-3e79-423b-82e8-74d5c0bbfaf7","Type":"ContainerDied","Data":"b3f184b47b7830a1a3316a238be1f2b5c15f3fcbc2107906182369b2fb229214"} Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.842993 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3f184b47b7830a1a3316a238be1f2b5c15f3fcbc2107906182369b2fb229214" Feb 26 17:45:03 crc kubenswrapper[5033]: I0226 17:45:03.843046 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz" Feb 26 17:45:04 crc kubenswrapper[5033]: I0226 17:45:04.404606 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l"] Feb 26 17:45:04 crc kubenswrapper[5033]: I0226 17:45:04.419352 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535420-t8z4l"] Feb 26 17:45:05 crc kubenswrapper[5033]: I0226 17:45:05.644027 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a5ef96b-1bb4-48cd-a0ce-5e4eac167285" path="/var/lib/kubelet/pods/2a5ef96b-1bb4-48cd-a0ce-5e4eac167285/volumes" Feb 26 17:45:41 crc kubenswrapper[5033]: I0226 17:45:41.907843 5033 scope.go:117] "RemoveContainer" containerID="cc67a8a1fb5c45dc1c3cde41214cc762eb2ea7c8e008b4ebfa09fa376fdde52f" Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.157627 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535466-vnn7l"] Feb 26 17:46:00 crc kubenswrapper[5033]: E0226 17:46:00.158617 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc187ed7-3e79-423b-82e8-74d5c0bbfaf7" containerName="collect-profiles" Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.158632 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc187ed7-3e79-423b-82e8-74d5c0bbfaf7" containerName="collect-profiles" Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.158932 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc187ed7-3e79-423b-82e8-74d5c0bbfaf7" containerName="collect-profiles" Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.159863 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535466-vnn7l" Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.162905 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.162981 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.167698 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.177800 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535466-vnn7l"] Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.327534 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tx2b\" (UniqueName: \"kubernetes.io/projected/5c424a5b-ca02-4917-9a46-de6ced0a6afe-kube-api-access-9tx2b\") pod \"auto-csr-approver-29535466-vnn7l\" (UID: \"5c424a5b-ca02-4917-9a46-de6ced0a6afe\") " pod="openshift-infra/auto-csr-approver-29535466-vnn7l" Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.429589 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tx2b\" (UniqueName: \"kubernetes.io/projected/5c424a5b-ca02-4917-9a46-de6ced0a6afe-kube-api-access-9tx2b\") pod \"auto-csr-approver-29535466-vnn7l\" (UID: \"5c424a5b-ca02-4917-9a46-de6ced0a6afe\") " pod="openshift-infra/auto-csr-approver-29535466-vnn7l" Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.464694 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tx2b\" (UniqueName: \"kubernetes.io/projected/5c424a5b-ca02-4917-9a46-de6ced0a6afe-kube-api-access-9tx2b\") pod \"auto-csr-approver-29535466-vnn7l\" (UID: \"5c424a5b-ca02-4917-9a46-de6ced0a6afe\") " 
pod="openshift-infra/auto-csr-approver-29535466-vnn7l" Feb 26 17:46:00 crc kubenswrapper[5033]: I0226 17:46:00.480308 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535466-vnn7l" Feb 26 17:46:01 crc kubenswrapper[5033]: I0226 17:46:01.013392 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535466-vnn7l"] Feb 26 17:46:01 crc kubenswrapper[5033]: I0226 17:46:01.565637 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535466-vnn7l" event={"ID":"5c424a5b-ca02-4917-9a46-de6ced0a6afe","Type":"ContainerStarted","Data":"81ecf17226f12659a1f80a137ec21f13b03a93db72c64bd9574e0256ea86a5eb"} Feb 26 17:46:02 crc kubenswrapper[5033]: I0226 17:46:02.578144 5033 generic.go:334] "Generic (PLEG): container finished" podID="5c424a5b-ca02-4917-9a46-de6ced0a6afe" containerID="eebf15aea05bf459369299d0e91bd6015a03374fc2cc6e5f33c05b871705252c" exitCode=0 Feb 26 17:46:02 crc kubenswrapper[5033]: I0226 17:46:02.578228 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535466-vnn7l" event={"ID":"5c424a5b-ca02-4917-9a46-de6ced0a6afe","Type":"ContainerDied","Data":"eebf15aea05bf459369299d0e91bd6015a03374fc2cc6e5f33c05b871705252c"} Feb 26 17:46:04 crc kubenswrapper[5033]: I0226 17:46:04.007143 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535466-vnn7l" Feb 26 17:46:04 crc kubenswrapper[5033]: I0226 17:46:04.124237 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tx2b\" (UniqueName: \"kubernetes.io/projected/5c424a5b-ca02-4917-9a46-de6ced0a6afe-kube-api-access-9tx2b\") pod \"5c424a5b-ca02-4917-9a46-de6ced0a6afe\" (UID: \"5c424a5b-ca02-4917-9a46-de6ced0a6afe\") " Feb 26 17:46:04 crc kubenswrapper[5033]: I0226 17:46:04.135249 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c424a5b-ca02-4917-9a46-de6ced0a6afe-kube-api-access-9tx2b" (OuterVolumeSpecName: "kube-api-access-9tx2b") pod "5c424a5b-ca02-4917-9a46-de6ced0a6afe" (UID: "5c424a5b-ca02-4917-9a46-de6ced0a6afe"). InnerVolumeSpecName "kube-api-access-9tx2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:46:04 crc kubenswrapper[5033]: I0226 17:46:04.226868 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tx2b\" (UniqueName: \"kubernetes.io/projected/5c424a5b-ca02-4917-9a46-de6ced0a6afe-kube-api-access-9tx2b\") on node \"crc\" DevicePath \"\"" Feb 26 17:46:04 crc kubenswrapper[5033]: I0226 17:46:04.599100 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535466-vnn7l" event={"ID":"5c424a5b-ca02-4917-9a46-de6ced0a6afe","Type":"ContainerDied","Data":"81ecf17226f12659a1f80a137ec21f13b03a93db72c64bd9574e0256ea86a5eb"} Feb 26 17:46:04 crc kubenswrapper[5033]: I0226 17:46:04.599146 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81ecf17226f12659a1f80a137ec21f13b03a93db72c64bd9574e0256ea86a5eb" Feb 26 17:46:04 crc kubenswrapper[5033]: I0226 17:46:04.599220 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535466-vnn7l" Feb 26 17:46:05 crc kubenswrapper[5033]: I0226 17:46:05.115807 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535460-jwx9c"] Feb 26 17:46:05 crc kubenswrapper[5033]: I0226 17:46:05.134554 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535460-jwx9c"] Feb 26 17:46:05 crc kubenswrapper[5033]: I0226 17:46:05.645708 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1738236-8698-4059-9d96-1c010907fdd7" path="/var/lib/kubelet/pods/f1738236-8698-4059-9d96-1c010907fdd7/volumes" Feb 26 17:46:14 crc kubenswrapper[5033]: I0226 17:46:14.278679 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 26 17:46:19 crc kubenswrapper[5033]: I0226 17:46:19.721679 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-58qbb"] Feb 26 17:46:19 crc kubenswrapper[5033]: E0226 17:46:19.722827 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c424a5b-ca02-4917-9a46-de6ced0a6afe" containerName="oc" Feb 26 17:46:19 crc kubenswrapper[5033]: I0226 17:46:19.722843 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c424a5b-ca02-4917-9a46-de6ced0a6afe" containerName="oc" Feb 26 17:46:19 crc kubenswrapper[5033]: I0226 17:46:19.723054 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c424a5b-ca02-4917-9a46-de6ced0a6afe" containerName="oc" Feb 26 17:46:19 crc kubenswrapper[5033]: I0226 17:46:19.724717 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:19 crc kubenswrapper[5033]: I0226 17:46:19.740097 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-58qbb"] Feb 26 17:46:19 crc kubenswrapper[5033]: I0226 17:46:19.906866 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29hnw\" (UniqueName: \"kubernetes.io/projected/396a3d54-6356-4056-aaa8-e011d8b51b28-kube-api-access-29hnw\") pod \"redhat-marketplace-58qbb\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:19 crc kubenswrapper[5033]: I0226 17:46:19.907216 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-catalog-content\") pod \"redhat-marketplace-58qbb\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:19 crc kubenswrapper[5033]: I0226 17:46:19.907685 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-utilities\") pod \"redhat-marketplace-58qbb\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:20 crc kubenswrapper[5033]: I0226 17:46:20.009634 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-utilities\") pod \"redhat-marketplace-58qbb\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:20 crc kubenswrapper[5033]: I0226 17:46:20.009705 5033 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-29hnw\" (UniqueName: \"kubernetes.io/projected/396a3d54-6356-4056-aaa8-e011d8b51b28-kube-api-access-29hnw\") pod \"redhat-marketplace-58qbb\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:20 crc kubenswrapper[5033]: I0226 17:46:20.009795 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-catalog-content\") pod \"redhat-marketplace-58qbb\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:20 crc kubenswrapper[5033]: I0226 17:46:20.010295 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-utilities\") pod \"redhat-marketplace-58qbb\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:20 crc kubenswrapper[5033]: I0226 17:46:20.010439 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-catalog-content\") pod \"redhat-marketplace-58qbb\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:20 crc kubenswrapper[5033]: I0226 17:46:20.057167 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29hnw\" (UniqueName: \"kubernetes.io/projected/396a3d54-6356-4056-aaa8-e011d8b51b28-kube-api-access-29hnw\") pod \"redhat-marketplace-58qbb\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:20 crc kubenswrapper[5033]: I0226 17:46:20.353703 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:20 crc kubenswrapper[5033]: I0226 17:46:20.845376 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-58qbb"] Feb 26 17:46:21 crc kubenswrapper[5033]: I0226 17:46:21.809614 5033 generic.go:334] "Generic (PLEG): container finished" podID="396a3d54-6356-4056-aaa8-e011d8b51b28" containerID="d6fb00be7c80882f2beb35d62e883314e1bdd04f8aa1968b61cf43de3590b5d4" exitCode=0 Feb 26 17:46:21 crc kubenswrapper[5033]: I0226 17:46:21.809669 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58qbb" event={"ID":"396a3d54-6356-4056-aaa8-e011d8b51b28","Type":"ContainerDied","Data":"d6fb00be7c80882f2beb35d62e883314e1bdd04f8aa1968b61cf43de3590b5d4"} Feb 26 17:46:21 crc kubenswrapper[5033]: I0226 17:46:21.809701 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58qbb" event={"ID":"396a3d54-6356-4056-aaa8-e011d8b51b28","Type":"ContainerStarted","Data":"c31bac8407c943b5f51d1c60f4a99b46d466501b60c7c40ae2bdccb44b90619d"} Feb 26 17:46:24 crc kubenswrapper[5033]: I0226 17:46:24.851837 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58qbb" event={"ID":"396a3d54-6356-4056-aaa8-e011d8b51b28","Type":"ContainerStarted","Data":"f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829"} Feb 26 17:46:25 crc kubenswrapper[5033]: I0226 17:46:25.864466 5033 generic.go:334] "Generic (PLEG): container finished" podID="396a3d54-6356-4056-aaa8-e011d8b51b28" containerID="f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829" exitCode=0 Feb 26 17:46:25 crc kubenswrapper[5033]: I0226 17:46:25.864879 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58qbb" 
event={"ID":"396a3d54-6356-4056-aaa8-e011d8b51b28","Type":"ContainerDied","Data":"f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829"} Feb 26 17:46:26 crc kubenswrapper[5033]: I0226 17:46:26.876780 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58qbb" event={"ID":"396a3d54-6356-4056-aaa8-e011d8b51b28","Type":"ContainerStarted","Data":"11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310"} Feb 26 17:46:26 crc kubenswrapper[5033]: I0226 17:46:26.900224 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-58qbb" podStartSLOduration=3.329609913 podStartE2EDuration="7.900204149s" podCreationTimestamp="2026-02-26 17:46:19 +0000 UTC" firstStartedPulling="2026-02-26 17:46:21.821529918 +0000 UTC m=+7509.034128568" lastFinishedPulling="2026-02-26 17:46:26.392124154 +0000 UTC m=+7513.604722804" observedRunningTime="2026-02-26 17:46:26.893797622 +0000 UTC m=+7514.106396302" watchObservedRunningTime="2026-02-26 17:46:26.900204149 +0000 UTC m=+7514.112802799" Feb 26 17:46:30 crc kubenswrapper[5033]: I0226 17:46:30.354024 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:30 crc kubenswrapper[5033]: I0226 17:46:30.355038 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:30 crc kubenswrapper[5033]: I0226 17:46:30.418297 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:40 crc kubenswrapper[5033]: I0226 17:46:40.479965 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:40 crc kubenswrapper[5033]: I0226 17:46:40.545659 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-58qbb"] Feb 26 17:46:41 crc kubenswrapper[5033]: I0226 17:46:41.059516 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-58qbb" podUID="396a3d54-6356-4056-aaa8-e011d8b51b28" containerName="registry-server" containerID="cri-o://11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310" gracePeriod=2 Feb 26 17:46:41 crc kubenswrapper[5033]: I0226 17:46:41.698727 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:41 crc kubenswrapper[5033]: I0226 17:46:41.755362 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-catalog-content\") pod \"396a3d54-6356-4056-aaa8-e011d8b51b28\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " Feb 26 17:46:41 crc kubenswrapper[5033]: I0226 17:46:41.759951 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29hnw\" (UniqueName: \"kubernetes.io/projected/396a3d54-6356-4056-aaa8-e011d8b51b28-kube-api-access-29hnw\") pod \"396a3d54-6356-4056-aaa8-e011d8b51b28\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " Feb 26 17:46:41 crc kubenswrapper[5033]: I0226 17:46:41.760017 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-utilities\") pod \"396a3d54-6356-4056-aaa8-e011d8b51b28\" (UID: \"396a3d54-6356-4056-aaa8-e011d8b51b28\") " Feb 26 17:46:41 crc kubenswrapper[5033]: I0226 17:46:41.762184 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-utilities" (OuterVolumeSpecName: "utilities") pod "396a3d54-6356-4056-aaa8-e011d8b51b28" (UID: 
"396a3d54-6356-4056-aaa8-e011d8b51b28"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:46:41 crc kubenswrapper[5033]: I0226 17:46:41.772049 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396a3d54-6356-4056-aaa8-e011d8b51b28-kube-api-access-29hnw" (OuterVolumeSpecName: "kube-api-access-29hnw") pod "396a3d54-6356-4056-aaa8-e011d8b51b28" (UID: "396a3d54-6356-4056-aaa8-e011d8b51b28"). InnerVolumeSpecName "kube-api-access-29hnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:46:41 crc kubenswrapper[5033]: I0226 17:46:41.796480 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "396a3d54-6356-4056-aaa8-e011d8b51b28" (UID: "396a3d54-6356-4056-aaa8-e011d8b51b28"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:46:41 crc kubenswrapper[5033]: I0226 17:46:41.874045 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:46:41 crc kubenswrapper[5033]: I0226 17:46:41.875832 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29hnw\" (UniqueName: \"kubernetes.io/projected/396a3d54-6356-4056-aaa8-e011d8b51b28-kube-api-access-29hnw\") on node \"crc\" DevicePath \"\"" Feb 26 17:46:41 crc kubenswrapper[5033]: I0226 17:46:41.875877 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396a3d54-6356-4056-aaa8-e011d8b51b28-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.026147 5033 scope.go:117] "RemoveContainer" 
containerID="5a2aa3b91f3d0956001063cdf2811b5388d0d59ace18f241a2b0a0c57d8b7bad" Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.076505 5033 generic.go:334] "Generic (PLEG): container finished" podID="396a3d54-6356-4056-aaa8-e011d8b51b28" containerID="11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310" exitCode=0 Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.076593 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58qbb" event={"ID":"396a3d54-6356-4056-aaa8-e011d8b51b28","Type":"ContainerDied","Data":"11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310"} Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.076718 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58qbb" Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.076996 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58qbb" event={"ID":"396a3d54-6356-4056-aaa8-e011d8b51b28","Type":"ContainerDied","Data":"c31bac8407c943b5f51d1c60f4a99b46d466501b60c7c40ae2bdccb44b90619d"} Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.077309 5033 scope.go:117] "RemoveContainer" containerID="11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310" Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.130622 5033 scope.go:117] "RemoveContainer" containerID="f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829" Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.134996 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-58qbb"] Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.144335 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-58qbb"] Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.147227 5033 scope.go:117] "RemoveContainer" 
containerID="d6fb00be7c80882f2beb35d62e883314e1bdd04f8aa1968b61cf43de3590b5d4" Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.165604 5033 scope.go:117] "RemoveContainer" containerID="11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310" Feb 26 17:46:42 crc kubenswrapper[5033]: E0226 17:46:42.166233 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310\": container with ID starting with 11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310 not found: ID does not exist" containerID="11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310" Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.166306 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310"} err="failed to get container status \"11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310\": rpc error: code = NotFound desc = could not find container \"11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310\": container with ID starting with 11580f8ef050f626d9db7760e046ac4cde80130c4b133a6cf9e4cb7182eae310 not found: ID does not exist" Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.166342 5033 scope.go:117] "RemoveContainer" containerID="f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829" Feb 26 17:46:42 crc kubenswrapper[5033]: E0226 17:46:42.166699 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829\": container with ID starting with f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829 not found: ID does not exist" containerID="f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829" Feb 26 17:46:42 crc 
kubenswrapper[5033]: I0226 17:46:42.166753 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829"} err="failed to get container status \"f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829\": rpc error: code = NotFound desc = could not find container \"f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829\": container with ID starting with f6a23c1e23be904a0161ad9fc2deb6d3fbf57ae94672ce6f796fe88db8eaa829 not found: ID does not exist" Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.166784 5033 scope.go:117] "RemoveContainer" containerID="d6fb00be7c80882f2beb35d62e883314e1bdd04f8aa1968b61cf43de3590b5d4" Feb 26 17:46:42 crc kubenswrapper[5033]: E0226 17:46:42.167067 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6fb00be7c80882f2beb35d62e883314e1bdd04f8aa1968b61cf43de3590b5d4\": container with ID starting with d6fb00be7c80882f2beb35d62e883314e1bdd04f8aa1968b61cf43de3590b5d4 not found: ID does not exist" containerID="d6fb00be7c80882f2beb35d62e883314e1bdd04f8aa1968b61cf43de3590b5d4" Feb 26 17:46:42 crc kubenswrapper[5033]: I0226 17:46:42.167134 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6fb00be7c80882f2beb35d62e883314e1bdd04f8aa1968b61cf43de3590b5d4"} err="failed to get container status \"d6fb00be7c80882f2beb35d62e883314e1bdd04f8aa1968b61cf43de3590b5d4\": rpc error: code = NotFound desc = could not find container \"d6fb00be7c80882f2beb35d62e883314e1bdd04f8aa1968b61cf43de3590b5d4\": container with ID starting with d6fb00be7c80882f2beb35d62e883314e1bdd04f8aa1968b61cf43de3590b5d4 not found: ID does not exist" Feb 26 17:46:43 crc kubenswrapper[5033]: I0226 17:46:43.639185 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="396a3d54-6356-4056-aaa8-e011d8b51b28" 
path="/var/lib/kubelet/pods/396a3d54-6356-4056-aaa8-e011d8b51b28/volumes" Feb 26 17:46:44 crc kubenswrapper[5033]: I0226 17:46:44.098791 5033 generic.go:334] "Generic (PLEG): container finished" podID="64d8cd2d-5dbe-45e7-b844-339aa50577b6" containerID="b91272d74a0b94d77c2e07a3d5763080469824c744ae777fa8a406602ed77002" exitCode=0 Feb 26 17:46:44 crc kubenswrapper[5033]: I0226 17:46:44.098923 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-xndgp" event={"ID":"64d8cd2d-5dbe-45e7-b844-339aa50577b6","Type":"ContainerDied","Data":"b91272d74a0b94d77c2e07a3d5763080469824c744ae777fa8a406602ed77002"} Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.703007 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-xndgp" Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.786776 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ceph\") pod \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.787012 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ssh-key-openstack-cell1\") pod \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.787060 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4r2x\" (UniqueName: \"kubernetes.io/projected/64d8cd2d-5dbe-45e7-b844-339aa50577b6-kube-api-access-p4r2x\") pod \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.787128 
5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-inventory\") pod \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\" (UID: \"64d8cd2d-5dbe-45e7-b844-339aa50577b6\") " Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.795588 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ceph" (OuterVolumeSpecName: "ceph") pod "64d8cd2d-5dbe-45e7-b844-339aa50577b6" (UID: "64d8cd2d-5dbe-45e7-b844-339aa50577b6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.803231 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64d8cd2d-5dbe-45e7-b844-339aa50577b6-kube-api-access-p4r2x" (OuterVolumeSpecName: "kube-api-access-p4r2x") pod "64d8cd2d-5dbe-45e7-b844-339aa50577b6" (UID: "64d8cd2d-5dbe-45e7-b844-339aa50577b6"). InnerVolumeSpecName "kube-api-access-p4r2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.831960 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-inventory" (OuterVolumeSpecName: "inventory") pod "64d8cd2d-5dbe-45e7-b844-339aa50577b6" (UID: "64d8cd2d-5dbe-45e7-b844-339aa50577b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.838773 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "64d8cd2d-5dbe-45e7-b844-339aa50577b6" (UID: "64d8cd2d-5dbe-45e7-b844-339aa50577b6"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.889202 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.889238 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.889250 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4r2x\" (UniqueName: \"kubernetes.io/projected/64d8cd2d-5dbe-45e7-b844-339aa50577b6-kube-api-access-p4r2x\") on node \"crc\" DevicePath \"\"" Feb 26 17:46:45 crc kubenswrapper[5033]: I0226 17:46:45.889330 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64d8cd2d-5dbe-45e7-b844-339aa50577b6-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.174005 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-xndgp" event={"ID":"64d8cd2d-5dbe-45e7-b844-339aa50577b6","Type":"ContainerDied","Data":"b4dba22dad8a596533fdaa1606294ff7f6be7d158f3136bf243abdfb297ec25e"} Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.174065 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4dba22dad8a596533fdaa1606294ff7f6be7d158f3136bf243abdfb297ec25e" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.174091 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-xndgp" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.299005 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-xc82d"] Feb 26 17:46:46 crc kubenswrapper[5033]: E0226 17:46:46.299447 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396a3d54-6356-4056-aaa8-e011d8b51b28" containerName="registry-server" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.299464 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="396a3d54-6356-4056-aaa8-e011d8b51b28" containerName="registry-server" Feb 26 17:46:46 crc kubenswrapper[5033]: E0226 17:46:46.299477 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d8cd2d-5dbe-45e7-b844-339aa50577b6" containerName="download-cache-openstack-openstack-cell1" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.299483 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d8cd2d-5dbe-45e7-b844-339aa50577b6" containerName="download-cache-openstack-openstack-cell1" Feb 26 17:46:46 crc kubenswrapper[5033]: E0226 17:46:46.299499 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396a3d54-6356-4056-aaa8-e011d8b51b28" containerName="extract-utilities" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.299506 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="396a3d54-6356-4056-aaa8-e011d8b51b28" containerName="extract-utilities" Feb 26 17:46:46 crc kubenswrapper[5033]: E0226 17:46:46.299518 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396a3d54-6356-4056-aaa8-e011d8b51b28" containerName="extract-content" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.299524 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="396a3d54-6356-4056-aaa8-e011d8b51b28" containerName="extract-content" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.299714 5033 
memory_manager.go:354] "RemoveStaleState removing state" podUID="396a3d54-6356-4056-aaa8-e011d8b51b28" containerName="registry-server" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.299734 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="64d8cd2d-5dbe-45e7-b844-339aa50577b6" containerName="download-cache-openstack-openstack-cell1" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.300475 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.304842 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.308794 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.309017 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.315694 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.321075 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-xc82d"] Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.413663 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx8qx\" (UniqueName: \"kubernetes.io/projected/6d711842-6c9a-468e-842b-b44c019d0e13-kube-api-access-fx8qx\") pod \"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.413972 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.414279 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-inventory\") pod \"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.414367 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ceph\") pod \"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.516460 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx8qx\" (UniqueName: \"kubernetes.io/projected/6d711842-6c9a-468e-842b-b44c019d0e13-kube-api-access-fx8qx\") pod \"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.516518 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ssh-key-openstack-cell1\") pod 
\"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.516576 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-inventory\") pod \"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.516601 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ceph\") pod \"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.521305 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.521681 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-inventory\") pod \"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.522100 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ceph\") pod \"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.533183 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx8qx\" (UniqueName: \"kubernetes.io/projected/6d711842-6c9a-468e-842b-b44c019d0e13-kube-api-access-fx8qx\") pod \"configure-network-openstack-openstack-cell1-xc82d\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:46 crc kubenswrapper[5033]: I0226 17:46:46.650393 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:46:47 crc kubenswrapper[5033]: I0226 17:46:47.221282 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-xc82d"] Feb 26 17:46:47 crc kubenswrapper[5033]: W0226 17:46:47.223124 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d711842_6c9a_468e_842b_b44c019d0e13.slice/crio-ce7751e5356a4f19aa4b4802c485f70f61c16eb863c053c6ca2bbdd21f93bff1 WatchSource:0}: Error finding container ce7751e5356a4f19aa4b4802c485f70f61c16eb863c053c6ca2bbdd21f93bff1: Status 404 returned error can't find the container with id ce7751e5356a4f19aa4b4802c485f70f61c16eb863c053c6ca2bbdd21f93bff1 Feb 26 17:46:48 crc kubenswrapper[5033]: I0226 17:46:48.194489 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-xc82d" event={"ID":"6d711842-6c9a-468e-842b-b44c019d0e13","Type":"ContainerStarted","Data":"cef662e67616f4337317d5314570400aca7ab3934a83caf8a592a57b451d8918"} Feb 26 17:46:48 crc kubenswrapper[5033]: 
I0226 17:46:48.194766 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-xc82d" event={"ID":"6d711842-6c9a-468e-842b-b44c019d0e13","Type":"ContainerStarted","Data":"ce7751e5356a4f19aa4b4802c485f70f61c16eb863c053c6ca2bbdd21f93bff1"} Feb 26 17:46:48 crc kubenswrapper[5033]: I0226 17:46:48.215370 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-xc82d" podStartSLOduration=1.754956136 podStartE2EDuration="2.215348449s" podCreationTimestamp="2026-02-26 17:46:46 +0000 UTC" firstStartedPulling="2026-02-26 17:46:47.226330732 +0000 UTC m=+7534.438929382" lastFinishedPulling="2026-02-26 17:46:47.686723005 +0000 UTC m=+7534.899321695" observedRunningTime="2026-02-26 17:46:48.208512083 +0000 UTC m=+7535.421110733" watchObservedRunningTime="2026-02-26 17:46:48.215348449 +0000 UTC m=+7535.427947099" Feb 26 17:47:20 crc kubenswrapper[5033]: I0226 17:47:20.189044 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:47:20 crc kubenswrapper[5033]: I0226 17:47:20.189670 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:47:50 crc kubenswrapper[5033]: I0226 17:47:50.189763 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 26 17:47:50 crc kubenswrapper[5033]: I0226 17:47:50.190421 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:48:00 crc kubenswrapper[5033]: I0226 17:48:00.152104 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535468-mtn4s"] Feb 26 17:48:00 crc kubenswrapper[5033]: I0226 17:48:00.154368 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535468-mtn4s" Feb 26 17:48:00 crc kubenswrapper[5033]: I0226 17:48:00.157035 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:48:00 crc kubenswrapper[5033]: I0226 17:48:00.158153 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:48:00 crc kubenswrapper[5033]: I0226 17:48:00.158244 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:48:00 crc kubenswrapper[5033]: I0226 17:48:00.165127 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535468-mtn4s"] Feb 26 17:48:00 crc kubenswrapper[5033]: I0226 17:48:00.225743 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvxrx\" (UniqueName: \"kubernetes.io/projected/4162b533-4350-485d-a6b7-8e2a3920ee38-kube-api-access-bvxrx\") pod \"auto-csr-approver-29535468-mtn4s\" (UID: \"4162b533-4350-485d-a6b7-8e2a3920ee38\") " pod="openshift-infra/auto-csr-approver-29535468-mtn4s" Feb 26 17:48:00 crc kubenswrapper[5033]: I0226 
17:48:00.328870 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvxrx\" (UniqueName: \"kubernetes.io/projected/4162b533-4350-485d-a6b7-8e2a3920ee38-kube-api-access-bvxrx\") pod \"auto-csr-approver-29535468-mtn4s\" (UID: \"4162b533-4350-485d-a6b7-8e2a3920ee38\") " pod="openshift-infra/auto-csr-approver-29535468-mtn4s" Feb 26 17:48:00 crc kubenswrapper[5033]: I0226 17:48:00.354085 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvxrx\" (UniqueName: \"kubernetes.io/projected/4162b533-4350-485d-a6b7-8e2a3920ee38-kube-api-access-bvxrx\") pod \"auto-csr-approver-29535468-mtn4s\" (UID: \"4162b533-4350-485d-a6b7-8e2a3920ee38\") " pod="openshift-infra/auto-csr-approver-29535468-mtn4s" Feb 26 17:48:00 crc kubenswrapper[5033]: I0226 17:48:00.482391 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535468-mtn4s" Feb 26 17:48:00 crc kubenswrapper[5033]: I0226 17:48:00.962543 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535468-mtn4s"] Feb 26 17:48:01 crc kubenswrapper[5033]: I0226 17:48:01.027068 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535468-mtn4s" event={"ID":"4162b533-4350-485d-a6b7-8e2a3920ee38","Type":"ContainerStarted","Data":"c985cc8816e2f591e6cae8691a989de6da9aaa51ac5b3cdf0fcb9e4aa77ed71a"} Feb 26 17:48:03 crc kubenswrapper[5033]: I0226 17:48:03.055477 5033 generic.go:334] "Generic (PLEG): container finished" podID="4162b533-4350-485d-a6b7-8e2a3920ee38" containerID="8ce7faf7b3a6b59b15b1d48be427d3f4ea35efcacedb40bb3171904b0bcb0feb" exitCode=0 Feb 26 17:48:03 crc kubenswrapper[5033]: I0226 17:48:03.055529 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535468-mtn4s" 
event={"ID":"4162b533-4350-485d-a6b7-8e2a3920ee38","Type":"ContainerDied","Data":"8ce7faf7b3a6b59b15b1d48be427d3f4ea35efcacedb40bb3171904b0bcb0feb"} Feb 26 17:48:04 crc kubenswrapper[5033]: I0226 17:48:04.496516 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535468-mtn4s" Feb 26 17:48:04 crc kubenswrapper[5033]: I0226 17:48:04.645677 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvxrx\" (UniqueName: \"kubernetes.io/projected/4162b533-4350-485d-a6b7-8e2a3920ee38-kube-api-access-bvxrx\") pod \"4162b533-4350-485d-a6b7-8e2a3920ee38\" (UID: \"4162b533-4350-485d-a6b7-8e2a3920ee38\") " Feb 26 17:48:04 crc kubenswrapper[5033]: I0226 17:48:04.652865 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4162b533-4350-485d-a6b7-8e2a3920ee38-kube-api-access-bvxrx" (OuterVolumeSpecName: "kube-api-access-bvxrx") pod "4162b533-4350-485d-a6b7-8e2a3920ee38" (UID: "4162b533-4350-485d-a6b7-8e2a3920ee38"). InnerVolumeSpecName "kube-api-access-bvxrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:48:04 crc kubenswrapper[5033]: I0226 17:48:04.749136 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvxrx\" (UniqueName: \"kubernetes.io/projected/4162b533-4350-485d-a6b7-8e2a3920ee38-kube-api-access-bvxrx\") on node \"crc\" DevicePath \"\"" Feb 26 17:48:05 crc kubenswrapper[5033]: I0226 17:48:05.080545 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535468-mtn4s" event={"ID":"4162b533-4350-485d-a6b7-8e2a3920ee38","Type":"ContainerDied","Data":"c985cc8816e2f591e6cae8691a989de6da9aaa51ac5b3cdf0fcb9e4aa77ed71a"} Feb 26 17:48:05 crc kubenswrapper[5033]: I0226 17:48:05.080586 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c985cc8816e2f591e6cae8691a989de6da9aaa51ac5b3cdf0fcb9e4aa77ed71a" Feb 26 17:48:05 crc kubenswrapper[5033]: I0226 17:48:05.080633 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535468-mtn4s" Feb 26 17:48:05 crc kubenswrapper[5033]: I0226 17:48:05.573374 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535462-k6jww"] Feb 26 17:48:05 crc kubenswrapper[5033]: I0226 17:48:05.585052 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535462-k6jww"] Feb 26 17:48:05 crc kubenswrapper[5033]: I0226 17:48:05.642141 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd60cc72-b742-4acc-ac35-b8aee208b9fb" path="/var/lib/kubelet/pods/bd60cc72-b742-4acc-ac35-b8aee208b9fb/volumes" Feb 26 17:48:12 crc kubenswrapper[5033]: I0226 17:48:12.156797 5033 generic.go:334] "Generic (PLEG): container finished" podID="6d711842-6c9a-468e-842b-b44c019d0e13" containerID="cef662e67616f4337317d5314570400aca7ab3934a83caf8a592a57b451d8918" exitCode=0 Feb 26 17:48:12 crc kubenswrapper[5033]: I0226 17:48:12.156841 5033 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-xc82d" event={"ID":"6d711842-6c9a-468e-842b-b44c019d0e13","Type":"ContainerDied","Data":"cef662e67616f4337317d5314570400aca7ab3934a83caf8a592a57b451d8918"} Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.653543 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.747792 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-inventory\") pod \"6d711842-6c9a-468e-842b-b44c019d0e13\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.747889 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ceph\") pod \"6d711842-6c9a-468e-842b-b44c019d0e13\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.748027 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx8qx\" (UniqueName: \"kubernetes.io/projected/6d711842-6c9a-468e-842b-b44c019d0e13-kube-api-access-fx8qx\") pod \"6d711842-6c9a-468e-842b-b44c019d0e13\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.748275 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ssh-key-openstack-cell1\") pod \"6d711842-6c9a-468e-842b-b44c019d0e13\" (UID: \"6d711842-6c9a-468e-842b-b44c019d0e13\") " Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.753794 5033 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ceph" (OuterVolumeSpecName: "ceph") pod "6d711842-6c9a-468e-842b-b44c019d0e13" (UID: "6d711842-6c9a-468e-842b-b44c019d0e13"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.753833 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d711842-6c9a-468e-842b-b44c019d0e13-kube-api-access-fx8qx" (OuterVolumeSpecName: "kube-api-access-fx8qx") pod "6d711842-6c9a-468e-842b-b44c019d0e13" (UID: "6d711842-6c9a-468e-842b-b44c019d0e13"). InnerVolumeSpecName "kube-api-access-fx8qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.789237 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "6d711842-6c9a-468e-842b-b44c019d0e13" (UID: "6d711842-6c9a-468e-842b-b44c019d0e13"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.792898 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-inventory" (OuterVolumeSpecName: "inventory") pod "6d711842-6c9a-468e-842b-b44c019d0e13" (UID: "6d711842-6c9a-468e-842b-b44c019d0e13"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.851113 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx8qx\" (UniqueName: \"kubernetes.io/projected/6d711842-6c9a-468e-842b-b44c019d0e13-kube-api-access-fx8qx\") on node \"crc\" DevicePath \"\"" Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.851144 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.851154 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 17:48:13 crc kubenswrapper[5033]: I0226 17:48:13.851164 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d711842-6c9a-468e-842b-b44c019d0e13-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.180850 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-xc82d" event={"ID":"6d711842-6c9a-468e-842b-b44c019d0e13","Type":"ContainerDied","Data":"ce7751e5356a4f19aa4b4802c485f70f61c16eb863c053c6ca2bbdd21f93bff1"} Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.181111 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce7751e5356a4f19aa4b4802c485f70f61c16eb863c053c6ca2bbdd21f93bff1" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.180992 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-xc82d" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.298851 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-mm9vg"] Feb 26 17:48:14 crc kubenswrapper[5033]: E0226 17:48:14.299307 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d711842-6c9a-468e-842b-b44c019d0e13" containerName="configure-network-openstack-openstack-cell1" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.299322 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d711842-6c9a-468e-842b-b44c019d0e13" containerName="configure-network-openstack-openstack-cell1" Feb 26 17:48:14 crc kubenswrapper[5033]: E0226 17:48:14.299333 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4162b533-4350-485d-a6b7-8e2a3920ee38" containerName="oc" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.299341 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="4162b533-4350-485d-a6b7-8e2a3920ee38" containerName="oc" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.299549 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="4162b533-4350-485d-a6b7-8e2a3920ee38" containerName="oc" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.299587 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d711842-6c9a-468e-842b-b44c019d0e13" containerName="configure-network-openstack-openstack-cell1" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.300351 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.302829 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.303067 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.303444 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.305744 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.315105 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-mm9vg"] Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.361932 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-inventory\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.361990 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59npv\" (UniqueName: \"kubernetes.io/projected/32f43285-acd0-474c-920f-04f59f060fb7-kube-api-access-59npv\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.362014 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ceph\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.362176 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.464738 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-inventory\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.464835 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59npv\" (UniqueName: \"kubernetes.io/projected/32f43285-acd0-474c-920f-04f59f060fb7-kube-api-access-59npv\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.464866 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ceph\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " 
pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.464919 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.468547 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.469715 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-inventory\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.474390 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ceph\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.484519 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59npv\" (UniqueName: 
\"kubernetes.io/projected/32f43285-acd0-474c-920f-04f59f060fb7-kube-api-access-59npv\") pod \"validate-network-openstack-openstack-cell1-mm9vg\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:14 crc kubenswrapper[5033]: I0226 17:48:14.633673 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:15 crc kubenswrapper[5033]: I0226 17:48:15.198637 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-mm9vg"] Feb 26 17:48:15 crc kubenswrapper[5033]: W0226 17:48:15.206664 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32f43285_acd0_474c_920f_04f59f060fb7.slice/crio-c42ef8ec70b862416c78e4b110d6cee99c8e000ba2b1845f0b8e143b9300392f WatchSource:0}: Error finding container c42ef8ec70b862416c78e4b110d6cee99c8e000ba2b1845f0b8e143b9300392f: Status 404 returned error can't find the container with id c42ef8ec70b862416c78e4b110d6cee99c8e000ba2b1845f0b8e143b9300392f Feb 26 17:48:16 crc kubenswrapper[5033]: I0226 17:48:16.199647 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" event={"ID":"32f43285-acd0-474c-920f-04f59f060fb7","Type":"ContainerStarted","Data":"a461497692d50055673e32db29254684d63433d02a07a0fa78fd0492192f0106"} Feb 26 17:48:16 crc kubenswrapper[5033]: I0226 17:48:16.200172 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" event={"ID":"32f43285-acd0-474c-920f-04f59f060fb7","Type":"ContainerStarted","Data":"c42ef8ec70b862416c78e4b110d6cee99c8e000ba2b1845f0b8e143b9300392f"} Feb 26 17:48:16 crc kubenswrapper[5033]: I0226 17:48:16.218636 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" podStartSLOduration=1.69433311 podStartE2EDuration="2.218616109s" podCreationTimestamp="2026-02-26 17:48:14 +0000 UTC" firstStartedPulling="2026-02-26 17:48:15.210067075 +0000 UTC m=+7622.422665745" lastFinishedPulling="2026-02-26 17:48:15.734350084 +0000 UTC m=+7622.946948744" observedRunningTime="2026-02-26 17:48:16.21293625 +0000 UTC m=+7623.425534910" watchObservedRunningTime="2026-02-26 17:48:16.218616109 +0000 UTC m=+7623.431214759" Feb 26 17:48:20 crc kubenswrapper[5033]: I0226 17:48:20.188740 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:48:20 crc kubenswrapper[5033]: I0226 17:48:20.189224 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:48:20 crc kubenswrapper[5033]: I0226 17:48:20.189287 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 17:48:20 crc kubenswrapper[5033]: I0226 17:48:20.189820 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81832189e9ae9c1cde20ad800ba615a86bf39dd0d605e4f89a7bd4c6db781306"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 17:48:20 crc kubenswrapper[5033]: I0226 17:48:20.189872 5033 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://81832189e9ae9c1cde20ad800ba615a86bf39dd0d605e4f89a7bd4c6db781306" gracePeriod=600 Feb 26 17:48:21 crc kubenswrapper[5033]: I0226 17:48:21.258427 5033 generic.go:334] "Generic (PLEG): container finished" podID="32f43285-acd0-474c-920f-04f59f060fb7" containerID="a461497692d50055673e32db29254684d63433d02a07a0fa78fd0492192f0106" exitCode=0 Feb 26 17:48:21 crc kubenswrapper[5033]: I0226 17:48:21.258710 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" event={"ID":"32f43285-acd0-474c-920f-04f59f060fb7","Type":"ContainerDied","Data":"a461497692d50055673e32db29254684d63433d02a07a0fa78fd0492192f0106"} Feb 26 17:48:21 crc kubenswrapper[5033]: I0226 17:48:21.263928 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="81832189e9ae9c1cde20ad800ba615a86bf39dd0d605e4f89a7bd4c6db781306" exitCode=0 Feb 26 17:48:21 crc kubenswrapper[5033]: I0226 17:48:21.263984 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"81832189e9ae9c1cde20ad800ba615a86bf39dd0d605e4f89a7bd4c6db781306"} Feb 26 17:48:21 crc kubenswrapper[5033]: I0226 17:48:21.264018 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"} Feb 26 17:48:21 crc kubenswrapper[5033]: I0226 17:48:21.264040 5033 scope.go:117] "RemoveContainer" containerID="4efc91a316e4553d79e06b47deeba04312843519f06c5c934d9407649ac1a36b" Feb 26 
17:48:22 crc kubenswrapper[5033]: I0226 17:48:22.796765 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:22 crc kubenswrapper[5033]: I0226 17:48:22.955242 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59npv\" (UniqueName: \"kubernetes.io/projected/32f43285-acd0-474c-920f-04f59f060fb7-kube-api-access-59npv\") pod \"32f43285-acd0-474c-920f-04f59f060fb7\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " Feb 26 17:48:22 crc kubenswrapper[5033]: I0226 17:48:22.955426 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ssh-key-openstack-cell1\") pod \"32f43285-acd0-474c-920f-04f59f060fb7\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " Feb 26 17:48:22 crc kubenswrapper[5033]: I0226 17:48:22.955544 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-inventory\") pod \"32f43285-acd0-474c-920f-04f59f060fb7\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " Feb 26 17:48:22 crc kubenswrapper[5033]: I0226 17:48:22.955566 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ceph\") pod \"32f43285-acd0-474c-920f-04f59f060fb7\" (UID: \"32f43285-acd0-474c-920f-04f59f060fb7\") " Feb 26 17:48:22 crc kubenswrapper[5033]: I0226 17:48:22.966667 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ceph" (OuterVolumeSpecName: "ceph") pod "32f43285-acd0-474c-920f-04f59f060fb7" (UID: "32f43285-acd0-474c-920f-04f59f060fb7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:48:22 crc kubenswrapper[5033]: I0226 17:48:22.966736 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f43285-acd0-474c-920f-04f59f060fb7-kube-api-access-59npv" (OuterVolumeSpecName: "kube-api-access-59npv") pod "32f43285-acd0-474c-920f-04f59f060fb7" (UID: "32f43285-acd0-474c-920f-04f59f060fb7"). InnerVolumeSpecName "kube-api-access-59npv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:48:22 crc kubenswrapper[5033]: I0226 17:48:22.994760 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "32f43285-acd0-474c-920f-04f59f060fb7" (UID: "32f43285-acd0-474c-920f-04f59f060fb7"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.022932 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-inventory" (OuterVolumeSpecName: "inventory") pod "32f43285-acd0-474c-920f-04f59f060fb7" (UID: "32f43285-acd0-474c-920f-04f59f060fb7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.058134 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.058164 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.058178 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59npv\" (UniqueName: \"kubernetes.io/projected/32f43285-acd0-474c-920f-04f59f060fb7-kube-api-access-59npv\") on node \"crc\" DevicePath \"\"" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.058192 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/32f43285-acd0-474c-920f-04f59f060fb7-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.292312 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" event={"ID":"32f43285-acd0-474c-920f-04f59f060fb7","Type":"ContainerDied","Data":"c42ef8ec70b862416c78e4b110d6cee99c8e000ba2b1845f0b8e143b9300392f"} Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.292621 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c42ef8ec70b862416c78e4b110d6cee99c8e000ba2b1845f0b8e143b9300392f" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.292534 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-mm9vg" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.367487 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-bsrfn"] Feb 26 17:48:23 crc kubenswrapper[5033]: E0226 17:48:23.367937 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f43285-acd0-474c-920f-04f59f060fb7" containerName="validate-network-openstack-openstack-cell1" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.367958 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f43285-acd0-474c-920f-04f59f060fb7" containerName="validate-network-openstack-openstack-cell1" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.368164 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f43285-acd0-474c-920f-04f59f060fb7" containerName="validate-network-openstack-openstack-cell1" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.368986 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.371718 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.371759 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.371765 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.373682 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.392808 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-bsrfn"] Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.465671 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ceph\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.465726 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxgdl\" (UniqueName: \"kubernetes.io/projected/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-kube-api-access-bxgdl\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.465752 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.466067 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-inventory\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.568339 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-inventory\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.568565 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ceph\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.568616 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxgdl\" (UniqueName: \"kubernetes.io/projected/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-kube-api-access-bxgdl\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 
17:48:23.568641 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.573421 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ceph\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.575920 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.578057 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-inventory\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.588873 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxgdl\" (UniqueName: \"kubernetes.io/projected/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-kube-api-access-bxgdl\") pod \"install-os-openstack-openstack-cell1-bsrfn\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " 
pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:23 crc kubenswrapper[5033]: I0226 17:48:23.695888 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:48:24 crc kubenswrapper[5033]: I0226 17:48:24.348241 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-bsrfn"] Feb 26 17:48:24 crc kubenswrapper[5033]: W0226 17:48:24.350913 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa523cf5_af6f_4c44_8d8e_0016d52e3db7.slice/crio-a1b1919846f341ba0ae734260bf40c35d0d4241725a323a25411b94762aea30e WatchSource:0}: Error finding container a1b1919846f341ba0ae734260bf40c35d0d4241725a323a25411b94762aea30e: Status 404 returned error can't find the container with id a1b1919846f341ba0ae734260bf40c35d0d4241725a323a25411b94762aea30e Feb 26 17:48:25 crc kubenswrapper[5033]: I0226 17:48:25.315200 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-bsrfn" event={"ID":"fa523cf5-af6f-4c44-8d8e-0016d52e3db7","Type":"ContainerStarted","Data":"ad2df13f023bfca1137045d92ce2fd3e922bfb90c33616342fe32e3197fe4d7c"} Feb 26 17:48:25 crc kubenswrapper[5033]: I0226 17:48:25.315599 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-bsrfn" event={"ID":"fa523cf5-af6f-4c44-8d8e-0016d52e3db7","Type":"ContainerStarted","Data":"a1b1919846f341ba0ae734260bf40c35d0d4241725a323a25411b94762aea30e"} Feb 26 17:48:25 crc kubenswrapper[5033]: I0226 17:48:25.339166 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-bsrfn" podStartSLOduration=1.8088883340000002 podStartE2EDuration="2.339139618s" podCreationTimestamp="2026-02-26 17:48:23 +0000 UTC" firstStartedPulling="2026-02-26 
17:48:24.354459436 +0000 UTC m=+7631.567058086" lastFinishedPulling="2026-02-26 17:48:24.88471071 +0000 UTC m=+7632.097309370" observedRunningTime="2026-02-26 17:48:25.32978772 +0000 UTC m=+7632.542386380" watchObservedRunningTime="2026-02-26 17:48:25.339139618 +0000 UTC m=+7632.551738268" Feb 26 17:48:42 crc kubenswrapper[5033]: I0226 17:48:42.178180 5033 scope.go:117] "RemoveContainer" containerID="a77786ff81d9c85ac1f76dbc6b708498a28501206de1d52ebe35a1dbf1fef858" Feb 26 17:49:11 crc kubenswrapper[5033]: I0226 17:49:11.810161 5033 generic.go:334] "Generic (PLEG): container finished" podID="fa523cf5-af6f-4c44-8d8e-0016d52e3db7" containerID="ad2df13f023bfca1137045d92ce2fd3e922bfb90c33616342fe32e3197fe4d7c" exitCode=0 Feb 26 17:49:11 crc kubenswrapper[5033]: I0226 17:49:11.810248 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-bsrfn" event={"ID":"fa523cf5-af6f-4c44-8d8e-0016d52e3db7","Type":"ContainerDied","Data":"ad2df13f023bfca1137045d92ce2fd3e922bfb90c33616342fe32e3197fe4d7c"} Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.402527 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.441058 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-inventory\") pod \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.441561 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxgdl\" (UniqueName: \"kubernetes.io/projected/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-kube-api-access-bxgdl\") pod \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.441685 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ssh-key-openstack-cell1\") pod \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.441852 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ceph\") pod \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\" (UID: \"fa523cf5-af6f-4c44-8d8e-0016d52e3db7\") " Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.456960 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ceph" (OuterVolumeSpecName: "ceph") pod "fa523cf5-af6f-4c44-8d8e-0016d52e3db7" (UID: "fa523cf5-af6f-4c44-8d8e-0016d52e3db7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.457653 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-kube-api-access-bxgdl" (OuterVolumeSpecName: "kube-api-access-bxgdl") pod "fa523cf5-af6f-4c44-8d8e-0016d52e3db7" (UID: "fa523cf5-af6f-4c44-8d8e-0016d52e3db7"). InnerVolumeSpecName "kube-api-access-bxgdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.508481 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-inventory" (OuterVolumeSpecName: "inventory") pod "fa523cf5-af6f-4c44-8d8e-0016d52e3db7" (UID: "fa523cf5-af6f-4c44-8d8e-0016d52e3db7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.508535 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "fa523cf5-af6f-4c44-8d8e-0016d52e3db7" (UID: "fa523cf5-af6f-4c44-8d8e-0016d52e3db7"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.546964 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.547001 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxgdl\" (UniqueName: \"kubernetes.io/projected/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-kube-api-access-bxgdl\") on node \"crc\" DevicePath \"\"" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.547015 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.547026 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa523cf5-af6f-4c44-8d8e-0016d52e3db7-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.833800 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-bsrfn" event={"ID":"fa523cf5-af6f-4c44-8d8e-0016d52e3db7","Type":"ContainerDied","Data":"a1b1919846f341ba0ae734260bf40c35d0d4241725a323a25411b94762aea30e"} Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.834055 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1b1919846f341ba0ae734260bf40c35d0d4241725a323a25411b94762aea30e" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.834066 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-bsrfn" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.930075 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-rzn9b"] Feb 26 17:49:13 crc kubenswrapper[5033]: E0226 17:49:13.930604 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa523cf5-af6f-4c44-8d8e-0016d52e3db7" containerName="install-os-openstack-openstack-cell1" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.930627 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa523cf5-af6f-4c44-8d8e-0016d52e3db7" containerName="install-os-openstack-openstack-cell1" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.930887 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa523cf5-af6f-4c44-8d8e-0016d52e3db7" containerName="install-os-openstack-openstack-cell1" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.931887 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.935174 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.935475 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.935631 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.935991 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.951083 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-rzn9b"] Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.953371 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ceph\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.953414 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-inventory\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.953672 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vsb2\" 
(UniqueName: \"kubernetes.io/projected/2c298a16-67d9-45a7-bcef-322883b4030d-kube-api-access-2vsb2\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:13 crc kubenswrapper[5033]: I0226 17:49:13.953840 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:14 crc kubenswrapper[5033]: I0226 17:49:14.055646 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ceph\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:14 crc kubenswrapper[5033]: I0226 17:49:14.055690 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-inventory\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:14 crc kubenswrapper[5033]: I0226 17:49:14.055771 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vsb2\" (UniqueName: \"kubernetes.io/projected/2c298a16-67d9-45a7-bcef-322883b4030d-kube-api-access-2vsb2\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:14 crc 
kubenswrapper[5033]: I0226 17:49:14.055816 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:14 crc kubenswrapper[5033]: I0226 17:49:14.059621 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-inventory\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:14 crc kubenswrapper[5033]: I0226 17:49:14.062159 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:14 crc kubenswrapper[5033]: I0226 17:49:14.062184 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ceph\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:14 crc kubenswrapper[5033]: I0226 17:49:14.075006 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vsb2\" (UniqueName: \"kubernetes.io/projected/2c298a16-67d9-45a7-bcef-322883b4030d-kube-api-access-2vsb2\") pod \"configure-os-openstack-openstack-cell1-rzn9b\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") " 
pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:14 crc kubenswrapper[5033]: I0226 17:49:14.252729 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" Feb 26 17:49:14 crc kubenswrapper[5033]: I0226 17:49:14.804537 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-rzn9b"] Feb 26 17:49:14 crc kubenswrapper[5033]: I0226 17:49:14.812737 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 17:49:14 crc kubenswrapper[5033]: I0226 17:49:14.845189 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" event={"ID":"2c298a16-67d9-45a7-bcef-322883b4030d","Type":"ContainerStarted","Data":"be4905709bb67f2ec920ab81768764c7e38bff112ef6249face909ec212c72bd"} Feb 26 17:49:15 crc kubenswrapper[5033]: I0226 17:49:15.858208 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" event={"ID":"2c298a16-67d9-45a7-bcef-322883b4030d","Type":"ContainerStarted","Data":"249cf77e610ad5984e1ce09d42178a5a2c7b6eea422131f58cdf6c273718715c"} Feb 26 17:49:15 crc kubenswrapper[5033]: I0226 17:49:15.892554 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" podStartSLOduration=2.312607092 podStartE2EDuration="2.892525965s" podCreationTimestamp="2026-02-26 17:49:13 +0000 UTC" firstStartedPulling="2026-02-26 17:49:14.81247207 +0000 UTC m=+7682.025070720" lastFinishedPulling="2026-02-26 17:49:15.392390913 +0000 UTC m=+7682.604989593" observedRunningTime="2026-02-26 17:49:15.875171922 +0000 UTC m=+7683.087770572" watchObservedRunningTime="2026-02-26 17:49:15.892525965 +0000 UTC m=+7683.105124615" Feb 26 17:49:17 crc kubenswrapper[5033]: E0226 17:49:17.667685 5033 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa523cf5_af6f_4c44_8d8e_0016d52e3db7.slice\": RecentStats: unable to find data in memory cache]" Feb 26 17:49:27 crc kubenswrapper[5033]: E0226 17:49:27.966176 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa523cf5_af6f_4c44_8d8e_0016d52e3db7.slice\": RecentStats: unable to find data in memory cache]" Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.149947 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8bnfm"] Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.168088 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8bnfm"] Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.168195 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.276095 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rpdm\" (UniqueName: \"kubernetes.io/projected/67ee9623-58b8-4d40-9eb5-aaac246efd08-kube-api-access-7rpdm\") pod \"certified-operators-8bnfm\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.276575 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-utilities\") pod \"certified-operators-8bnfm\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.276773 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-catalog-content\") pod \"certified-operators-8bnfm\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.378408 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-utilities\") pod \"certified-operators-8bnfm\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.378860 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-catalog-content\") pod 
\"certified-operators-8bnfm\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.379005 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rpdm\" (UniqueName: \"kubernetes.io/projected/67ee9623-58b8-4d40-9eb5-aaac246efd08-kube-api-access-7rpdm\") pod \"certified-operators-8bnfm\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.378890 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-utilities\") pod \"certified-operators-8bnfm\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.379559 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-catalog-content\") pod \"certified-operators-8bnfm\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.401766 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rpdm\" (UniqueName: \"kubernetes.io/projected/67ee9623-58b8-4d40-9eb5-aaac246efd08-kube-api-access-7rpdm\") pod \"certified-operators-8bnfm\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:32 crc kubenswrapper[5033]: I0226 17:49:32.499846 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:33 crc kubenswrapper[5033]: I0226 17:49:33.007610 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8bnfm"] Feb 26 17:49:33 crc kubenswrapper[5033]: W0226 17:49:33.018072 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ee9623_58b8_4d40_9eb5_aaac246efd08.slice/crio-9e8de5319a369398dda1a7536874bf43a24c717859d95c8259fef461cb861f60 WatchSource:0}: Error finding container 9e8de5319a369398dda1a7536874bf43a24c717859d95c8259fef461cb861f60: Status 404 returned error can't find the container with id 9e8de5319a369398dda1a7536874bf43a24c717859d95c8259fef461cb861f60 Feb 26 17:49:33 crc kubenswrapper[5033]: I0226 17:49:33.065194 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bnfm" event={"ID":"67ee9623-58b8-4d40-9eb5-aaac246efd08","Type":"ContainerStarted","Data":"9e8de5319a369398dda1a7536874bf43a24c717859d95c8259fef461cb861f60"} Feb 26 17:49:34 crc kubenswrapper[5033]: I0226 17:49:34.077808 5033 generic.go:334] "Generic (PLEG): container finished" podID="67ee9623-58b8-4d40-9eb5-aaac246efd08" containerID="302464f8a39d5faa41bb3d0a1ab48e16a3fceb19ffea05d26c58ee9362debe03" exitCode=0 Feb 26 17:49:34 crc kubenswrapper[5033]: I0226 17:49:34.077874 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bnfm" event={"ID":"67ee9623-58b8-4d40-9eb5-aaac246efd08","Type":"ContainerDied","Data":"302464f8a39d5faa41bb3d0a1ab48e16a3fceb19ffea05d26c58ee9362debe03"} Feb 26 17:49:35 crc kubenswrapper[5033]: I0226 17:49:35.097382 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bnfm" 
event={"ID":"67ee9623-58b8-4d40-9eb5-aaac246efd08","Type":"ContainerStarted","Data":"d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf"} Feb 26 17:49:38 crc kubenswrapper[5033]: E0226 17:49:38.307144 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa523cf5_af6f_4c44_8d8e_0016d52e3db7.slice\": RecentStats: unable to find data in memory cache]" Feb 26 17:49:38 crc kubenswrapper[5033]: I0226 17:49:38.691470 5033 generic.go:334] "Generic (PLEG): container finished" podID="67ee9623-58b8-4d40-9eb5-aaac246efd08" containerID="d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf" exitCode=0 Feb 26 17:49:38 crc kubenswrapper[5033]: I0226 17:49:38.691575 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bnfm" event={"ID":"67ee9623-58b8-4d40-9eb5-aaac246efd08","Type":"ContainerDied","Data":"d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf"} Feb 26 17:49:39 crc kubenswrapper[5033]: I0226 17:49:39.708371 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bnfm" event={"ID":"67ee9623-58b8-4d40-9eb5-aaac246efd08","Type":"ContainerStarted","Data":"f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf"} Feb 26 17:49:39 crc kubenswrapper[5033]: I0226 17:49:39.739725 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8bnfm" podStartSLOduration=2.643000924 podStartE2EDuration="7.739700463s" podCreationTimestamp="2026-02-26 17:49:32 +0000 UTC" firstStartedPulling="2026-02-26 17:49:34.082209545 +0000 UTC m=+7701.294808215" lastFinishedPulling="2026-02-26 17:49:39.178909104 +0000 UTC m=+7706.391507754" observedRunningTime="2026-02-26 17:49:39.727383273 +0000 UTC m=+7706.939981953" watchObservedRunningTime="2026-02-26 17:49:39.739700463 
+0000 UTC m=+7706.952299133" Feb 26 17:49:42 crc kubenswrapper[5033]: I0226 17:49:42.271840 5033 scope.go:117] "RemoveContainer" containerID="be931a750483d744871055d1b70378e16958b73898b99eee3831f74303ab04e2" Feb 26 17:49:42 crc kubenswrapper[5033]: I0226 17:49:42.303643 5033 scope.go:117] "RemoveContainer" containerID="1edde2a7a9a72b38e5616f5cc00d2d1a3f817ca2807324dd6068ecd17c3e3cae" Feb 26 17:49:42 crc kubenswrapper[5033]: I0226 17:49:42.348144 5033 scope.go:117] "RemoveContainer" containerID="02a27cda7eae09f1901b0228f18072a405e18cd1b5a7608721b4cc2500620ff3" Feb 26 17:49:42 crc kubenswrapper[5033]: I0226 17:49:42.500360 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:42 crc kubenswrapper[5033]: I0226 17:49:42.500724 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:42 crc kubenswrapper[5033]: I0226 17:49:42.557837 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:48 crc kubenswrapper[5033]: E0226 17:49:48.612974 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa523cf5_af6f_4c44_8d8e_0016d52e3db7.slice\": RecentStats: unable to find data in memory cache]" Feb 26 17:49:52 crc kubenswrapper[5033]: I0226 17:49:52.586832 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:55 crc kubenswrapper[5033]: I0226 17:49:55.899718 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8bnfm"] Feb 26 17:49:55 crc kubenswrapper[5033]: I0226 17:49:55.900895 5033 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-8bnfm" podUID="67ee9623-58b8-4d40-9eb5-aaac246efd08" containerName="registry-server" containerID="cri-o://f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf" gracePeriod=2 Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.462486 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.586344 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-utilities\") pod \"67ee9623-58b8-4d40-9eb5-aaac246efd08\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.586818 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-catalog-content\") pod \"67ee9623-58b8-4d40-9eb5-aaac246efd08\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.587230 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rpdm\" (UniqueName: \"kubernetes.io/projected/67ee9623-58b8-4d40-9eb5-aaac246efd08-kube-api-access-7rpdm\") pod \"67ee9623-58b8-4d40-9eb5-aaac246efd08\" (UID: \"67ee9623-58b8-4d40-9eb5-aaac246efd08\") " Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.587331 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-utilities" (OuterVolumeSpecName: "utilities") pod "67ee9623-58b8-4d40-9eb5-aaac246efd08" (UID: "67ee9623-58b8-4d40-9eb5-aaac246efd08"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.588457 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.599665 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67ee9623-58b8-4d40-9eb5-aaac246efd08-kube-api-access-7rpdm" (OuterVolumeSpecName: "kube-api-access-7rpdm") pod "67ee9623-58b8-4d40-9eb5-aaac246efd08" (UID: "67ee9623-58b8-4d40-9eb5-aaac246efd08"). InnerVolumeSpecName "kube-api-access-7rpdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.663076 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67ee9623-58b8-4d40-9eb5-aaac246efd08" (UID: "67ee9623-58b8-4d40-9eb5-aaac246efd08"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.690322 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rpdm\" (UniqueName: \"kubernetes.io/projected/67ee9623-58b8-4d40-9eb5-aaac246efd08-kube-api-access-7rpdm\") on node \"crc\" DevicePath \"\"" Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.690365 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67ee9623-58b8-4d40-9eb5-aaac246efd08-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.916374 5033 generic.go:334] "Generic (PLEG): container finished" podID="67ee9623-58b8-4d40-9eb5-aaac246efd08" containerID="f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf" exitCode=0 Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.916421 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bnfm" event={"ID":"67ee9623-58b8-4d40-9eb5-aaac246efd08","Type":"ContainerDied","Data":"f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf"} Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.916478 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bnfm" event={"ID":"67ee9623-58b8-4d40-9eb5-aaac246efd08","Type":"ContainerDied","Data":"9e8de5319a369398dda1a7536874bf43a24c717859d95c8259fef461cb861f60"} Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.916499 5033 scope.go:117] "RemoveContainer" containerID="f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf" Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.916519 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8bnfm" Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.946827 5033 scope.go:117] "RemoveContainer" containerID="d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf" Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.972629 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8bnfm"] Feb 26 17:49:56 crc kubenswrapper[5033]: I0226 17:49:56.984415 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8bnfm"] Feb 26 17:49:57 crc kubenswrapper[5033]: I0226 17:49:57.002861 5033 scope.go:117] "RemoveContainer" containerID="302464f8a39d5faa41bb3d0a1ab48e16a3fceb19ffea05d26c58ee9362debe03" Feb 26 17:49:57 crc kubenswrapper[5033]: I0226 17:49:57.038902 5033 scope.go:117] "RemoveContainer" containerID="f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf" Feb 26 17:49:57 crc kubenswrapper[5033]: E0226 17:49:57.039406 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf\": container with ID starting with f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf not found: ID does not exist" containerID="f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf" Feb 26 17:49:57 crc kubenswrapper[5033]: I0226 17:49:57.039468 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf"} err="failed to get container status \"f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf\": rpc error: code = NotFound desc = could not find container \"f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf\": container with ID starting with f704d13242298f32fa5bdf8883c3301eb9b0f5ec693ec31f5c492a0336ac3fbf not 
found: ID does not exist" Feb 26 17:49:57 crc kubenswrapper[5033]: I0226 17:49:57.039506 5033 scope.go:117] "RemoveContainer" containerID="d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf" Feb 26 17:49:57 crc kubenswrapper[5033]: E0226 17:49:57.040073 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf\": container with ID starting with d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf not found: ID does not exist" containerID="d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf" Feb 26 17:49:57 crc kubenswrapper[5033]: I0226 17:49:57.040117 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf"} err="failed to get container status \"d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf\": rpc error: code = NotFound desc = could not find container \"d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf\": container with ID starting with d9aa35d0c420f075e5cfa5646a751d07877125c0c3c600087ac1c2a85a609fcf not found: ID does not exist" Feb 26 17:49:57 crc kubenswrapper[5033]: I0226 17:49:57.040148 5033 scope.go:117] "RemoveContainer" containerID="302464f8a39d5faa41bb3d0a1ab48e16a3fceb19ffea05d26c58ee9362debe03" Feb 26 17:49:57 crc kubenswrapper[5033]: E0226 17:49:57.040673 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"302464f8a39d5faa41bb3d0a1ab48e16a3fceb19ffea05d26c58ee9362debe03\": container with ID starting with 302464f8a39d5faa41bb3d0a1ab48e16a3fceb19ffea05d26c58ee9362debe03 not found: ID does not exist" containerID="302464f8a39d5faa41bb3d0a1ab48e16a3fceb19ffea05d26c58ee9362debe03" Feb 26 17:49:57 crc kubenswrapper[5033]: I0226 17:49:57.040709 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"302464f8a39d5faa41bb3d0a1ab48e16a3fceb19ffea05d26c58ee9362debe03"} err="failed to get container status \"302464f8a39d5faa41bb3d0a1ab48e16a3fceb19ffea05d26c58ee9362debe03\": rpc error: code = NotFound desc = could not find container \"302464f8a39d5faa41bb3d0a1ab48e16a3fceb19ffea05d26c58ee9362debe03\": container with ID starting with 302464f8a39d5faa41bb3d0a1ab48e16a3fceb19ffea05d26c58ee9362debe03 not found: ID does not exist" Feb 26 17:49:57 crc kubenswrapper[5033]: I0226 17:49:57.649925 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67ee9623-58b8-4d40-9eb5-aaac246efd08" path="/var/lib/kubelet/pods/67ee9623-58b8-4d40-9eb5-aaac246efd08/volumes" Feb 26 17:49:58 crc kubenswrapper[5033]: E0226 17:49:58.925772 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa523cf5_af6f_4c44_8d8e_0016d52e3db7.slice\": RecentStats: unable to find data in memory cache]" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.179456 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535470-zj8fq"] Feb 26 17:50:00 crc kubenswrapper[5033]: E0226 17:50:00.182387 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ee9623-58b8-4d40-9eb5-aaac246efd08" containerName="registry-server" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.182570 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ee9623-58b8-4d40-9eb5-aaac246efd08" containerName="registry-server" Feb 26 17:50:00 crc kubenswrapper[5033]: E0226 17:50:00.182694 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ee9623-58b8-4d40-9eb5-aaac246efd08" containerName="extract-utilities" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.182771 5033 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="67ee9623-58b8-4d40-9eb5-aaac246efd08" containerName="extract-utilities" Feb 26 17:50:00 crc kubenswrapper[5033]: E0226 17:50:00.182863 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ee9623-58b8-4d40-9eb5-aaac246efd08" containerName="extract-content" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.182935 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ee9623-58b8-4d40-9eb5-aaac246efd08" containerName="extract-content" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.183279 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ee9623-58b8-4d40-9eb5-aaac246efd08" containerName="registry-server" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.184252 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535470-zj8fq" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.188025 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.188031 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.188997 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.218697 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535470-zj8fq"] Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.278984 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptr74\" (UniqueName: \"kubernetes.io/projected/1a599e3e-f694-41e4-9e75-889c895d95b1-kube-api-access-ptr74\") pod \"auto-csr-approver-29535470-zj8fq\" (UID: \"1a599e3e-f694-41e4-9e75-889c895d95b1\") " 
pod="openshift-infra/auto-csr-approver-29535470-zj8fq" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.381672 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptr74\" (UniqueName: \"kubernetes.io/projected/1a599e3e-f694-41e4-9e75-889c895d95b1-kube-api-access-ptr74\") pod \"auto-csr-approver-29535470-zj8fq\" (UID: \"1a599e3e-f694-41e4-9e75-889c895d95b1\") " pod="openshift-infra/auto-csr-approver-29535470-zj8fq" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.398582 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptr74\" (UniqueName: \"kubernetes.io/projected/1a599e3e-f694-41e4-9e75-889c895d95b1-kube-api-access-ptr74\") pod \"auto-csr-approver-29535470-zj8fq\" (UID: \"1a599e3e-f694-41e4-9e75-889c895d95b1\") " pod="openshift-infra/auto-csr-approver-29535470-zj8fq" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.505239 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535470-zj8fq" Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.981648 5033 generic.go:334] "Generic (PLEG): container finished" podID="2c298a16-67d9-45a7-bcef-322883b4030d" containerID="249cf77e610ad5984e1ce09d42178a5a2c7b6eea422131f58cdf6c273718715c" exitCode=0 Feb 26 17:50:00 crc kubenswrapper[5033]: I0226 17:50:00.981761 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" event={"ID":"2c298a16-67d9-45a7-bcef-322883b4030d","Type":"ContainerDied","Data":"249cf77e610ad5984e1ce09d42178a5a2c7b6eea422131f58cdf6c273718715c"} Feb 26 17:50:01 crc kubenswrapper[5033]: I0226 17:50:01.000574 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535470-zj8fq"] Feb 26 17:50:01 crc kubenswrapper[5033]: W0226 17:50:01.003395 5033 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a599e3e_f694_41e4_9e75_889c895d95b1.slice/crio-250554f797bd4f591bd40bad4baad596f7c41ed62462da8f8e947e8526c92264 WatchSource:0}: Error finding container 250554f797bd4f591bd40bad4baad596f7c41ed62462da8f8e947e8526c92264: Status 404 returned error can't find the container with id 250554f797bd4f591bd40bad4baad596f7c41ed62462da8f8e947e8526c92264
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:01.999294 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535470-zj8fq" event={"ID":"1a599e3e-f694-41e4-9e75-889c895d95b1","Type":"ContainerStarted","Data":"250554f797bd4f591bd40bad4baad596f7c41ed62462da8f8e947e8526c92264"}
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.466223 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rzn9b"
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.637347 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-inventory\") pod \"2c298a16-67d9-45a7-bcef-322883b4030d\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") "
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.637425 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ceph\") pod \"2c298a16-67d9-45a7-bcef-322883b4030d\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") "
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.637573 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vsb2\" (UniqueName: \"kubernetes.io/projected/2c298a16-67d9-45a7-bcef-322883b4030d-kube-api-access-2vsb2\") pod \"2c298a16-67d9-45a7-bcef-322883b4030d\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") "
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.637643 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ssh-key-openstack-cell1\") pod \"2c298a16-67d9-45a7-bcef-322883b4030d\" (UID: \"2c298a16-67d9-45a7-bcef-322883b4030d\") "
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.646607 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c298a16-67d9-45a7-bcef-322883b4030d-kube-api-access-2vsb2" (OuterVolumeSpecName: "kube-api-access-2vsb2") pod "2c298a16-67d9-45a7-bcef-322883b4030d" (UID: "2c298a16-67d9-45a7-bcef-322883b4030d"). InnerVolumeSpecName "kube-api-access-2vsb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.649950 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ceph" (OuterVolumeSpecName: "ceph") pod "2c298a16-67d9-45a7-bcef-322883b4030d" (UID: "2c298a16-67d9-45a7-bcef-322883b4030d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.696995 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "2c298a16-67d9-45a7-bcef-322883b4030d" (UID: "2c298a16-67d9-45a7-bcef-322883b4030d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.698042 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-inventory" (OuterVolumeSpecName: "inventory") pod "2c298a16-67d9-45a7-bcef-322883b4030d" (UID: "2c298a16-67d9-45a7-bcef-322883b4030d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.740669 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.740733 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ceph\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.740748 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vsb2\" (UniqueName: \"kubernetes.io/projected/2c298a16-67d9-45a7-bcef-322883b4030d-kube-api-access-2vsb2\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:02 crc kubenswrapper[5033]: I0226 17:50:02.740761 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2c298a16-67d9-45a7-bcef-322883b4030d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.017722 5033 generic.go:334] "Generic (PLEG): container finished" podID="1a599e3e-f694-41e4-9e75-889c895d95b1" containerID="91e4c51b2a754d1b5be6dfc3d123990510787195c0eba456ce4eb535d56b9952" exitCode=0
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.017918 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535470-zj8fq" event={"ID":"1a599e3e-f694-41e4-9e75-889c895d95b1","Type":"ContainerDied","Data":"91e4c51b2a754d1b5be6dfc3d123990510787195c0eba456ce4eb535d56b9952"}
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.027527 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rzn9b" event={"ID":"2c298a16-67d9-45a7-bcef-322883b4030d","Type":"ContainerDied","Data":"be4905709bb67f2ec920ab81768764c7e38bff112ef6249face909ec212c72bd"}
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.027648 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be4905709bb67f2ec920ab81768764c7e38bff112ef6249face909ec212c72bd"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.027749 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rzn9b"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.112056 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-prsc9"]
Feb 26 17:50:03 crc kubenswrapper[5033]: E0226 17:50:03.112705 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c298a16-67d9-45a7-bcef-322883b4030d" containerName="configure-os-openstack-openstack-cell1"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.112735 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c298a16-67d9-45a7-bcef-322883b4030d" containerName="configure-os-openstack-openstack-cell1"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.113050 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c298a16-67d9-45a7-bcef-322883b4030d" containerName="configure-os-openstack-openstack-cell1"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.114090 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.116994 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.118367 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.118493 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.118768 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.129073 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-prsc9"]
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.252636 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.252782 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ceph\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.253204 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd4mk\" (UniqueName: \"kubernetes.io/projected/50c67862-887c-4627-b44b-e6e4c7b50401-kube-api-access-qd4mk\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.253601 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-inventory-0\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.357790 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.357910 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ceph\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.358156 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd4mk\" (UniqueName: \"kubernetes.io/projected/50c67862-887c-4627-b44b-e6e4c7b50401-kube-api-access-qd4mk\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.358242 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-inventory-0\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.366016 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-inventory-0\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.366523 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.367416 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ceph\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.389891 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd4mk\" (UniqueName: \"kubernetes.io/projected/50c67862-887c-4627-b44b-e6e4c7b50401-kube-api-access-qd4mk\") pod \"ssh-known-hosts-openstack-prsc9\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") " pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:03 crc kubenswrapper[5033]: I0226 17:50:03.441032 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:04 crc kubenswrapper[5033]: I0226 17:50:04.067614 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-prsc9"]
Feb 26 17:50:04 crc kubenswrapper[5033]: I0226 17:50:04.393649 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535470-zj8fq"
Feb 26 17:50:04 crc kubenswrapper[5033]: I0226 17:50:04.582747 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptr74\" (UniqueName: \"kubernetes.io/projected/1a599e3e-f694-41e4-9e75-889c895d95b1-kube-api-access-ptr74\") pod \"1a599e3e-f694-41e4-9e75-889c895d95b1\" (UID: \"1a599e3e-f694-41e4-9e75-889c895d95b1\") "
Feb 26 17:50:04 crc kubenswrapper[5033]: I0226 17:50:04.589009 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a599e3e-f694-41e4-9e75-889c895d95b1-kube-api-access-ptr74" (OuterVolumeSpecName: "kube-api-access-ptr74") pod "1a599e3e-f694-41e4-9e75-889c895d95b1" (UID: "1a599e3e-f694-41e4-9e75-889c895d95b1"). InnerVolumeSpecName "kube-api-access-ptr74". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:50:04 crc kubenswrapper[5033]: I0226 17:50:04.685758 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptr74\" (UniqueName: \"kubernetes.io/projected/1a599e3e-f694-41e4-9e75-889c895d95b1-kube-api-access-ptr74\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:05 crc kubenswrapper[5033]: I0226 17:50:05.050763 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535470-zj8fq" event={"ID":"1a599e3e-f694-41e4-9e75-889c895d95b1","Type":"ContainerDied","Data":"250554f797bd4f591bd40bad4baad596f7c41ed62462da8f8e947e8526c92264"}
Feb 26 17:50:05 crc kubenswrapper[5033]: I0226 17:50:05.050784 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535470-zj8fq"
Feb 26 17:50:05 crc kubenswrapper[5033]: I0226 17:50:05.050808 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="250554f797bd4f591bd40bad4baad596f7c41ed62462da8f8e947e8526c92264"
Feb 26 17:50:05 crc kubenswrapper[5033]: I0226 17:50:05.055638 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-prsc9" event={"ID":"50c67862-887c-4627-b44b-e6e4c7b50401","Type":"ContainerStarted","Data":"771e3221364b6ff9a9a60e5f4b45b001fbe1c0aeea8f70d4c3ce8a4d4044fe23"}
Feb 26 17:50:05 crc kubenswrapper[5033]: I0226 17:50:05.055742 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-prsc9" event={"ID":"50c67862-887c-4627-b44b-e6e4c7b50401","Type":"ContainerStarted","Data":"7e2550a391d1d70de93d19dd5acfd34d41f4ad628192b27c4b13859d74b0b081"}
Feb 26 17:50:05 crc kubenswrapper[5033]: I0226 17:50:05.086367 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-prsc9" podStartSLOduration=1.687878805 podStartE2EDuration="2.08635032s" podCreationTimestamp="2026-02-26 17:50:03 +0000 UTC" firstStartedPulling="2026-02-26 17:50:04.074425425 +0000 UTC m=+7731.287024075" lastFinishedPulling="2026-02-26 17:50:04.47289694 +0000 UTC m=+7731.685495590" observedRunningTime="2026-02-26 17:50:05.079951045 +0000 UTC m=+7732.292549705" watchObservedRunningTime="2026-02-26 17:50:05.08635032 +0000 UTC m=+7732.298948970"
Feb 26 17:50:05 crc kubenswrapper[5033]: I0226 17:50:05.498721 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535464-vdf2j"]
Feb 26 17:50:05 crc kubenswrapper[5033]: I0226 17:50:05.513450 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535464-vdf2j"]
Feb 26 17:50:05 crc kubenswrapper[5033]: I0226 17:50:05.655606 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a993462-42a4-46ed-98e7-738df18efe2a" path="/var/lib/kubelet/pods/2a993462-42a4-46ed-98e7-738df18efe2a/volumes"
Feb 26 17:50:09 crc kubenswrapper[5033]: E0226 17:50:09.264966 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa523cf5_af6f_4c44_8d8e_0016d52e3db7.slice\": RecentStats: unable to find data in memory cache]"
Feb 26 17:50:14 crc kubenswrapper[5033]: I0226 17:50:14.149559 5033 generic.go:334] "Generic (PLEG): container finished" podID="50c67862-887c-4627-b44b-e6e4c7b50401" containerID="771e3221364b6ff9a9a60e5f4b45b001fbe1c0aeea8f70d4c3ce8a4d4044fe23" exitCode=0
Feb 26 17:50:14 crc kubenswrapper[5033]: I0226 17:50:14.149635 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-prsc9" event={"ID":"50c67862-887c-4627-b44b-e6e4c7b50401","Type":"ContainerDied","Data":"771e3221364b6ff9a9a60e5f4b45b001fbe1c0aeea8f70d4c3ce8a4d4044fe23"}
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.670997 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.857369 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ceph\") pod \"50c67862-887c-4627-b44b-e6e4c7b50401\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") "
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.857490 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd4mk\" (UniqueName: \"kubernetes.io/projected/50c67862-887c-4627-b44b-e6e4c7b50401-kube-api-access-qd4mk\") pod \"50c67862-887c-4627-b44b-e6e4c7b50401\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") "
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.857669 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ssh-key-openstack-cell1\") pod \"50c67862-887c-4627-b44b-e6e4c7b50401\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") "
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.857787 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-inventory-0\") pod \"50c67862-887c-4627-b44b-e6e4c7b50401\" (UID: \"50c67862-887c-4627-b44b-e6e4c7b50401\") "
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.865576 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ceph" (OuterVolumeSpecName: "ceph") pod "50c67862-887c-4627-b44b-e6e4c7b50401" (UID: "50c67862-887c-4627-b44b-e6e4c7b50401"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.865662 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50c67862-887c-4627-b44b-e6e4c7b50401-kube-api-access-qd4mk" (OuterVolumeSpecName: "kube-api-access-qd4mk") pod "50c67862-887c-4627-b44b-e6e4c7b50401" (UID: "50c67862-887c-4627-b44b-e6e4c7b50401"). InnerVolumeSpecName "kube-api-access-qd4mk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.884954 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "50c67862-887c-4627-b44b-e6e4c7b50401" (UID: "50c67862-887c-4627-b44b-e6e4c7b50401"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.886870 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "50c67862-887c-4627-b44b-e6e4c7b50401" (UID: "50c67862-887c-4627-b44b-e6e4c7b50401"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.960023 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.960058 5033 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-inventory-0\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.960068 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50c67862-887c-4627-b44b-e6e4c7b50401-ceph\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:15 crc kubenswrapper[5033]: I0226 17:50:15.960078 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd4mk\" (UniqueName: \"kubernetes.io/projected/50c67862-887c-4627-b44b-e6e4c7b50401-kube-api-access-qd4mk\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.175145 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-prsc9" event={"ID":"50c67862-887c-4627-b44b-e6e4c7b50401","Type":"ContainerDied","Data":"7e2550a391d1d70de93d19dd5acfd34d41f4ad628192b27c4b13859d74b0b081"}
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.175530 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e2550a391d1d70de93d19dd5acfd34d41f4ad628192b27c4b13859d74b0b081"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.175598 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-prsc9"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.265027 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-wf2zl"]
Feb 26 17:50:16 crc kubenswrapper[5033]: E0226 17:50:16.265625 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c67862-887c-4627-b44b-e6e4c7b50401" containerName="ssh-known-hosts-openstack"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.265646 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c67862-887c-4627-b44b-e6e4c7b50401" containerName="ssh-known-hosts-openstack"
Feb 26 17:50:16 crc kubenswrapper[5033]: E0226 17:50:16.265670 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a599e3e-f694-41e4-9e75-889c895d95b1" containerName="oc"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.265677 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a599e3e-f694-41e4-9e75-889c895d95b1" containerName="oc"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.265860 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a599e3e-f694-41e4-9e75-889c895d95b1" containerName="oc"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.265888 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="50c67862-887c-4627-b44b-e6e4c7b50401" containerName="ssh-known-hosts-openstack"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.266759 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.270039 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.270993 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.271594 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.276287 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.286964 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-wf2zl"]
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.368767 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.368811 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkcqk\" (UniqueName: \"kubernetes.io/projected/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-kube-api-access-rkcqk\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.368836 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-inventory\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.369200 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ceph\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.470867 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ceph\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.471032 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.471059 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkcqk\" (UniqueName: \"kubernetes.io/projected/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-kube-api-access-rkcqk\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.471089 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-inventory\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.476426 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-inventory\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.476492 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.477663 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ceph\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.499552 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkcqk\" (UniqueName: \"kubernetes.io/projected/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-kube-api-access-rkcqk\") pod \"run-os-openstack-openstack-cell1-wf2zl\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") " pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:16 crc kubenswrapper[5033]: I0226 17:50:16.595738 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:17 crc kubenswrapper[5033]: I0226 17:50:17.088186 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-wf2zl"]
Feb 26 17:50:17 crc kubenswrapper[5033]: I0226 17:50:17.186931 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wf2zl" event={"ID":"40e5b6c1-de05-4f09-a9cb-6bda8c86a122","Type":"ContainerStarted","Data":"c29cb0e25165e19cdad8d11b79f44d7fad0e0b6b9d9ee8424da2d1b7e72828fe"}
Feb 26 17:50:18 crc kubenswrapper[5033]: I0226 17:50:18.203895 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wf2zl" event={"ID":"40e5b6c1-de05-4f09-a9cb-6bda8c86a122","Type":"ContainerStarted","Data":"3fa9ed0de93aa4cd915b9085c1954c32633cd9cc2a2f18029131e529957df54f"}
Feb 26 17:50:18 crc kubenswrapper[5033]: I0226 17:50:18.231895 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-wf2zl" podStartSLOduration=1.784675646 podStartE2EDuration="2.231866998s" podCreationTimestamp="2026-02-26 17:50:16 +0000 UTC" firstStartedPulling="2026-02-26 17:50:17.096668121 +0000 UTC m=+7744.309266771" lastFinishedPulling="2026-02-26 17:50:17.543859463 +0000 UTC m=+7744.756458123" observedRunningTime="2026-02-26 17:50:18.224362626 +0000 UTC m=+7745.436961376" watchObservedRunningTime="2026-02-26 17:50:18.231866998 +0000 UTC m=+7745.444465688"
Feb 26 17:50:20 crc kubenswrapper[5033]: I0226 17:50:20.189481 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 17:50:20 crc kubenswrapper[5033]: I0226 17:50:20.190512 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 17:50:26 crc kubenswrapper[5033]: I0226 17:50:26.290434 5033 generic.go:334] "Generic (PLEG): container finished" podID="40e5b6c1-de05-4f09-a9cb-6bda8c86a122" containerID="3fa9ed0de93aa4cd915b9085c1954c32633cd9cc2a2f18029131e529957df54f" exitCode=0
Feb 26 17:50:26 crc kubenswrapper[5033]: I0226 17:50:26.290551 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wf2zl" event={"ID":"40e5b6c1-de05-4f09-a9cb-6bda8c86a122","Type":"ContainerDied","Data":"3fa9ed0de93aa4cd915b9085c1954c32633cd9cc2a2f18029131e529957df54f"}
Feb 26 17:50:27 crc kubenswrapper[5033]: I0226 17:50:27.892763 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wf2zl"
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.060635 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-inventory\") pod \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") "
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.060784 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ssh-key-openstack-cell1\") pod \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") "
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.061015 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkcqk\" (UniqueName: \"kubernetes.io/projected/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-kube-api-access-rkcqk\") pod \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") "
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.061134 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ceph\") pod \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\" (UID: \"40e5b6c1-de05-4f09-a9cb-6bda8c86a122\") "
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.067235 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ceph" (OuterVolumeSpecName: "ceph") pod "40e5b6c1-de05-4f09-a9cb-6bda8c86a122" (UID: "40e5b6c1-de05-4f09-a9cb-6bda8c86a122"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.071131 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-kube-api-access-rkcqk" (OuterVolumeSpecName: "kube-api-access-rkcqk") pod "40e5b6c1-de05-4f09-a9cb-6bda8c86a122" (UID: "40e5b6c1-de05-4f09-a9cb-6bda8c86a122"). InnerVolumeSpecName "kube-api-access-rkcqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.093841 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-inventory" (OuterVolumeSpecName: "inventory") pod "40e5b6c1-de05-4f09-a9cb-6bda8c86a122" (UID: "40e5b6c1-de05-4f09-a9cb-6bda8c86a122"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.102332 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "40e5b6c1-de05-4f09-a9cb-6bda8c86a122" (UID: "40e5b6c1-de05-4f09-a9cb-6bda8c86a122"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.163833 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkcqk\" (UniqueName: \"kubernetes.io/projected/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-kube-api-access-rkcqk\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.164315 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ceph\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.164325 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.164334 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/40e5b6c1-de05-4f09-a9cb-6bda8c86a122-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.315551 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wf2zl" event={"ID":"40e5b6c1-de05-4f09-a9cb-6bda8c86a122","Type":"ContainerDied","Data":"c29cb0e25165e19cdad8d11b79f44d7fad0e0b6b9d9ee8424da2d1b7e72828fe"}
Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.315600 5033 pod_container_deletor.go:80]
"Container not found in pod's containers" containerID="c29cb0e25165e19cdad8d11b79f44d7fad0e0b6b9d9ee8424da2d1b7e72828fe" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.315571 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wf2zl" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.389148 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-58gxt"] Feb 26 17:50:28 crc kubenswrapper[5033]: E0226 17:50:28.389651 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e5b6c1-de05-4f09-a9cb-6bda8c86a122" containerName="run-os-openstack-openstack-cell1" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.389672 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e5b6c1-de05-4f09-a9cb-6bda8c86a122" containerName="run-os-openstack-openstack-cell1" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.389902 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="40e5b6c1-de05-4f09-a9cb-6bda8c86a122" containerName="run-os-openstack-openstack-cell1" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.390730 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.392640 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.394323 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.394551 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.394810 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.404690 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-58gxt"] Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.573027 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.573530 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ceph\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.573560 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-khrrj\" (UniqueName: \"kubernetes.io/projected/56c12e84-7770-493b-a30e-3fcaace66b8c-kube-api-access-khrrj\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.573599 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-inventory\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.675403 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.675531 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ceph\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.675573 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khrrj\" (UniqueName: \"kubernetes.io/projected/56c12e84-7770-493b-a30e-3fcaace66b8c-kube-api-access-khrrj\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: 
I0226 17:50:28.675623 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-inventory\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.683414 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ceph\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.683893 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.685971 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-inventory\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 crc kubenswrapper[5033]: I0226 17:50:28.697920 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khrrj\" (UniqueName: \"kubernetes.io/projected/56c12e84-7770-493b-a30e-3fcaace66b8c-kube-api-access-khrrj\") pod \"reboot-os-openstack-openstack-cell1-58gxt\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:28 
crc kubenswrapper[5033]: I0226 17:50:28.719130 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:29 crc kubenswrapper[5033]: I0226 17:50:29.341697 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-58gxt"] Feb 26 17:50:30 crc kubenswrapper[5033]: I0226 17:50:30.344758 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" event={"ID":"56c12e84-7770-493b-a30e-3fcaace66b8c","Type":"ContainerStarted","Data":"23ef924894a568fdaab3c1f41422fc2bc11b877c50eaf2bd16fb2bc8206ab134"} Feb 26 17:50:30 crc kubenswrapper[5033]: I0226 17:50:30.345054 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" event={"ID":"56c12e84-7770-493b-a30e-3fcaace66b8c","Type":"ContainerStarted","Data":"ea216a133894abb0afed1069a9079d5968e2cb0303ee54c06555b369776b8181"} Feb 26 17:50:30 crc kubenswrapper[5033]: I0226 17:50:30.371622 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" podStartSLOduration=1.954131754 podStartE2EDuration="2.371594461s" podCreationTimestamp="2026-02-26 17:50:28 +0000 UTC" firstStartedPulling="2026-02-26 17:50:29.343515472 +0000 UTC m=+7756.556114122" lastFinishedPulling="2026-02-26 17:50:29.760978179 +0000 UTC m=+7756.973576829" observedRunningTime="2026-02-26 17:50:30.367679586 +0000 UTC m=+7757.580278246" watchObservedRunningTime="2026-02-26 17:50:30.371594461 +0000 UTC m=+7757.584193141" Feb 26 17:50:42 crc kubenswrapper[5033]: I0226 17:50:42.492854 5033 scope.go:117] "RemoveContainer" containerID="f76f8fac10a0e1be3bde42c40c97ef72ca96b140176bf28063a314a968a2c3df" Feb 26 17:50:46 crc kubenswrapper[5033]: I0226 17:50:46.523959 5033 generic.go:334] "Generic (PLEG): container finished" podID="56c12e84-7770-493b-a30e-3fcaace66b8c" 
containerID="23ef924894a568fdaab3c1f41422fc2bc11b877c50eaf2bd16fb2bc8206ab134" exitCode=0 Feb 26 17:50:46 crc kubenswrapper[5033]: I0226 17:50:46.524073 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" event={"ID":"56c12e84-7770-493b-a30e-3fcaace66b8c","Type":"ContainerDied","Data":"23ef924894a568fdaab3c1f41422fc2bc11b877c50eaf2bd16fb2bc8206ab134"} Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.099086 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.234344 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-inventory\") pod \"56c12e84-7770-493b-a30e-3fcaace66b8c\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.234431 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khrrj\" (UniqueName: \"kubernetes.io/projected/56c12e84-7770-493b-a30e-3fcaace66b8c-kube-api-access-khrrj\") pod \"56c12e84-7770-493b-a30e-3fcaace66b8c\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.234495 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ceph\") pod \"56c12e84-7770-493b-a30e-3fcaace66b8c\" (UID: \"56c12e84-7770-493b-a30e-3fcaace66b8c\") " Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.234739 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ssh-key-openstack-cell1\") pod \"56c12e84-7770-493b-a30e-3fcaace66b8c\" (UID: 
\"56c12e84-7770-493b-a30e-3fcaace66b8c\") " Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.242625 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ceph" (OuterVolumeSpecName: "ceph") pod "56c12e84-7770-493b-a30e-3fcaace66b8c" (UID: "56c12e84-7770-493b-a30e-3fcaace66b8c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.242652 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c12e84-7770-493b-a30e-3fcaace66b8c-kube-api-access-khrrj" (OuterVolumeSpecName: "kube-api-access-khrrj") pod "56c12e84-7770-493b-a30e-3fcaace66b8c" (UID: "56c12e84-7770-493b-a30e-3fcaace66b8c"). InnerVolumeSpecName "kube-api-access-khrrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.271396 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "56c12e84-7770-493b-a30e-3fcaace66b8c" (UID: "56c12e84-7770-493b-a30e-3fcaace66b8c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.292979 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-inventory" (OuterVolumeSpecName: "inventory") pod "56c12e84-7770-493b-a30e-3fcaace66b8c" (UID: "56c12e84-7770-493b-a30e-3fcaace66b8c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.337406 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.337444 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.337453 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khrrj\" (UniqueName: \"kubernetes.io/projected/56c12e84-7770-493b-a30e-3fcaace66b8c-kube-api-access-khrrj\") on node \"crc\" DevicePath \"\"" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.337462 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56c12e84-7770-493b-a30e-3fcaace66b8c-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.576177 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" event={"ID":"56c12e84-7770-493b-a30e-3fcaace66b8c","Type":"ContainerDied","Data":"ea216a133894abb0afed1069a9079d5968e2cb0303ee54c06555b369776b8181"} Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.576230 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-58gxt" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.576238 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea216a133894abb0afed1069a9079d5968e2cb0303ee54c06555b369776b8181" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.667361 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-t9vd8"] Feb 26 17:50:48 crc kubenswrapper[5033]: E0226 17:50:48.668141 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c12e84-7770-493b-a30e-3fcaace66b8c" containerName="reboot-os-openstack-openstack-cell1" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.668172 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c12e84-7770-493b-a30e-3fcaace66b8c" containerName="reboot-os-openstack-openstack-cell1" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.668607 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c12e84-7770-493b-a30e-3fcaace66b8c" containerName="reboot-os-openstack-openstack-cell1" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.670004 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.672320 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.673067 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.673426 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.673658 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.692038 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-t9vd8"] Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.746839 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.746888 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq8zx\" (UniqueName: \"kubernetes.io/projected/9e984f92-4207-4d54-9e99-8d18b18db198-kube-api-access-tq8zx\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.746912 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.746941 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-inventory\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.747132 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.747306 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.747447 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.747572 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.747783 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.747835 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.747863 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ceph\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: 
\"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.747914 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850205 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850325 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq8zx\" (UniqueName: \"kubernetes.io/projected/9e984f92-4207-4d54-9e99-8d18b18db198-kube-api-access-tq8zx\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850371 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850458 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-inventory\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850505 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850563 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850635 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850706 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-dhcp-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850832 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850881 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850933 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ceph\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.850984 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.855236 5033 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.855520 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.855785 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-inventory\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.856189 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.857175 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " 
pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.857178 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.857530 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.858068 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ceph\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.864825 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.864955 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.865136 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:48 crc kubenswrapper[5033]: I0226 17:50:48.868205 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq8zx\" (UniqueName: \"kubernetes.io/projected/9e984f92-4207-4d54-9e99-8d18b18db198-kube-api-access-tq8zx\") pod \"install-certs-openstack-openstack-cell1-t9vd8\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:49 crc kubenswrapper[5033]: I0226 17:50:49.002158 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:50:49 crc kubenswrapper[5033]: I0226 17:50:49.645376 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-t9vd8"] Feb 26 17:50:50 crc kubenswrapper[5033]: I0226 17:50:50.188702 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:50:50 crc kubenswrapper[5033]: I0226 17:50:50.189388 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:50:50 crc kubenswrapper[5033]: I0226 17:50:50.608333 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" event={"ID":"9e984f92-4207-4d54-9e99-8d18b18db198","Type":"ContainerStarted","Data":"17c4c8837ec4959c82d23abab9cad666b8c0c985c729722431b60e6e4354176a"} Feb 26 17:50:51 crc kubenswrapper[5033]: I0226 17:50:51.622568 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" event={"ID":"9e984f92-4207-4d54-9e99-8d18b18db198","Type":"ContainerStarted","Data":"221039d1bccede783a8d8910edd1eabc08728b07756b33ef011d48af5b3e85a1"} Feb 26 17:50:51 crc kubenswrapper[5033]: I0226 17:50:51.653882 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" podStartSLOduration=2.944528405 podStartE2EDuration="3.653862171s" podCreationTimestamp="2026-02-26 17:50:48 +0000 UTC" 
firstStartedPulling="2026-02-26 17:50:49.628530804 +0000 UTC m=+7776.841129474" lastFinishedPulling="2026-02-26 17:50:50.33786459 +0000 UTC m=+7777.550463240" observedRunningTime="2026-02-26 17:50:51.650058378 +0000 UTC m=+7778.862657028" watchObservedRunningTime="2026-02-26 17:50:51.653862171 +0000 UTC m=+7778.866460821" Feb 26 17:51:09 crc kubenswrapper[5033]: I0226 17:51:09.824064 5033 generic.go:334] "Generic (PLEG): container finished" podID="9e984f92-4207-4d54-9e99-8d18b18db198" containerID="221039d1bccede783a8d8910edd1eabc08728b07756b33ef011d48af5b3e85a1" exitCode=0 Feb 26 17:51:09 crc kubenswrapper[5033]: I0226 17:51:09.824144 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" event={"ID":"9e984f92-4207-4d54-9e99-8d18b18db198","Type":"ContainerDied","Data":"221039d1bccede783a8d8910edd1eabc08728b07756b33ef011d48af5b3e85a1"} Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.370169 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.409489 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq8zx\" (UniqueName: \"kubernetes.io/projected/9e984f92-4207-4d54-9e99-8d18b18db198-kube-api-access-tq8zx\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.409572 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-bootstrap-combined-ca-bundle\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.409655 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-telemetry-combined-ca-bundle\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.409728 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ssh-key-openstack-cell1\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.409770 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ceph\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.409820 5033 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-nova-combined-ca-bundle\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.409836 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-dhcp-combined-ca-bundle\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.409881 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-inventory\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.409913 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ovn-combined-ca-bundle\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.409943 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-sriov-combined-ca-bundle\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.409965 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-metadata-combined-ca-bundle\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.410003 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-libvirt-combined-ca-bundle\") pod \"9e984f92-4207-4d54-9e99-8d18b18db198\" (UID: \"9e984f92-4207-4d54-9e99-8d18b18db198\") " Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.415738 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ceph" (OuterVolumeSpecName: "ceph") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.416049 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e984f92-4207-4d54-9e99-8d18b18db198-kube-api-access-tq8zx" (OuterVolumeSpecName: "kube-api-access-tq8zx") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "kube-api-access-tq8zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.416187 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.416344 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.417145 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.417491 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.419131 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.421543 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.421571 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.423868 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.448425 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.449995 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-inventory" (OuterVolumeSpecName: "inventory") pod "9e984f92-4207-4d54-9e99-8d18b18db198" (UID: "9e984f92-4207-4d54-9e99-8d18b18db198"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513092 5033 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513132 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513142 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513151 5033 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513161 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513198 5033 reconciler_common.go:293] "Volume detached for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513210 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513220 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513229 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513240 5033 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513251 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq8zx\" (UniqueName: \"kubernetes.io/projected/9e984f92-4207-4d54-9e99-8d18b18db198-kube-api-access-tq8zx\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.513275 5033 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e984f92-4207-4d54-9e99-8d18b18db198-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.847957 5033 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" event={"ID":"9e984f92-4207-4d54-9e99-8d18b18db198","Type":"ContainerDied","Data":"17c4c8837ec4959c82d23abab9cad666b8c0c985c729722431b60e6e4354176a"} Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.848398 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17c4c8837ec4959c82d23abab9cad666b8c0c985c729722431b60e6e4354176a" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.848045 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-t9vd8" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.966768 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-fclgb"] Feb 26 17:51:11 crc kubenswrapper[5033]: E0226 17:51:11.967191 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e984f92-4207-4d54-9e99-8d18b18db198" containerName="install-certs-openstack-openstack-cell1" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.967203 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e984f92-4207-4d54-9e99-8d18b18db198" containerName="install-certs-openstack-openstack-cell1" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.967582 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e984f92-4207-4d54-9e99-8d18b18db198" containerName="install-certs-openstack-openstack-cell1" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.968311 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.972186 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.972747 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.973098 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.973132 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 17:51:11 crc kubenswrapper[5033]: I0226 17:51:11.979590 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-fclgb"] Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.024035 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.024282 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-inventory\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.024326 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ceph\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.024405 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxrcd\" (UniqueName: \"kubernetes.io/projected/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-kube-api-access-pxrcd\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.127024 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-inventory\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.127098 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ceph\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.127192 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxrcd\" (UniqueName: \"kubernetes.io/projected/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-kube-api-access-pxrcd\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 
17:51:12.127384 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.133919 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ceph\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.133973 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-inventory\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.141642 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.155241 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxrcd\" (UniqueName: \"kubernetes.io/projected/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-kube-api-access-pxrcd\") pod \"ceph-client-openstack-openstack-cell1-fclgb\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.298196 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" Feb 26 17:51:12 crc kubenswrapper[5033]: I0226 17:51:12.917307 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-fclgb"] Feb 26 17:51:12 crc kubenswrapper[5033]: W0226 17:51:12.922936 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73f6a569_5a65_4fbf_a77f_2a6ce838ac41.slice/crio-1e071c0191aca136f80e1900f7688c0bce6ea7a082bc7f11b9942edb6966b0cc WatchSource:0}: Error finding container 1e071c0191aca136f80e1900f7688c0bce6ea7a082bc7f11b9942edb6966b0cc: Status 404 returned error can't find the container with id 1e071c0191aca136f80e1900f7688c0bce6ea7a082bc7f11b9942edb6966b0cc Feb 26 17:51:13 crc kubenswrapper[5033]: I0226 17:51:13.870373 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" event={"ID":"73f6a569-5a65-4fbf-a77f-2a6ce838ac41","Type":"ContainerStarted","Data":"bec6f249a9db8c55379e20d8ec4f4eb958ffe61af447917b6bda9c21fda6ba1d"} Feb 26 17:51:13 crc kubenswrapper[5033]: I0226 17:51:13.870731 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" event={"ID":"73f6a569-5a65-4fbf-a77f-2a6ce838ac41","Type":"ContainerStarted","Data":"1e071c0191aca136f80e1900f7688c0bce6ea7a082bc7f11b9942edb6966b0cc"} Feb 26 17:51:13 crc kubenswrapper[5033]: I0226 17:51:13.918518 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" podStartSLOduration=2.503816565 podStartE2EDuration="2.918490234s" podCreationTimestamp="2026-02-26 17:51:11 +0000 UTC" firstStartedPulling="2026-02-26 
17:51:12.928443052 +0000 UTC m=+7800.141041742" lastFinishedPulling="2026-02-26 17:51:13.343116751 +0000 UTC m=+7800.555715411" observedRunningTime="2026-02-26 17:51:13.901866879 +0000 UTC m=+7801.114465619" watchObservedRunningTime="2026-02-26 17:51:13.918490234 +0000 UTC m=+7801.131088924"
Feb 26 17:51:18 crc kubenswrapper[5033]: I0226 17:51:18.926923 5033 generic.go:334] "Generic (PLEG): container finished" podID="73f6a569-5a65-4fbf-a77f-2a6ce838ac41" containerID="bec6f249a9db8c55379e20d8ec4f4eb958ffe61af447917b6bda9c21fda6ba1d" exitCode=0
Feb 26 17:51:18 crc kubenswrapper[5033]: I0226 17:51:18.927030 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" event={"ID":"73f6a569-5a65-4fbf-a77f-2a6ce838ac41","Type":"ContainerDied","Data":"bec6f249a9db8c55379e20d8ec4f4eb958ffe61af447917b6bda9c21fda6ba1d"}
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.189022 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.189303 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.189406 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5"
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.190170 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.190222 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" gracePeriod=600
Feb 26 17:51:20 crc kubenswrapper[5033]: E0226 17:51:20.308610 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.403467 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-fclgb"
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.510606 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-inventory\") pod \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") "
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.510912 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxrcd\" (UniqueName: \"kubernetes.io/projected/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-kube-api-access-pxrcd\") pod \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") "
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.511064 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ceph\") pod \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") "
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.511299 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ssh-key-openstack-cell1\") pod \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\" (UID: \"73f6a569-5a65-4fbf-a77f-2a6ce838ac41\") "
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.517220 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-kube-api-access-pxrcd" (OuterVolumeSpecName: "kube-api-access-pxrcd") pod "73f6a569-5a65-4fbf-a77f-2a6ce838ac41" (UID: "73f6a569-5a65-4fbf-a77f-2a6ce838ac41"). InnerVolumeSpecName "kube-api-access-pxrcd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.527889 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ceph" (OuterVolumeSpecName: "ceph") pod "73f6a569-5a65-4fbf-a77f-2a6ce838ac41" (UID: "73f6a569-5a65-4fbf-a77f-2a6ce838ac41"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.553877 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "73f6a569-5a65-4fbf-a77f-2a6ce838ac41" (UID: "73f6a569-5a65-4fbf-a77f-2a6ce838ac41"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.568551 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-inventory" (OuterVolumeSpecName: "inventory") pod "73f6a569-5a65-4fbf-a77f-2a6ce838ac41" (UID: "73f6a569-5a65-4fbf-a77f-2a6ce838ac41"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.613761 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxrcd\" (UniqueName: \"kubernetes.io/projected/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-kube-api-access-pxrcd\") on node \"crc\" DevicePath \"\""
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.613794 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ceph\") on node \"crc\" DevicePath \"\""
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.613806 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.613816 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73f6a569-5a65-4fbf-a77f-2a6ce838ac41-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.955993 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-fclgb" event={"ID":"73f6a569-5a65-4fbf-a77f-2a6ce838ac41","Type":"ContainerDied","Data":"1e071c0191aca136f80e1900f7688c0bce6ea7a082bc7f11b9942edb6966b0cc"}
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.956086 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e071c0191aca136f80e1900f7688c0bce6ea7a082bc7f11b9942edb6966b0cc"
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.956019 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-fclgb"
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.960120 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" exitCode=0
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.960183 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"}
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.960245 5033 scope.go:117] "RemoveContainer" containerID="81832189e9ae9c1cde20ad800ba615a86bf39dd0d605e4f89a7bd4c6db781306"
Feb 26 17:51:20 crc kubenswrapper[5033]: I0226 17:51:20.961100 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:51:20 crc kubenswrapper[5033]: E0226 17:51:20.961712 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.072518 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-js4r4"]
Feb 26 17:51:21 crc kubenswrapper[5033]: E0226 17:51:21.073113 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f6a569-5a65-4fbf-a77f-2a6ce838ac41" containerName="ceph-client-openstack-openstack-cell1"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.073139 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f6a569-5a65-4fbf-a77f-2a6ce838ac41" containerName="ceph-client-openstack-openstack-cell1"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.073464 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f6a569-5a65-4fbf-a77f-2a6ce838ac41" containerName="ceph-client-openstack-openstack-cell1"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.074483 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.077738 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.077843 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.081670 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.082104 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.082270 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.088471 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-js4r4"]
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.127988 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5cbz\" (UniqueName: \"kubernetes.io/projected/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-kube-api-access-f5cbz\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.128167 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-inventory\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.128204 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.128397 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.128512 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.128542 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ceph\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.230503 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-inventory\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.230587 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.230703 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.230768 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.230964 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ceph\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.231425 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5cbz\" (UniqueName: \"kubernetes.io/projected/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-kube-api-access-f5cbz\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.231671 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.233900 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.234427 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.234736 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ceph\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.242201 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-inventory\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.247471 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5cbz\" (UniqueName: \"kubernetes.io/projected/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-kube-api-access-f5cbz\") pod \"ovn-openstack-openstack-cell1-js4r4\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.402689 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:51:21 crc kubenswrapper[5033]: I0226 17:51:21.996328 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-js4r4"]
Feb 26 17:51:22 crc kubenswrapper[5033]: I0226 17:51:22.992245 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-js4r4" event={"ID":"e2fc5aa3-dad1-4f82-92ad-daf02318ab73","Type":"ContainerStarted","Data":"d5be5f922a94e15068361514ad6e5aa91508e96f8e77a295744112b18e8851a5"}
Feb 26 17:51:25 crc kubenswrapper[5033]: I0226 17:51:25.020997 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-js4r4" event={"ID":"e2fc5aa3-dad1-4f82-92ad-daf02318ab73","Type":"ContainerStarted","Data":"1fdfe5d92f26dbfec467495da10133464baa1b6b071c24041d0edd56367bf5bd"}
Feb 26 17:51:25 crc kubenswrapper[5033]: I0226 17:51:25.060497 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-js4r4" podStartSLOduration=1.765523594 podStartE2EDuration="4.060464347s" podCreationTimestamp="2026-02-26 17:51:21 +0000 UTC" firstStartedPulling="2026-02-26 17:51:21.979617003 +0000 UTC m=+7809.192215673" lastFinishedPulling="2026-02-26 17:51:24.274557766 +0000 UTC m=+7811.487156426" observedRunningTime="2026-02-26 17:51:25.051752885 +0000 UTC m=+7812.264351605" watchObservedRunningTime="2026-02-26 17:51:25.060464347 +0000 UTC m=+7812.273063027"
Feb 26 17:51:31 crc kubenswrapper[5033]: I0226 17:51:31.627940 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:51:31 crc kubenswrapper[5033]: E0226 17:51:31.628640 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:51:43 crc kubenswrapper[5033]: I0226 17:51:43.628657 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:51:43 crc kubenswrapper[5033]: E0226 17:51:43.629710 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:51:56 crc kubenswrapper[5033]: I0226 17:51:56.629382 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:51:56 crc kubenswrapper[5033]: E0226 17:51:56.630355 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:52:00 crc kubenswrapper[5033]: I0226 17:52:00.151102 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535472-h5jfw"]
Feb 26 17:52:00 crc kubenswrapper[5033]: I0226 17:52:00.152964 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535472-h5jfw"
Feb 26 17:52:00 crc kubenswrapper[5033]: I0226 17:52:00.155598 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 17:52:00 crc kubenswrapper[5033]: I0226 17:52:00.156237 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 17:52:00 crc kubenswrapper[5033]: I0226 17:52:00.158034 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 17:52:00 crc kubenswrapper[5033]: I0226 17:52:00.164035 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535472-h5jfw"]
Feb 26 17:52:00 crc kubenswrapper[5033]: I0226 17:52:00.248372 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx65h\" (UniqueName: \"kubernetes.io/projected/93f69ca6-7893-4a70-ab5a-f8e993359d0d-kube-api-access-qx65h\") pod \"auto-csr-approver-29535472-h5jfw\" (UID: \"93f69ca6-7893-4a70-ab5a-f8e993359d0d\") " pod="openshift-infra/auto-csr-approver-29535472-h5jfw"
Feb 26 17:52:00 crc kubenswrapper[5033]: I0226 17:52:00.350699 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx65h\" (UniqueName: \"kubernetes.io/projected/93f69ca6-7893-4a70-ab5a-f8e993359d0d-kube-api-access-qx65h\") pod \"auto-csr-approver-29535472-h5jfw\" (UID: \"93f69ca6-7893-4a70-ab5a-f8e993359d0d\") " pod="openshift-infra/auto-csr-approver-29535472-h5jfw"
Feb 26 17:52:00 crc kubenswrapper[5033]: I0226 17:52:00.373476 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx65h\" (UniqueName: \"kubernetes.io/projected/93f69ca6-7893-4a70-ab5a-f8e993359d0d-kube-api-access-qx65h\") pod \"auto-csr-approver-29535472-h5jfw\" (UID: \"93f69ca6-7893-4a70-ab5a-f8e993359d0d\") " pod="openshift-infra/auto-csr-approver-29535472-h5jfw"
Feb 26 17:52:00 crc kubenswrapper[5033]: I0226 17:52:00.506865 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535472-h5jfw"
Feb 26 17:52:01 crc kubenswrapper[5033]: I0226 17:52:01.046898 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535472-h5jfw"]
Feb 26 17:52:01 crc kubenswrapper[5033]: I0226 17:52:01.507599 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535472-h5jfw" event={"ID":"93f69ca6-7893-4a70-ab5a-f8e993359d0d","Type":"ContainerStarted","Data":"92695649ed21588744e0015b7b48552fea57e732912547fb24e6f4befd21d0b5"}
Feb 26 17:52:02 crc kubenswrapper[5033]: I0226 17:52:02.519651 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535472-h5jfw" event={"ID":"93f69ca6-7893-4a70-ab5a-f8e993359d0d","Type":"ContainerStarted","Data":"f075004e8c1050e3027740457110c52f4e51399e687aa029858b370706aef397"}
Feb 26 17:52:02 crc kubenswrapper[5033]: I0226 17:52:02.538766 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535472-h5jfw" podStartSLOduration=1.571234297 podStartE2EDuration="2.53874796s" podCreationTimestamp="2026-02-26 17:52:00 +0000 UTC" firstStartedPulling="2026-02-26 17:52:01.05283285 +0000 UTC m=+7848.265431530" lastFinishedPulling="2026-02-26 17:52:02.020346523 +0000 UTC m=+7849.232945193" observedRunningTime="2026-02-26 17:52:02.532820535 +0000 UTC m=+7849.745419185" watchObservedRunningTime="2026-02-26 17:52:02.53874796 +0000 UTC m=+7849.751346610"
Feb 26 17:52:03 crc kubenswrapper[5033]: I0226 17:52:03.547231 5033 generic.go:334] "Generic (PLEG): container finished" podID="93f69ca6-7893-4a70-ab5a-f8e993359d0d" containerID="f075004e8c1050e3027740457110c52f4e51399e687aa029858b370706aef397" exitCode=0
Feb 26 17:52:03 crc kubenswrapper[5033]: I0226 17:52:03.547288 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535472-h5jfw" event={"ID":"93f69ca6-7893-4a70-ab5a-f8e993359d0d","Type":"ContainerDied","Data":"f075004e8c1050e3027740457110c52f4e51399e687aa029858b370706aef397"}
Feb 26 17:52:04 crc kubenswrapper[5033]: I0226 17:52:04.936918 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535472-h5jfw"
Feb 26 17:52:05 crc kubenswrapper[5033]: I0226 17:52:05.058685 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx65h\" (UniqueName: \"kubernetes.io/projected/93f69ca6-7893-4a70-ab5a-f8e993359d0d-kube-api-access-qx65h\") pod \"93f69ca6-7893-4a70-ab5a-f8e993359d0d\" (UID: \"93f69ca6-7893-4a70-ab5a-f8e993359d0d\") "
Feb 26 17:52:05 crc kubenswrapper[5033]: I0226 17:52:05.064591 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f69ca6-7893-4a70-ab5a-f8e993359d0d-kube-api-access-qx65h" (OuterVolumeSpecName: "kube-api-access-qx65h") pod "93f69ca6-7893-4a70-ab5a-f8e993359d0d" (UID: "93f69ca6-7893-4a70-ab5a-f8e993359d0d"). InnerVolumeSpecName "kube-api-access-qx65h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:52:05 crc kubenswrapper[5033]: I0226 17:52:05.160812 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx65h\" (UniqueName: \"kubernetes.io/projected/93f69ca6-7893-4a70-ab5a-f8e993359d0d-kube-api-access-qx65h\") on node \"crc\" DevicePath \"\""
Feb 26 17:52:05 crc kubenswrapper[5033]: I0226 17:52:05.583513 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535472-h5jfw" event={"ID":"93f69ca6-7893-4a70-ab5a-f8e993359d0d","Type":"ContainerDied","Data":"92695649ed21588744e0015b7b48552fea57e732912547fb24e6f4befd21d0b5"}
Feb 26 17:52:05 crc kubenswrapper[5033]: I0226 17:52:05.583562 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92695649ed21588744e0015b7b48552fea57e732912547fb24e6f4befd21d0b5"
Feb 26 17:52:05 crc kubenswrapper[5033]: I0226 17:52:05.583623 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535472-h5jfw"
Feb 26 17:52:05 crc kubenswrapper[5033]: I0226 17:52:05.621302 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535466-vnn7l"]
Feb 26 17:52:05 crc kubenswrapper[5033]: I0226 17:52:05.640217 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535466-vnn7l"]
Feb 26 17:52:07 crc kubenswrapper[5033]: I0226 17:52:07.658574 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c424a5b-ca02-4917-9a46-de6ced0a6afe" path="/var/lib/kubelet/pods/5c424a5b-ca02-4917-9a46-de6ced0a6afe/volumes"
Feb 26 17:52:11 crc kubenswrapper[5033]: I0226 17:52:11.628380 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:52:11 crc kubenswrapper[5033]: E0226 17:52:11.629110 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:52:26 crc kubenswrapper[5033]: I0226 17:52:26.628096 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:52:26 crc kubenswrapper[5033]: E0226 17:52:26.629113 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.186856 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-svtn6"]
Feb 26 17:52:29 crc kubenswrapper[5033]: E0226 17:52:29.187919 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f69ca6-7893-4a70-ab5a-f8e993359d0d" containerName="oc"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.187941 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f69ca6-7893-4a70-ab5a-f8e993359d0d" containerName="oc"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.188429 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f69ca6-7893-4a70-ab5a-f8e993359d0d" containerName="oc"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.193276 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.214659 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-svtn6"]
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.324111 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-utilities\") pod \"redhat-operators-svtn6\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.324523 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-catalog-content\") pod \"redhat-operators-svtn6\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.324560 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gn5t\" (UniqueName: \"kubernetes.io/projected/70f44618-f6aa-4508-b807-99b175fc935d-kube-api-access-5gn5t\") pod \"redhat-operators-svtn6\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.427882 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-utilities\") pod \"redhat-operators-svtn6\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.427950 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-catalog-content\") pod \"redhat-operators-svtn6\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.428030 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gn5t\" (UniqueName: \"kubernetes.io/projected/70f44618-f6aa-4508-b807-99b175fc935d-kube-api-access-5gn5t\") pod \"redhat-operators-svtn6\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.428836 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-utilities\") pod \"redhat-operators-svtn6\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.428988 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-catalog-content\") pod \"redhat-operators-svtn6\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.455431 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gn5t\" (UniqueName: \"kubernetes.io/projected/70f44618-f6aa-4508-b807-99b175fc935d-kube-api-access-5gn5t\") pod \"redhat-operators-svtn6\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:29 crc kubenswrapper[5033]: I0226 17:52:29.530824 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:30 crc kubenswrapper[5033]: I0226 17:52:30.048029 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-svtn6"]
Feb 26 17:52:30 crc kubenswrapper[5033]: I0226 17:52:30.863195 5033 generic.go:334] "Generic (PLEG): container finished" podID="70f44618-f6aa-4508-b807-99b175fc935d" containerID="45bb2af7f9fed983051ea0b218f55c7c3be37fe5cc20b04365d5a526045064ea" exitCode=0
Feb 26 17:52:30 crc kubenswrapper[5033]: I0226 17:52:30.863309 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svtn6" event={"ID":"70f44618-f6aa-4508-b807-99b175fc935d","Type":"ContainerDied","Data":"45bb2af7f9fed983051ea0b218f55c7c3be37fe5cc20b04365d5a526045064ea"}
Feb 26 17:52:30 crc kubenswrapper[5033]: I0226 17:52:30.863502 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svtn6" event={"ID":"70f44618-f6aa-4508-b807-99b175fc935d","Type":"ContainerStarted","Data":"18a389285e94f1bdc025fa9a463152a1b4eefb7ce4bb257f50cf158e291cd014"}
Feb 26 17:52:31 crc kubenswrapper[5033]: I0226 17:52:31.877684 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svtn6" event={"ID":"70f44618-f6aa-4508-b807-99b175fc935d","Type":"ContainerStarted","Data":"625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f"}
Feb 26 17:52:31 crc kubenswrapper[5033]: I0226 17:52:31.881861 5033 generic.go:334] "Generic (PLEG): container finished" podID="e2fc5aa3-dad1-4f82-92ad-daf02318ab73" containerID="1fdfe5d92f26dbfec467495da10133464baa1b6b071c24041d0edd56367bf5bd" exitCode=0
Feb 26 17:52:31 crc kubenswrapper[5033]: I0226 17:52:31.882072 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-js4r4" event={"ID":"e2fc5aa3-dad1-4f82-92ad-daf02318ab73","Type":"ContainerDied","Data":"1fdfe5d92f26dbfec467495da10133464baa1b6b071c24041d0edd56367bf5bd"}
Feb 26 17:52:32 crc kubenswrapper[5033]: I0226 17:52:32.897879 5033 generic.go:334] "Generic (PLEG): container finished" podID="70f44618-f6aa-4508-b807-99b175fc935d" containerID="625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f" exitCode=0
Feb 26 17:52:32 crc kubenswrapper[5033]: I0226 17:52:32.897940 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svtn6" event={"ID":"70f44618-f6aa-4508-b807-99b175fc935d","Type":"ContainerDied","Data":"625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f"}
Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.401881 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-js4r4"
Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.530385 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovncontroller-config-0\") pod \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") "
Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.530796 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovn-combined-ca-bundle\") pod \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") "
Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.530880 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ceph\") pod \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") "
Feb 26
17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.531861 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5cbz\" (UniqueName: \"kubernetes.io/projected/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-kube-api-access-f5cbz\") pod \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.532071 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ssh-key-openstack-cell1\") pod \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.532142 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-inventory\") pod \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\" (UID: \"e2fc5aa3-dad1-4f82-92ad-daf02318ab73\") " Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.540183 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ceph" (OuterVolumeSpecName: "ceph") pod "e2fc5aa3-dad1-4f82-92ad-daf02318ab73" (UID: "e2fc5aa3-dad1-4f82-92ad-daf02318ab73"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.540984 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-kube-api-access-f5cbz" (OuterVolumeSpecName: "kube-api-access-f5cbz") pod "e2fc5aa3-dad1-4f82-92ad-daf02318ab73" (UID: "e2fc5aa3-dad1-4f82-92ad-daf02318ab73"). InnerVolumeSpecName "kube-api-access-f5cbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.545363 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e2fc5aa3-dad1-4f82-92ad-daf02318ab73" (UID: "e2fc5aa3-dad1-4f82-92ad-daf02318ab73"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.598107 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e2fc5aa3-dad1-4f82-92ad-daf02318ab73" (UID: "e2fc5aa3-dad1-4f82-92ad-daf02318ab73"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.598506 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-inventory" (OuterVolumeSpecName: "inventory") pod "e2fc5aa3-dad1-4f82-92ad-daf02318ab73" (UID: "e2fc5aa3-dad1-4f82-92ad-daf02318ab73"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.604468 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "e2fc5aa3-dad1-4f82-92ad-daf02318ab73" (UID: "e2fc5aa3-dad1-4f82-92ad-daf02318ab73"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.636296 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.636331 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.636345 5033 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.636358 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.636370 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.636381 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5cbz\" (UniqueName: \"kubernetes.io/projected/e2fc5aa3-dad1-4f82-92ad-daf02318ab73-kube-api-access-f5cbz\") on node \"crc\" DevicePath \"\"" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.912316 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-js4r4" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.912306 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-js4r4" event={"ID":"e2fc5aa3-dad1-4f82-92ad-daf02318ab73","Type":"ContainerDied","Data":"d5be5f922a94e15068361514ad6e5aa91508e96f8e77a295744112b18e8851a5"} Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.912488 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5be5f922a94e15068361514ad6e5aa91508e96f8e77a295744112b18e8851a5" Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.917198 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svtn6" event={"ID":"70f44618-f6aa-4508-b807-99b175fc935d","Type":"ContainerStarted","Data":"51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571"} Feb 26 17:52:33 crc kubenswrapper[5033]: I0226 17:52:33.954710 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-svtn6" podStartSLOduration=2.480219157 podStartE2EDuration="4.954679402s" podCreationTimestamp="2026-02-26 17:52:29 +0000 UTC" firstStartedPulling="2026-02-26 17:52:30.865784313 +0000 UTC m=+7878.078382963" lastFinishedPulling="2026-02-26 17:52:33.340244558 +0000 UTC m=+7880.552843208" observedRunningTime="2026-02-26 17:52:33.944814902 +0000 UTC m=+7881.157413602" watchObservedRunningTime="2026-02-26 17:52:33.954679402 +0000 UTC m=+7881.167278082" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.045465 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-hw98g"] Feb 26 17:52:34 crc kubenswrapper[5033]: E0226 17:52:34.045923 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2fc5aa3-dad1-4f82-92ad-daf02318ab73" containerName="ovn-openstack-openstack-cell1" Feb 26 17:52:34 crc 
kubenswrapper[5033]: I0226 17:52:34.045934 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2fc5aa3-dad1-4f82-92ad-daf02318ab73" containerName="ovn-openstack-openstack-cell1" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.046136 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2fc5aa3-dad1-4f82-92ad-daf02318ab73" containerName="ovn-openstack-openstack-cell1" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.046986 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.052776 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.053188 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.053412 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.053742 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.053925 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.054133 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.060967 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-hw98g"] Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.149958 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.150320 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.150384 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.150412 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jmgt\" (UniqueName: \"kubernetes.io/projected/2f817919-d860-441e-abc9-c3a428da253b-kube-api-access-9jmgt\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.150455 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ssh-key-openstack-cell1\") pod 
\"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.150473 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.150519 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.252987 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.253149 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.253222 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.253300 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jmgt\" (UniqueName: \"kubernetes.io/projected/2f817919-d860-441e-abc9-c3a428da253b-kube-api-access-9jmgt\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.253367 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.253412 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.253502 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.256921 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.257468 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.257773 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.258349 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: 
\"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.258837 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.260001 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.279303 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jmgt\" (UniqueName: \"kubernetes.io/projected/2f817919-d860-441e-abc9-c3a428da253b-kube-api-access-9jmgt\") pod \"neutron-metadata-openstack-openstack-cell1-hw98g\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:34 crc kubenswrapper[5033]: I0226 17:52:34.396602 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" Feb 26 17:52:35 crc kubenswrapper[5033]: I0226 17:52:35.071987 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-hw98g"] Feb 26 17:52:35 crc kubenswrapper[5033]: I0226 17:52:35.938315 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" event={"ID":"2f817919-d860-441e-abc9-c3a428da253b","Type":"ContainerStarted","Data":"9b2a8ac2c7de2659b56b3470688659181e2506e27c918c4a2f3123c9c8d15787"} Feb 26 17:52:35 crc kubenswrapper[5033]: I0226 17:52:35.939186 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" event={"ID":"2f817919-d860-441e-abc9-c3a428da253b","Type":"ContainerStarted","Data":"ce35f74e00b95d2faa035660e282087c6f2c3fc6c7c1dd4462086440250366e1"} Feb 26 17:52:35 crc kubenswrapper[5033]: I0226 17:52:35.970732 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" podStartSLOduration=1.449815567 podStartE2EDuration="1.970711723s" podCreationTimestamp="2026-02-26 17:52:34 +0000 UTC" firstStartedPulling="2026-02-26 17:52:35.095809885 +0000 UTC m=+7882.308408535" lastFinishedPulling="2026-02-26 17:52:35.616706031 +0000 UTC m=+7882.829304691" observedRunningTime="2026-02-26 17:52:35.954378805 +0000 UTC m=+7883.166977465" watchObservedRunningTime="2026-02-26 17:52:35.970711723 +0000 UTC m=+7883.183310383" Feb 26 17:52:38 crc kubenswrapper[5033]: I0226 17:52:38.628472 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:52:38 crc kubenswrapper[5033]: E0226 17:52:38.629015 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:52:39 crc kubenswrapper[5033]: I0226 17:52:39.531086 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-svtn6" Feb 26 17:52:39 crc kubenswrapper[5033]: I0226 17:52:39.531150 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-svtn6" Feb 26 17:52:40 crc kubenswrapper[5033]: I0226 17:52:40.588170 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svtn6" podUID="70f44618-f6aa-4508-b807-99b175fc935d" containerName="registry-server" probeResult="failure" output=< Feb 26 17:52:40 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Feb 26 17:52:40 crc kubenswrapper[5033]: > Feb 26 17:52:42 crc kubenswrapper[5033]: I0226 17:52:42.619536 5033 scope.go:117] "RemoveContainer" containerID="eebf15aea05bf459369299d0e91bd6015a03374fc2cc6e5f33c05b871705252c" Feb 26 17:52:49 crc kubenswrapper[5033]: I0226 17:52:49.611975 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-svtn6" Feb 26 17:52:49 crc kubenswrapper[5033]: I0226 17:52:49.687436 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-svtn6" Feb 26 17:52:49 crc kubenswrapper[5033]: I0226 17:52:49.856639 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-svtn6"] Feb 26 17:52:51 crc kubenswrapper[5033]: I0226 17:52:51.142019 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-svtn6" 
podUID="70f44618-f6aa-4508-b807-99b175fc935d" containerName="registry-server" containerID="cri-o://51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571" gracePeriod=2 Feb 26 17:52:51 crc kubenswrapper[5033]: I0226 17:52:51.750010 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-svtn6" Feb 26 17:52:51 crc kubenswrapper[5033]: I0226 17:52:51.866550 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-catalog-content\") pod \"70f44618-f6aa-4508-b807-99b175fc935d\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " Feb 26 17:52:51 crc kubenswrapper[5033]: I0226 17:52:51.866703 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-utilities\") pod \"70f44618-f6aa-4508-b807-99b175fc935d\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " Feb 26 17:52:51 crc kubenswrapper[5033]: I0226 17:52:51.866884 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gn5t\" (UniqueName: \"kubernetes.io/projected/70f44618-f6aa-4508-b807-99b175fc935d-kube-api-access-5gn5t\") pod \"70f44618-f6aa-4508-b807-99b175fc935d\" (UID: \"70f44618-f6aa-4508-b807-99b175fc935d\") " Feb 26 17:52:51 crc kubenswrapper[5033]: I0226 17:52:51.868103 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-utilities" (OuterVolumeSpecName: "utilities") pod "70f44618-f6aa-4508-b807-99b175fc935d" (UID: "70f44618-f6aa-4508-b807-99b175fc935d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:52:51 crc kubenswrapper[5033]: I0226 17:52:51.873536 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f44618-f6aa-4508-b807-99b175fc935d-kube-api-access-5gn5t" (OuterVolumeSpecName: "kube-api-access-5gn5t") pod "70f44618-f6aa-4508-b807-99b175fc935d" (UID: "70f44618-f6aa-4508-b807-99b175fc935d"). InnerVolumeSpecName "kube-api-access-5gn5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:52:51 crc kubenswrapper[5033]: I0226 17:52:51.969870 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gn5t\" (UniqueName: \"kubernetes.io/projected/70f44618-f6aa-4508-b807-99b175fc935d-kube-api-access-5gn5t\") on node \"crc\" DevicePath \"\"" Feb 26 17:52:51 crc kubenswrapper[5033]: I0226 17:52:51.969919 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.032728 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70f44618-f6aa-4508-b807-99b175fc935d" (UID: "70f44618-f6aa-4508-b807-99b175fc935d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.072646 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f44618-f6aa-4508-b807-99b175fc935d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.151769 5033 generic.go:334] "Generic (PLEG): container finished" podID="70f44618-f6aa-4508-b807-99b175fc935d" containerID="51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571" exitCode=0 Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.151809 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svtn6" event={"ID":"70f44618-f6aa-4508-b807-99b175fc935d","Type":"ContainerDied","Data":"51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571"} Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.151843 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svtn6" event={"ID":"70f44618-f6aa-4508-b807-99b175fc935d","Type":"ContainerDied","Data":"18a389285e94f1bdc025fa9a463152a1b4eefb7ce4bb257f50cf158e291cd014"} Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.151861 5033 scope.go:117] "RemoveContainer" containerID="51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571" Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.151908 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-svtn6"
Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.175756 5033 scope.go:117] "RemoveContainer" containerID="625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f"
Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.206187 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-svtn6"]
Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.208116 5033 scope.go:117] "RemoveContainer" containerID="45bb2af7f9fed983051ea0b218f55c7c3be37fe5cc20b04365d5a526045064ea"
Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.215337 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-svtn6"]
Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.269119 5033 scope.go:117] "RemoveContainer" containerID="51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571"
Feb 26 17:52:52 crc kubenswrapper[5033]: E0226 17:52:52.269963 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571\": container with ID starting with 51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571 not found: ID does not exist" containerID="51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571"
Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.270041 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571"} err="failed to get container status \"51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571\": rpc error: code = NotFound desc = could not find container \"51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571\": container with ID starting with 51ae450fc21805c9eb71d32f1b4604c3e2a8d18fc351f6b2eda392f569d2e571 not found: ID does not exist"
Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.270088 5033 scope.go:117] "RemoveContainer" containerID="625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f"
Feb 26 17:52:52 crc kubenswrapper[5033]: E0226 17:52:52.270680 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f\": container with ID starting with 625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f not found: ID does not exist" containerID="625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f"
Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.270725 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f"} err="failed to get container status \"625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f\": rpc error: code = NotFound desc = could not find container \"625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f\": container with ID starting with 625c8b7dd884630230b6488de83a8328b009831d5a632144fdf3b66fe7b15c0f not found: ID does not exist"
Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.270757 5033 scope.go:117] "RemoveContainer" containerID="45bb2af7f9fed983051ea0b218f55c7c3be37fe5cc20b04365d5a526045064ea"
Feb 26 17:52:52 crc kubenswrapper[5033]: E0226 17:52:52.271236 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45bb2af7f9fed983051ea0b218f55c7c3be37fe5cc20b04365d5a526045064ea\": container with ID starting with 45bb2af7f9fed983051ea0b218f55c7c3be37fe5cc20b04365d5a526045064ea not found: ID does not exist" containerID="45bb2af7f9fed983051ea0b218f55c7c3be37fe5cc20b04365d5a526045064ea"
Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.271300 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45bb2af7f9fed983051ea0b218f55c7c3be37fe5cc20b04365d5a526045064ea"} err="failed to get container status \"45bb2af7f9fed983051ea0b218f55c7c3be37fe5cc20b04365d5a526045064ea\": rpc error: code = NotFound desc = could not find container \"45bb2af7f9fed983051ea0b218f55c7c3be37fe5cc20b04365d5a526045064ea\": container with ID starting with 45bb2af7f9fed983051ea0b218f55c7c3be37fe5cc20b04365d5a526045064ea not found: ID does not exist"
Feb 26 17:52:52 crc kubenswrapper[5033]: I0226 17:52:52.629517 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:52:52 crc kubenswrapper[5033]: E0226 17:52:52.630049 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:52:53 crc kubenswrapper[5033]: I0226 17:52:53.648228 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f44618-f6aa-4508-b807-99b175fc935d" path="/var/lib/kubelet/pods/70f44618-f6aa-4508-b807-99b175fc935d/volumes"
Feb 26 17:53:04 crc kubenswrapper[5033]: I0226 17:53:04.629012 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:53:04 crc kubenswrapper[5033]: E0226 17:53:04.629986 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:53:15 crc kubenswrapper[5033]: I0226 17:53:15.630449 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:53:15 crc kubenswrapper[5033]: E0226 17:53:15.631731 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:53:27 crc kubenswrapper[5033]: I0226 17:53:27.628057 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:53:27 crc kubenswrapper[5033]: E0226 17:53:27.629104 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:53:28 crc kubenswrapper[5033]: I0226 17:53:28.587369 5033 generic.go:334] "Generic (PLEG): container finished" podID="2f817919-d860-441e-abc9-c3a428da253b" containerID="9b2a8ac2c7de2659b56b3470688659181e2506e27c918c4a2f3123c9c8d15787" exitCode=0
Feb 26 17:53:28 crc kubenswrapper[5033]: I0226 17:53:28.587458 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" event={"ID":"2f817919-d860-441e-abc9-c3a428da253b","Type":"ContainerDied","Data":"9b2a8ac2c7de2659b56b3470688659181e2506e27c918c4a2f3123c9c8d15787"}
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.063496 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.165903 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-inventory\") pod \"2f817919-d860-441e-abc9-c3a428da253b\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") "
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.166047 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-metadata-combined-ca-bundle\") pod \"2f817919-d860-441e-abc9-c3a428da253b\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") "
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.166076 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ssh-key-openstack-cell1\") pod \"2f817919-d860-441e-abc9-c3a428da253b\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") "
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.166192 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jmgt\" (UniqueName: \"kubernetes.io/projected/2f817919-d860-441e-abc9-c3a428da253b-kube-api-access-9jmgt\") pod \"2f817919-d860-441e-abc9-c3a428da253b\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") "
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.166245 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"2f817919-d860-441e-abc9-c3a428da253b\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") "
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.166325 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ceph\") pod \"2f817919-d860-441e-abc9-c3a428da253b\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") "
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.166360 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-nova-metadata-neutron-config-0\") pod \"2f817919-d860-441e-abc9-c3a428da253b\" (UID: \"2f817919-d860-441e-abc9-c3a428da253b\") "
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.172259 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f817919-d860-441e-abc9-c3a428da253b-kube-api-access-9jmgt" (OuterVolumeSpecName: "kube-api-access-9jmgt") pod "2f817919-d860-441e-abc9-c3a428da253b" (UID: "2f817919-d860-441e-abc9-c3a428da253b"). InnerVolumeSpecName "kube-api-access-9jmgt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.183908 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ceph" (OuterVolumeSpecName: "ceph") pod "2f817919-d860-441e-abc9-c3a428da253b" (UID: "2f817919-d860-441e-abc9-c3a428da253b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.190595 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2f817919-d860-441e-abc9-c3a428da253b" (UID: "2f817919-d860-441e-abc9-c3a428da253b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.197605 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-inventory" (OuterVolumeSpecName: "inventory") pod "2f817919-d860-441e-abc9-c3a428da253b" (UID: "2f817919-d860-441e-abc9-c3a428da253b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.199144 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "2f817919-d860-441e-abc9-c3a428da253b" (UID: "2f817919-d860-441e-abc9-c3a428da253b"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.205471 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "2f817919-d860-441e-abc9-c3a428da253b" (UID: "2f817919-d860-441e-abc9-c3a428da253b"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.205500 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "2f817919-d860-441e-abc9-c3a428da253b" (UID: "2f817919-d860-441e-abc9-c3a428da253b"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.276195 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ceph\") on node \"crc\" DevicePath \"\""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.276231 5033 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.276244 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.276253 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.276265 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.276276 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jmgt\" (UniqueName: \"kubernetes.io/projected/2f817919-d860-441e-abc9-c3a428da253b-kube-api-access-9jmgt\") on node \"crc\" DevicePath \"\""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.276291 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2f817919-d860-441e-abc9-c3a428da253b-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.607136 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g" event={"ID":"2f817919-d860-441e-abc9-c3a428da253b","Type":"ContainerDied","Data":"ce35f74e00b95d2faa035660e282087c6f2c3fc6c7c1dd4462086440250366e1"}
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.607178 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce35f74e00b95d2faa035660e282087c6f2c3fc6c7c1dd4462086440250366e1"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.607203 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hw98g"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.740580 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-7ncz9"]
Feb 26 17:53:30 crc kubenswrapper[5033]: E0226 17:53:30.741306 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f44618-f6aa-4508-b807-99b175fc935d" containerName="extract-content"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.741334 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f44618-f6aa-4508-b807-99b175fc935d" containerName="extract-content"
Feb 26 17:53:30 crc kubenswrapper[5033]: E0226 17:53:30.741386 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f817919-d860-441e-abc9-c3a428da253b" containerName="neutron-metadata-openstack-openstack-cell1"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.741399 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f817919-d860-441e-abc9-c3a428da253b" containerName="neutron-metadata-openstack-openstack-cell1"
Feb 26 17:53:30 crc kubenswrapper[5033]: E0226 17:53:30.741449 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f44618-f6aa-4508-b807-99b175fc935d" containerName="registry-server"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.741461 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f44618-f6aa-4508-b807-99b175fc935d" containerName="registry-server"
Feb 26 17:53:30 crc kubenswrapper[5033]: E0226 17:53:30.741485 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f44618-f6aa-4508-b807-99b175fc935d" containerName="extract-utilities"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.741501 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f44618-f6aa-4508-b807-99b175fc935d" containerName="extract-utilities"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.741889 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f817919-d860-441e-abc9-c3a428da253b" containerName="neutron-metadata-openstack-openstack-cell1"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.741930 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f44618-f6aa-4508-b807-99b175fc935d" containerName="registry-server"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.743308 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.745428 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.745753 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.746076 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.746531 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.750648 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.756649 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-7ncz9"]
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.888982 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.889178 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.889212 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.889259 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ceph\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.889329 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-inventory\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.889373 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gng9m\" (UniqueName: \"kubernetes.io/projected/8aa037b5-14a7-41e1-81cf-ffb06e380167-kube-api-access-gng9m\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.991177 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-inventory\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.991251 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gng9m\" (UniqueName: \"kubernetes.io/projected/8aa037b5-14a7-41e1-81cf-ffb06e380167-kube-api-access-gng9m\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.991444 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.991623 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.991647 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.991681 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ceph\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.996730 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.996849 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.997011 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.997382 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ceph\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:30 crc kubenswrapper[5033]: I0226 17:53:30.997745 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-inventory\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:31 crc kubenswrapper[5033]: I0226 17:53:31.021185 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gng9m\" (UniqueName: \"kubernetes.io/projected/8aa037b5-14a7-41e1-81cf-ffb06e380167-kube-api-access-gng9m\") pod \"libvirt-openstack-openstack-cell1-7ncz9\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") " pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:31 crc kubenswrapper[5033]: I0226 17:53:31.080428 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:53:31 crc kubenswrapper[5033]: I0226 17:53:31.674939 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-7ncz9"]
Feb 26 17:53:32 crc kubenswrapper[5033]: I0226 17:53:32.629207 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-7ncz9" event={"ID":"8aa037b5-14a7-41e1-81cf-ffb06e380167","Type":"ContainerStarted","Data":"87089f3bdf4a3e5160f5c4d843a921358a74db2b8f331d03a4c324d4a16be3bd"}
Feb 26 17:53:32 crc kubenswrapper[5033]: I0226 17:53:32.631885 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-7ncz9" event={"ID":"8aa037b5-14a7-41e1-81cf-ffb06e380167","Type":"ContainerStarted","Data":"a449906116b8b58f986cde063e049cfec781ed891b4e3cd3012cf1542cd890cc"}
Feb 26 17:53:32 crc kubenswrapper[5033]: I0226 17:53:32.661525 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-7ncz9" podStartSLOduration=2.260473089 podStartE2EDuration="2.661508026s" podCreationTimestamp="2026-02-26 17:53:30 +0000 UTC" firstStartedPulling="2026-02-26 17:53:31.677323485 +0000 UTC m=+7938.889922135" lastFinishedPulling="2026-02-26 17:53:32.078358432 +0000 UTC m=+7939.290957072" observedRunningTime="2026-02-26 17:53:32.649738359 +0000 UTC m=+7939.862337099" watchObservedRunningTime="2026-02-26 17:53:32.661508026 +0000 UTC m=+7939.874106676"
Feb 26 17:53:38 crc kubenswrapper[5033]: I0226 17:53:38.629003 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:53:38 crc kubenswrapper[5033]: E0226 17:53:38.630439 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:53:52 crc kubenswrapper[5033]: I0226 17:53:52.628852 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:53:52 crc kubenswrapper[5033]: E0226 17:53:52.629804 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:54:00 crc kubenswrapper[5033]: I0226 17:54:00.156726 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535474-vxz4v"]
Feb 26 17:54:00 crc kubenswrapper[5033]: I0226 17:54:00.158828 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535474-vxz4v"
Feb 26 17:54:00 crc kubenswrapper[5033]: I0226 17:54:00.161365 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 17:54:00 crc kubenswrapper[5033]: I0226 17:54:00.161388 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 17:54:00 crc kubenswrapper[5033]: I0226 17:54:00.162103 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 17:54:00 crc kubenswrapper[5033]: I0226 17:54:00.179823 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535474-vxz4v"]
Feb 26 17:54:00 crc kubenswrapper[5033]: I0226 17:54:00.281212 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt87x\" (UniqueName: \"kubernetes.io/projected/875dfb56-69a9-4423-85ba-a88ba5e4f401-kube-api-access-pt87x\") pod \"auto-csr-approver-29535474-vxz4v\" (UID: \"875dfb56-69a9-4423-85ba-a88ba5e4f401\") " pod="openshift-infra/auto-csr-approver-29535474-vxz4v"
Feb 26 17:54:00 crc kubenswrapper[5033]: I0226 17:54:00.383986 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt87x\" (UniqueName: \"kubernetes.io/projected/875dfb56-69a9-4423-85ba-a88ba5e4f401-kube-api-access-pt87x\") pod \"auto-csr-approver-29535474-vxz4v\" (UID: \"875dfb56-69a9-4423-85ba-a88ba5e4f401\") " pod="openshift-infra/auto-csr-approver-29535474-vxz4v"
Feb 26 17:54:00 crc kubenswrapper[5033]: I0226 17:54:00.403713 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt87x\" (UniqueName: \"kubernetes.io/projected/875dfb56-69a9-4423-85ba-a88ba5e4f401-kube-api-access-pt87x\") pod \"auto-csr-approver-29535474-vxz4v\" (UID: \"875dfb56-69a9-4423-85ba-a88ba5e4f401\") " pod="openshift-infra/auto-csr-approver-29535474-vxz4v"
Feb 26 17:54:00 crc kubenswrapper[5033]: I0226 17:54:00.483596 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535474-vxz4v"
Feb 26 17:54:00 crc kubenswrapper[5033]: I0226 17:54:00.965180 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535474-vxz4v"]
Feb 26 17:54:01 crc kubenswrapper[5033]: I0226 17:54:01.953897 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535474-vxz4v" event={"ID":"875dfb56-69a9-4423-85ba-a88ba5e4f401","Type":"ContainerStarted","Data":"7d3f72e07e50e7235cbaeab2547495f63624325b08679aa364efd2280a8b6eda"}
Feb 26 17:54:02 crc kubenswrapper[5033]: I0226 17:54:02.964603 5033 generic.go:334] "Generic (PLEG): container finished" podID="875dfb56-69a9-4423-85ba-a88ba5e4f401" containerID="2ad5de41b5d7de1b18c5165838b57191146a9199a84945e3061d6b5459d46155" exitCode=0
Feb 26 17:54:02 crc kubenswrapper[5033]: I0226 17:54:02.964665 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535474-vxz4v" event={"ID":"875dfb56-69a9-4423-85ba-a88ba5e4f401","Type":"ContainerDied","Data":"2ad5de41b5d7de1b18c5165838b57191146a9199a84945e3061d6b5459d46155"}
Feb 26 17:54:04 crc kubenswrapper[5033]: I0226 17:54:04.346740 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535474-vxz4v"
Feb 26 17:54:04 crc kubenswrapper[5033]: I0226 17:54:04.479923 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt87x\" (UniqueName: \"kubernetes.io/projected/875dfb56-69a9-4423-85ba-a88ba5e4f401-kube-api-access-pt87x\") pod \"875dfb56-69a9-4423-85ba-a88ba5e4f401\" (UID: \"875dfb56-69a9-4423-85ba-a88ba5e4f401\") "
Feb 26 17:54:04 crc kubenswrapper[5033]: I0226 17:54:04.485851 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/875dfb56-69a9-4423-85ba-a88ba5e4f401-kube-api-access-pt87x" (OuterVolumeSpecName: "kube-api-access-pt87x") pod "875dfb56-69a9-4423-85ba-a88ba5e4f401" (UID: "875dfb56-69a9-4423-85ba-a88ba5e4f401"). InnerVolumeSpecName "kube-api-access-pt87x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:54:04 crc kubenswrapper[5033]: I0226 17:54:04.585014 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt87x\" (UniqueName: \"kubernetes.io/projected/875dfb56-69a9-4423-85ba-a88ba5e4f401-kube-api-access-pt87x\") on node \"crc\" DevicePath \"\""
Feb 26 17:54:04 crc kubenswrapper[5033]: I0226 17:54:04.990682 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535474-vxz4v"
Feb 26 17:54:04 crc kubenswrapper[5033]: I0226 17:54:04.988598 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535474-vxz4v" event={"ID":"875dfb56-69a9-4423-85ba-a88ba5e4f401","Type":"ContainerDied","Data":"7d3f72e07e50e7235cbaeab2547495f63624325b08679aa364efd2280a8b6eda"}
Feb 26 17:54:04 crc kubenswrapper[5033]: I0226 17:54:04.992872 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d3f72e07e50e7235cbaeab2547495f63624325b08679aa364efd2280a8b6eda"
Feb 26 17:54:05 crc kubenswrapper[5033]: I0226 17:54:05.451328 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535468-mtn4s"]
Feb 26 17:54:05 crc kubenswrapper[5033]: I0226 17:54:05.463718 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535468-mtn4s"]
Feb 26 17:54:05 crc kubenswrapper[5033]: I0226 17:54:05.653769 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4162b533-4350-485d-a6b7-8e2a3920ee38" path="/var/lib/kubelet/pods/4162b533-4350-485d-a6b7-8e2a3920ee38/volumes"
Feb 26 17:54:07 crc kubenswrapper[5033]: I0226 17:54:07.628854 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644"
Feb 26 17:54:07 crc kubenswrapper[5033]: E0226 17:54:07.629723 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 17:54:22 crc kubenswrapper[5033]: I0226 17:54:22.628227 5033 scope.go:117] "RemoveContainer"
containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:54:22 crc kubenswrapper[5033]: E0226 17:54:22.630334 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:54:37 crc kubenswrapper[5033]: I0226 17:54:37.704519 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:54:37 crc kubenswrapper[5033]: E0226 17:54:37.706226 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:54:42 crc kubenswrapper[5033]: I0226 17:54:42.740530 5033 scope.go:117] "RemoveContainer" containerID="8ce7faf7b3a6b59b15b1d48be427d3f4ea35efcacedb40bb3171904b0bcb0feb" Feb 26 17:54:52 crc kubenswrapper[5033]: I0226 17:54:52.630410 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:54:52 crc kubenswrapper[5033]: E0226 17:54:52.631315 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:55:06 crc kubenswrapper[5033]: I0226 17:55:06.628610 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:55:06 crc kubenswrapper[5033]: E0226 17:55:06.629316 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:55:21 crc kubenswrapper[5033]: I0226 17:55:21.629055 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:55:21 crc kubenswrapper[5033]: E0226 17:55:21.630068 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:55:34 crc kubenswrapper[5033]: I0226 17:55:34.628747 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:55:34 crc kubenswrapper[5033]: E0226 17:55:34.630202 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:55:46 crc kubenswrapper[5033]: I0226 17:55:46.629334 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:55:46 crc kubenswrapper[5033]: E0226 17:55:46.630184 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:55:59 crc kubenswrapper[5033]: I0226 17:55:59.629008 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:55:59 crc kubenswrapper[5033]: E0226 17:55:59.630125 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.150217 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535476-dtzld"] Feb 26 17:56:00 crc kubenswrapper[5033]: E0226 17:56:00.150892 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875dfb56-69a9-4423-85ba-a88ba5e4f401" containerName="oc" Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.150924 5033 
state_mem.go:107] "Deleted CPUSet assignment" podUID="875dfb56-69a9-4423-85ba-a88ba5e4f401" containerName="oc" Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.151334 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="875dfb56-69a9-4423-85ba-a88ba5e4f401" containerName="oc" Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.152545 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535476-dtzld" Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.155495 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.155821 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.156290 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.206373 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535476-dtzld"] Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.301318 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5bb4\" (UniqueName: \"kubernetes.io/projected/0c394577-5481-48c5-ab68-20d54e193f60-kube-api-access-w5bb4\") pod \"auto-csr-approver-29535476-dtzld\" (UID: \"0c394577-5481-48c5-ab68-20d54e193f60\") " pod="openshift-infra/auto-csr-approver-29535476-dtzld" Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.402539 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5bb4\" (UniqueName: \"kubernetes.io/projected/0c394577-5481-48c5-ab68-20d54e193f60-kube-api-access-w5bb4\") pod \"auto-csr-approver-29535476-dtzld\" (UID: \"0c394577-5481-48c5-ab68-20d54e193f60\") " 
pod="openshift-infra/auto-csr-approver-29535476-dtzld" Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.440101 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5bb4\" (UniqueName: \"kubernetes.io/projected/0c394577-5481-48c5-ab68-20d54e193f60-kube-api-access-w5bb4\") pod \"auto-csr-approver-29535476-dtzld\" (UID: \"0c394577-5481-48c5-ab68-20d54e193f60\") " pod="openshift-infra/auto-csr-approver-29535476-dtzld" Feb 26 17:56:00 crc kubenswrapper[5033]: I0226 17:56:00.480171 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535476-dtzld" Feb 26 17:56:01 crc kubenswrapper[5033]: I0226 17:56:01.001309 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 17:56:01 crc kubenswrapper[5033]: I0226 17:56:01.012700 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535476-dtzld"] Feb 26 17:56:01 crc kubenswrapper[5033]: I0226 17:56:01.701513 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535476-dtzld" event={"ID":"0c394577-5481-48c5-ab68-20d54e193f60","Type":"ContainerStarted","Data":"880f400010a343a8aad9e5d5460611bc90089fb799ff5d9db925d8df5f105c7d"} Feb 26 17:56:02 crc kubenswrapper[5033]: I0226 17:56:02.713979 5033 generic.go:334] "Generic (PLEG): container finished" podID="0c394577-5481-48c5-ab68-20d54e193f60" containerID="5477b15b341f3eb1dbda1a3001893363cfbadeda78214432114af9ce9fefcc56" exitCode=0 Feb 26 17:56:02 crc kubenswrapper[5033]: I0226 17:56:02.714094 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535476-dtzld" event={"ID":"0c394577-5481-48c5-ab68-20d54e193f60","Type":"ContainerDied","Data":"5477b15b341f3eb1dbda1a3001893363cfbadeda78214432114af9ce9fefcc56"} Feb 26 17:56:04 crc kubenswrapper[5033]: I0226 17:56:04.105354 5033 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535476-dtzld" Feb 26 17:56:04 crc kubenswrapper[5033]: I0226 17:56:04.186782 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5bb4\" (UniqueName: \"kubernetes.io/projected/0c394577-5481-48c5-ab68-20d54e193f60-kube-api-access-w5bb4\") pod \"0c394577-5481-48c5-ab68-20d54e193f60\" (UID: \"0c394577-5481-48c5-ab68-20d54e193f60\") " Feb 26 17:56:04 crc kubenswrapper[5033]: I0226 17:56:04.193614 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c394577-5481-48c5-ab68-20d54e193f60-kube-api-access-w5bb4" (OuterVolumeSpecName: "kube-api-access-w5bb4") pod "0c394577-5481-48c5-ab68-20d54e193f60" (UID: "0c394577-5481-48c5-ab68-20d54e193f60"). InnerVolumeSpecName "kube-api-access-w5bb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:56:04 crc kubenswrapper[5033]: I0226 17:56:04.289712 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5bb4\" (UniqueName: \"kubernetes.io/projected/0c394577-5481-48c5-ab68-20d54e193f60-kube-api-access-w5bb4\") on node \"crc\" DevicePath \"\"" Feb 26 17:56:04 crc kubenswrapper[5033]: I0226 17:56:04.733489 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535476-dtzld" event={"ID":"0c394577-5481-48c5-ab68-20d54e193f60","Type":"ContainerDied","Data":"880f400010a343a8aad9e5d5460611bc90089fb799ff5d9db925d8df5f105c7d"} Feb 26 17:56:04 crc kubenswrapper[5033]: I0226 17:56:04.733534 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="880f400010a343a8aad9e5d5460611bc90089fb799ff5d9db925d8df5f105c7d" Feb 26 17:56:04 crc kubenswrapper[5033]: I0226 17:56:04.733546 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535476-dtzld" Feb 26 17:56:05 crc kubenswrapper[5033]: I0226 17:56:05.182246 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535470-zj8fq"] Feb 26 17:56:05 crc kubenswrapper[5033]: I0226 17:56:05.190461 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535470-zj8fq"] Feb 26 17:56:05 crc kubenswrapper[5033]: I0226 17:56:05.641448 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a599e3e-f694-41e4-9e75-889c895d95b1" path="/var/lib/kubelet/pods/1a599e3e-f694-41e4-9e75-889c895d95b1/volumes" Feb 26 17:56:12 crc kubenswrapper[5033]: I0226 17:56:12.628152 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:56:12 crc kubenswrapper[5033]: E0226 17:56:12.628861 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.624086 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7jjt7"] Feb 26 17:56:21 crc kubenswrapper[5033]: E0226 17:56:21.625095 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c394577-5481-48c5-ab68-20d54e193f60" containerName="oc" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.625108 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c394577-5481-48c5-ab68-20d54e193f60" containerName="oc" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.625602 5033 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0c394577-5481-48c5-ab68-20d54e193f60" containerName="oc" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.627152 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.672826 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jjt7"] Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.731176 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-utilities\") pod \"redhat-marketplace-7jjt7\" (UID: \"3ff86037-1b41-493f-a736-6aad64997107\") " pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.731353 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-catalog-content\") pod \"redhat-marketplace-7jjt7\" (UID: \"3ff86037-1b41-493f-a736-6aad64997107\") " pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.731503 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmtc5\" (UniqueName: \"kubernetes.io/projected/3ff86037-1b41-493f-a736-6aad64997107-kube-api-access-kmtc5\") pod \"redhat-marketplace-7jjt7\" (UID: \"3ff86037-1b41-493f-a736-6aad64997107\") " pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.833934 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmtc5\" (UniqueName: \"kubernetes.io/projected/3ff86037-1b41-493f-a736-6aad64997107-kube-api-access-kmtc5\") pod \"redhat-marketplace-7jjt7\" (UID: 
\"3ff86037-1b41-493f-a736-6aad64997107\") " pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.834027 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-utilities\") pod \"redhat-marketplace-7jjt7\" (UID: \"3ff86037-1b41-493f-a736-6aad64997107\") " pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.834167 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-catalog-content\") pod \"redhat-marketplace-7jjt7\" (UID: \"3ff86037-1b41-493f-a736-6aad64997107\") " pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.834553 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-utilities\") pod \"redhat-marketplace-7jjt7\" (UID: \"3ff86037-1b41-493f-a736-6aad64997107\") " pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.834668 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-catalog-content\") pod \"redhat-marketplace-7jjt7\" (UID: \"3ff86037-1b41-493f-a736-6aad64997107\") " pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.858692 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmtc5\" (UniqueName: \"kubernetes.io/projected/3ff86037-1b41-493f-a736-6aad64997107-kube-api-access-kmtc5\") pod \"redhat-marketplace-7jjt7\" (UID: \"3ff86037-1b41-493f-a736-6aad64997107\") " 
pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:21 crc kubenswrapper[5033]: I0226 17:56:21.962378 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:22 crc kubenswrapper[5033]: I0226 17:56:22.502800 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jjt7"] Feb 26 17:56:22 crc kubenswrapper[5033]: I0226 17:56:22.925012 5033 generic.go:334] "Generic (PLEG): container finished" podID="3ff86037-1b41-493f-a736-6aad64997107" containerID="cfe8f2f8ae534bb1141c5559c918f90ab6ca6931a70b3ef9728d51dda7582956" exitCode=0 Feb 26 17:56:22 crc kubenswrapper[5033]: I0226 17:56:22.925091 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jjt7" event={"ID":"3ff86037-1b41-493f-a736-6aad64997107","Type":"ContainerDied","Data":"cfe8f2f8ae534bb1141c5559c918f90ab6ca6931a70b3ef9728d51dda7582956"} Feb 26 17:56:22 crc kubenswrapper[5033]: I0226 17:56:22.925312 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jjt7" event={"ID":"3ff86037-1b41-493f-a736-6aad64997107","Type":"ContainerStarted","Data":"7275e5fd9a7fe2aea1f99071073abf9fcf45330d9f18077bbf62aecea3c5248e"} Feb 26 17:56:24 crc kubenswrapper[5033]: I0226 17:56:24.945859 5033 generic.go:334] "Generic (PLEG): container finished" podID="3ff86037-1b41-493f-a736-6aad64997107" containerID="b669bebbaf640585fafebf67f50b75b8c1dfcea7e1a692f406c4975b962eee88" exitCode=0 Feb 26 17:56:24 crc kubenswrapper[5033]: I0226 17:56:24.946370 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jjt7" event={"ID":"3ff86037-1b41-493f-a736-6aad64997107","Type":"ContainerDied","Data":"b669bebbaf640585fafebf67f50b75b8c1dfcea7e1a692f406c4975b962eee88"} Feb 26 17:56:25 crc kubenswrapper[5033]: I0226 17:56:25.960563 5033 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-7jjt7" event={"ID":"3ff86037-1b41-493f-a736-6aad64997107","Type":"ContainerStarted","Data":"f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787"} Feb 26 17:56:27 crc kubenswrapper[5033]: I0226 17:56:27.629040 5033 scope.go:117] "RemoveContainer" containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:56:27 crc kubenswrapper[5033]: I0226 17:56:27.979372 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"06a7470efd2f33b06881edc40853b2fd654201f80a9674fabd38b2904cbfea8e"} Feb 26 17:56:28 crc kubenswrapper[5033]: I0226 17:56:28.008105 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7jjt7" podStartSLOduration=4.5786108599999995 podStartE2EDuration="7.008081279s" podCreationTimestamp="2026-02-26 17:56:21 +0000 UTC" firstStartedPulling="2026-02-26 17:56:22.927377039 +0000 UTC m=+8110.139975689" lastFinishedPulling="2026-02-26 17:56:25.356847458 +0000 UTC m=+8112.569446108" observedRunningTime="2026-02-26 17:56:25.976420397 +0000 UTC m=+8113.189019047" watchObservedRunningTime="2026-02-26 17:56:28.008081279 +0000 UTC m=+8115.220679929" Feb 26 17:56:31 crc kubenswrapper[5033]: I0226 17:56:31.962902 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:31 crc kubenswrapper[5033]: I0226 17:56:31.963501 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:32 crc kubenswrapper[5033]: I0226 17:56:32.014946 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:32 crc kubenswrapper[5033]: I0226 
17:56:32.098780 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:32 crc kubenswrapper[5033]: I0226 17:56:32.273440 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jjt7"] Feb 26 17:56:34 crc kubenswrapper[5033]: I0226 17:56:34.050630 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7jjt7" podUID="3ff86037-1b41-493f-a736-6aad64997107" containerName="registry-server" containerID="cri-o://f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787" gracePeriod=2 Feb 26 17:56:34 crc kubenswrapper[5033]: I0226 17:56:34.917663 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.062522 5033 generic.go:334] "Generic (PLEG): container finished" podID="3ff86037-1b41-493f-a736-6aad64997107" containerID="f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787" exitCode=0 Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.062570 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jjt7" event={"ID":"3ff86037-1b41-493f-a736-6aad64997107","Type":"ContainerDied","Data":"f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787"} Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.062579 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jjt7" Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.062604 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jjt7" event={"ID":"3ff86037-1b41-493f-a736-6aad64997107","Type":"ContainerDied","Data":"7275e5fd9a7fe2aea1f99071073abf9fcf45330d9f18077bbf62aecea3c5248e"} Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.062627 5033 scope.go:117] "RemoveContainer" containerID="f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787" Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.076754 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-utilities\") pod \"3ff86037-1b41-493f-a736-6aad64997107\" (UID: \"3ff86037-1b41-493f-a736-6aad64997107\") " Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.077046 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-catalog-content\") pod \"3ff86037-1b41-493f-a736-6aad64997107\" (UID: \"3ff86037-1b41-493f-a736-6aad64997107\") " Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.077111 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmtc5\" (UniqueName: \"kubernetes.io/projected/3ff86037-1b41-493f-a736-6aad64997107-kube-api-access-kmtc5\") pod \"3ff86037-1b41-493f-a736-6aad64997107\" (UID: \"3ff86037-1b41-493f-a736-6aad64997107\") " Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.078119 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-utilities" (OuterVolumeSpecName: "utilities") pod "3ff86037-1b41-493f-a736-6aad64997107" (UID: "3ff86037-1b41-493f-a736-6aad64997107"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.095582 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff86037-1b41-493f-a736-6aad64997107-kube-api-access-kmtc5" (OuterVolumeSpecName: "kube-api-access-kmtc5") pod "3ff86037-1b41-493f-a736-6aad64997107" (UID: "3ff86037-1b41-493f-a736-6aad64997107"). InnerVolumeSpecName "kube-api-access-kmtc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.098212 5033 scope.go:117] "RemoveContainer" containerID="b669bebbaf640585fafebf67f50b75b8c1dfcea7e1a692f406c4975b962eee88" Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.119971 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ff86037-1b41-493f-a736-6aad64997107" (UID: "3ff86037-1b41-493f-a736-6aad64997107"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.155070 5033 scope.go:117] "RemoveContainer" containerID="cfe8f2f8ae534bb1141c5559c918f90ab6ca6931a70b3ef9728d51dda7582956"
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.180424 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.180455 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ff86037-1b41-493f-a736-6aad64997107-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.180465 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmtc5\" (UniqueName: \"kubernetes.io/projected/3ff86037-1b41-493f-a736-6aad64997107-kube-api-access-kmtc5\") on node \"crc\" DevicePath \"\""
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.200582 5033 scope.go:117] "RemoveContainer" containerID="f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787"
Feb 26 17:56:35 crc kubenswrapper[5033]: E0226 17:56:35.200959 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787\": container with ID starting with f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787 not found: ID does not exist" containerID="f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787"
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.200996 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787"} err="failed to get container status \"f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787\": rpc error: code = NotFound desc = could not find container \"f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787\": container with ID starting with f9d604d80693f53bb327e1b6b3c5fdc8a184294857f2f9fef5fd5b533a2e3787 not found: ID does not exist"
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.201016 5033 scope.go:117] "RemoveContainer" containerID="b669bebbaf640585fafebf67f50b75b8c1dfcea7e1a692f406c4975b962eee88"
Feb 26 17:56:35 crc kubenswrapper[5033]: E0226 17:56:35.201328 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b669bebbaf640585fafebf67f50b75b8c1dfcea7e1a692f406c4975b962eee88\": container with ID starting with b669bebbaf640585fafebf67f50b75b8c1dfcea7e1a692f406c4975b962eee88 not found: ID does not exist" containerID="b669bebbaf640585fafebf67f50b75b8c1dfcea7e1a692f406c4975b962eee88"
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.201362 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b669bebbaf640585fafebf67f50b75b8c1dfcea7e1a692f406c4975b962eee88"} err="failed to get container status \"b669bebbaf640585fafebf67f50b75b8c1dfcea7e1a692f406c4975b962eee88\": rpc error: code = NotFound desc = could not find container \"b669bebbaf640585fafebf67f50b75b8c1dfcea7e1a692f406c4975b962eee88\": container with ID starting with b669bebbaf640585fafebf67f50b75b8c1dfcea7e1a692f406c4975b962eee88 not found: ID does not exist"
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.201384 5033 scope.go:117] "RemoveContainer" containerID="cfe8f2f8ae534bb1141c5559c918f90ab6ca6931a70b3ef9728d51dda7582956"
Feb 26 17:56:35 crc kubenswrapper[5033]: E0226 17:56:35.201582 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe8f2f8ae534bb1141c5559c918f90ab6ca6931a70b3ef9728d51dda7582956\": container with ID starting with cfe8f2f8ae534bb1141c5559c918f90ab6ca6931a70b3ef9728d51dda7582956 not found: ID does not exist" containerID="cfe8f2f8ae534bb1141c5559c918f90ab6ca6931a70b3ef9728d51dda7582956"
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.201613 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe8f2f8ae534bb1141c5559c918f90ab6ca6931a70b3ef9728d51dda7582956"} err="failed to get container status \"cfe8f2f8ae534bb1141c5559c918f90ab6ca6931a70b3ef9728d51dda7582956\": rpc error: code = NotFound desc = could not find container \"cfe8f2f8ae534bb1141c5559c918f90ab6ca6931a70b3ef9728d51dda7582956\": container with ID starting with cfe8f2f8ae534bb1141c5559c918f90ab6ca6931a70b3ef9728d51dda7582956 not found: ID does not exist"
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.425383 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jjt7"]
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.435268 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jjt7"]
Feb 26 17:56:35 crc kubenswrapper[5033]: I0226 17:56:35.641015 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff86037-1b41-493f-a736-6aad64997107" path="/var/lib/kubelet/pods/3ff86037-1b41-493f-a736-6aad64997107/volumes"
Feb 26 17:56:42 crc kubenswrapper[5033]: I0226 17:56:42.867705 5033 scope.go:117] "RemoveContainer" containerID="91e4c51b2a754d1b5be6dfc3d123990510787195c0eba456ce4eb535d56b9952"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.344838 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5w9lf"]
Feb 26 17:57:48 crc kubenswrapper[5033]: E0226 17:57:48.345977 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff86037-1b41-493f-a736-6aad64997107" containerName="registry-server"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.345995 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff86037-1b41-493f-a736-6aad64997107" containerName="registry-server"
Feb 26 17:57:48 crc kubenswrapper[5033]: E0226 17:57:48.346038 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff86037-1b41-493f-a736-6aad64997107" containerName="extract-utilities"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.346048 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff86037-1b41-493f-a736-6aad64997107" containerName="extract-utilities"
Feb 26 17:57:48 crc kubenswrapper[5033]: E0226 17:57:48.346086 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff86037-1b41-493f-a736-6aad64997107" containerName="extract-content"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.346095 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff86037-1b41-493f-a736-6aad64997107" containerName="extract-content"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.346392 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff86037-1b41-493f-a736-6aad64997107" containerName="registry-server"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.348228 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.376817 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-catalog-content\") pod \"community-operators-5w9lf\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") " pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.376994 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt2ts\" (UniqueName: \"kubernetes.io/projected/92492fc1-efb9-400c-b8da-d8d02e78ceb6-kube-api-access-gt2ts\") pod \"community-operators-5w9lf\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") " pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.377043 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-utilities\") pod \"community-operators-5w9lf\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") " pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.385116 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5w9lf"]
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.479636 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-catalog-content\") pod \"community-operators-5w9lf\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") " pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.479773 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt2ts\" (UniqueName: \"kubernetes.io/projected/92492fc1-efb9-400c-b8da-d8d02e78ceb6-kube-api-access-gt2ts\") pod \"community-operators-5w9lf\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") " pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.479811 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-utilities\") pod \"community-operators-5w9lf\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") " pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.480432 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-catalog-content\") pod \"community-operators-5w9lf\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") " pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.480718 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-utilities\") pod \"community-operators-5w9lf\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") " pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.507981 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt2ts\" (UniqueName: \"kubernetes.io/projected/92492fc1-efb9-400c-b8da-d8d02e78ceb6-kube-api-access-gt2ts\") pod \"community-operators-5w9lf\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") " pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:48 crc kubenswrapper[5033]: I0226 17:57:48.669420 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:49 crc kubenswrapper[5033]: I0226 17:57:49.230531 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5w9lf"]
Feb 26 17:57:49 crc kubenswrapper[5033]: I0226 17:57:49.920524 5033 generic.go:334] "Generic (PLEG): container finished" podID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" containerID="e0c1e92ec7fdcc88a6f57b6fab31cfb5f5eb186675737c6931c371de514b6982" exitCode=0
Feb 26 17:57:49 crc kubenswrapper[5033]: I0226 17:57:49.920782 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5w9lf" event={"ID":"92492fc1-efb9-400c-b8da-d8d02e78ceb6","Type":"ContainerDied","Data":"e0c1e92ec7fdcc88a6f57b6fab31cfb5f5eb186675737c6931c371de514b6982"}
Feb 26 17:57:49 crc kubenswrapper[5033]: I0226 17:57:49.920814 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5w9lf" event={"ID":"92492fc1-efb9-400c-b8da-d8d02e78ceb6","Type":"ContainerStarted","Data":"f8c6774ef3dc5805771a307be21eba7bc2ef3cb1097f11e8248ba55b43625d7f"}
Feb 26 17:57:51 crc kubenswrapper[5033]: I0226 17:57:51.945320 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5w9lf" event={"ID":"92492fc1-efb9-400c-b8da-d8d02e78ceb6","Type":"ContainerStarted","Data":"9d482b247a6347730af9ee83b7888c89bae796aa2f40c3f39761f638a9b03310"}
Feb 26 17:57:54 crc kubenswrapper[5033]: I0226 17:57:54.974869 5033 generic.go:334] "Generic (PLEG): container finished" podID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" containerID="9d482b247a6347730af9ee83b7888c89bae796aa2f40c3f39761f638a9b03310" exitCode=0
Feb 26 17:57:54 crc kubenswrapper[5033]: I0226 17:57:54.974973 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5w9lf" event={"ID":"92492fc1-efb9-400c-b8da-d8d02e78ceb6","Type":"ContainerDied","Data":"9d482b247a6347730af9ee83b7888c89bae796aa2f40c3f39761f638a9b03310"}
Feb 26 17:57:56 crc kubenswrapper[5033]: I0226 17:57:56.997049 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5w9lf" event={"ID":"92492fc1-efb9-400c-b8da-d8d02e78ceb6","Type":"ContainerStarted","Data":"ed4f70a57c22a9932a7baaf48c495b648c8684c17a7eaa074712df6b54ad373c"}
Feb 26 17:57:57 crc kubenswrapper[5033]: I0226 17:57:57.017675 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5w9lf" podStartSLOduration=3.014737525 podStartE2EDuration="9.017658696s" podCreationTimestamp="2026-02-26 17:57:48 +0000 UTC" firstStartedPulling="2026-02-26 17:57:49.922829792 +0000 UTC m=+8197.135428442" lastFinishedPulling="2026-02-26 17:57:55.925750963 +0000 UTC m=+8203.138349613" observedRunningTime="2026-02-26 17:57:57.01285785 +0000 UTC m=+8204.225456490" watchObservedRunningTime="2026-02-26 17:57:57.017658696 +0000 UTC m=+8204.230257346"
Feb 26 17:57:58 crc kubenswrapper[5033]: I0226 17:57:58.669867 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:58 crc kubenswrapper[5033]: I0226 17:57:58.670208 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:57:58 crc kubenswrapper[5033]: I0226 17:57:58.720515 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:58:00 crc kubenswrapper[5033]: I0226 17:58:00.149866 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535478-4trzm"]
Feb 26 17:58:00 crc kubenswrapper[5033]: I0226 17:58:00.152722 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535478-4trzm"
Feb 26 17:58:00 crc kubenswrapper[5033]: I0226 17:58:00.155537 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 17:58:00 crc kubenswrapper[5033]: I0226 17:58:00.155948 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 17:58:00 crc kubenswrapper[5033]: I0226 17:58:00.156452 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 17:58:00 crc kubenswrapper[5033]: I0226 17:58:00.164759 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535478-4trzm"]
Feb 26 17:58:00 crc kubenswrapper[5033]: I0226 17:58:00.315233 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ngt6\" (UniqueName: \"kubernetes.io/projected/455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0-kube-api-access-6ngt6\") pod \"auto-csr-approver-29535478-4trzm\" (UID: \"455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0\") " pod="openshift-infra/auto-csr-approver-29535478-4trzm"
Feb 26 17:58:00 crc kubenswrapper[5033]: I0226 17:58:00.418663 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ngt6\" (UniqueName: \"kubernetes.io/projected/455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0-kube-api-access-6ngt6\") pod \"auto-csr-approver-29535478-4trzm\" (UID: \"455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0\") " pod="openshift-infra/auto-csr-approver-29535478-4trzm"
Feb 26 17:58:00 crc kubenswrapper[5033]: I0226 17:58:00.442290 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ngt6\" (UniqueName: \"kubernetes.io/projected/455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0-kube-api-access-6ngt6\") pod \"auto-csr-approver-29535478-4trzm\" (UID: \"455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0\") " pod="openshift-infra/auto-csr-approver-29535478-4trzm"
Feb 26 17:58:00 crc kubenswrapper[5033]: I0226 17:58:00.510182 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535478-4trzm"
Feb 26 17:58:01 crc kubenswrapper[5033]: I0226 17:58:01.018616 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535478-4trzm"]
Feb 26 17:58:01 crc kubenswrapper[5033]: I0226 17:58:01.077392 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535478-4trzm" event={"ID":"455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0","Type":"ContainerStarted","Data":"7aa7ef8bf847c48a119919a5cdf7eb05a8ff41da5907df954919ce90c6322bcf"}
Feb 26 17:58:03 crc kubenswrapper[5033]: I0226 17:58:03.099417 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535478-4trzm" event={"ID":"455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0","Type":"ContainerStarted","Data":"c77d6edeaaa2618f97aaf6742ce48adaade5220adf6f02a639819c2d7d1f4b8b"}
Feb 26 17:58:03 crc kubenswrapper[5033]: I0226 17:58:03.117493 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535478-4trzm" podStartSLOduration=1.8706524230000001 podStartE2EDuration="3.117478081s" podCreationTimestamp="2026-02-26 17:58:00 +0000 UTC" firstStartedPulling="2026-02-26 17:58:01.051642244 +0000 UTC m=+8208.264240894" lastFinishedPulling="2026-02-26 17:58:02.298467902 +0000 UTC m=+8209.511066552" observedRunningTime="2026-02-26 17:58:03.113648857 +0000 UTC m=+8210.326247507" watchObservedRunningTime="2026-02-26 17:58:03.117478081 +0000 UTC m=+8210.330076721"
Feb 26 17:58:04 crc kubenswrapper[5033]: I0226 17:58:04.109635 5033 generic.go:334] "Generic (PLEG): container finished" podID="455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0" containerID="c77d6edeaaa2618f97aaf6742ce48adaade5220adf6f02a639819c2d7d1f4b8b" exitCode=0
Feb 26 17:58:04 crc kubenswrapper[5033]: I0226 17:58:04.109755 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535478-4trzm" event={"ID":"455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0","Type":"ContainerDied","Data":"c77d6edeaaa2618f97aaf6742ce48adaade5220adf6f02a639819c2d7d1f4b8b"}
Feb 26 17:58:05 crc kubenswrapper[5033]: I0226 17:58:05.589144 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535478-4trzm"
Feb 26 17:58:05 crc kubenswrapper[5033]: I0226 17:58:05.735976 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ngt6\" (UniqueName: \"kubernetes.io/projected/455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0-kube-api-access-6ngt6\") pod \"455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0\" (UID: \"455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0\") "
Feb 26 17:58:05 crc kubenswrapper[5033]: I0226 17:58:05.743427 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0-kube-api-access-6ngt6" (OuterVolumeSpecName: "kube-api-access-6ngt6") pod "455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0" (UID: "455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0"). InnerVolumeSpecName "kube-api-access-6ngt6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:58:05 crc kubenswrapper[5033]: I0226 17:58:05.839187 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ngt6\" (UniqueName: \"kubernetes.io/projected/455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0-kube-api-access-6ngt6\") on node \"crc\" DevicePath \"\""
Feb 26 17:58:06 crc kubenswrapper[5033]: I0226 17:58:06.149677 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535478-4trzm" event={"ID":"455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0","Type":"ContainerDied","Data":"7aa7ef8bf847c48a119919a5cdf7eb05a8ff41da5907df954919ce90c6322bcf"}
Feb 26 17:58:06 crc kubenswrapper[5033]: I0226 17:58:06.149725 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aa7ef8bf847c48a119919a5cdf7eb05a8ff41da5907df954919ce90c6322bcf"
Feb 26 17:58:06 crc kubenswrapper[5033]: I0226 17:58:06.149806 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535478-4trzm"
Feb 26 17:58:06 crc kubenswrapper[5033]: I0226 17:58:06.218918 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535472-h5jfw"]
Feb 26 17:58:06 crc kubenswrapper[5033]: I0226 17:58:06.236211 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535472-h5jfw"]
Feb 26 17:58:07 crc kubenswrapper[5033]: I0226 17:58:07.648336 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f69ca6-7893-4a70-ab5a-f8e993359d0d" path="/var/lib/kubelet/pods/93f69ca6-7893-4a70-ab5a-f8e993359d0d/volumes"
Feb 26 17:58:08 crc kubenswrapper[5033]: I0226 17:58:08.721060 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:58:08 crc kubenswrapper[5033]: I0226 17:58:08.778724 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5w9lf"]
Feb 26 17:58:09 crc kubenswrapper[5033]: I0226 17:58:09.183487 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5w9lf" podUID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" containerName="registry-server" containerID="cri-o://ed4f70a57c22a9932a7baaf48c495b648c8684c17a7eaa074712df6b54ad373c" gracePeriod=2
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.194846 5033 generic.go:334] "Generic (PLEG): container finished" podID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" containerID="ed4f70a57c22a9932a7baaf48c495b648c8684c17a7eaa074712df6b54ad373c" exitCode=0
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.195331 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5w9lf" event={"ID":"92492fc1-efb9-400c-b8da-d8d02e78ceb6","Type":"ContainerDied","Data":"ed4f70a57c22a9932a7baaf48c495b648c8684c17a7eaa074712df6b54ad373c"}
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.368031 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.484642 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-catalog-content\") pod \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") "
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.484715 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt2ts\" (UniqueName: \"kubernetes.io/projected/92492fc1-efb9-400c-b8da-d8d02e78ceb6-kube-api-access-gt2ts\") pod \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") "
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.484756 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-utilities\") pod \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\" (UID: \"92492fc1-efb9-400c-b8da-d8d02e78ceb6\") "
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.486507 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-utilities" (OuterVolumeSpecName: "utilities") pod "92492fc1-efb9-400c-b8da-d8d02e78ceb6" (UID: "92492fc1-efb9-400c-b8da-d8d02e78ceb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.501641 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92492fc1-efb9-400c-b8da-d8d02e78ceb6-kube-api-access-gt2ts" (OuterVolumeSpecName: "kube-api-access-gt2ts") pod "92492fc1-efb9-400c-b8da-d8d02e78ceb6" (UID: "92492fc1-efb9-400c-b8da-d8d02e78ceb6"). InnerVolumeSpecName "kube-api-access-gt2ts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.588407 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt2ts\" (UniqueName: \"kubernetes.io/projected/92492fc1-efb9-400c-b8da-d8d02e78ceb6-kube-api-access-gt2ts\") on node \"crc\" DevicePath \"\""
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.588452 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.748880 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92492fc1-efb9-400c-b8da-d8d02e78ceb6" (UID: "92492fc1-efb9-400c-b8da-d8d02e78ceb6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 17:58:10 crc kubenswrapper[5033]: I0226 17:58:10.792315 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92492fc1-efb9-400c-b8da-d8d02e78ceb6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 17:58:11 crc kubenswrapper[5033]: I0226 17:58:11.212546 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5w9lf" event={"ID":"92492fc1-efb9-400c-b8da-d8d02e78ceb6","Type":"ContainerDied","Data":"f8c6774ef3dc5805771a307be21eba7bc2ef3cb1097f11e8248ba55b43625d7f"}
Feb 26 17:58:11 crc kubenswrapper[5033]: I0226 17:58:11.212614 5033 scope.go:117] "RemoveContainer" containerID="ed4f70a57c22a9932a7baaf48c495b648c8684c17a7eaa074712df6b54ad373c"
Feb 26 17:58:11 crc kubenswrapper[5033]: I0226 17:58:11.213629 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5w9lf"
Feb 26 17:58:11 crc kubenswrapper[5033]: I0226 17:58:11.264815 5033 scope.go:117] "RemoveContainer" containerID="9d482b247a6347730af9ee83b7888c89bae796aa2f40c3f39761f638a9b03310"
Feb 26 17:58:11 crc kubenswrapper[5033]: I0226 17:58:11.269742 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5w9lf"]
Feb 26 17:58:11 crc kubenswrapper[5033]: I0226 17:58:11.282979 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5w9lf"]
Feb 26 17:58:11 crc kubenswrapper[5033]: I0226 17:58:11.290685 5033 scope.go:117] "RemoveContainer" containerID="e0c1e92ec7fdcc88a6f57b6fab31cfb5f5eb186675737c6931c371de514b6982"
Feb 26 17:58:11 crc kubenswrapper[5033]: I0226 17:58:11.649887 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" path="/var/lib/kubelet/pods/92492fc1-efb9-400c-b8da-d8d02e78ceb6/volumes"
Feb 26 17:58:17 crc kubenswrapper[5033]: I0226 17:58:17.279559 5033 generic.go:334] "Generic (PLEG): container finished" podID="8aa037b5-14a7-41e1-81cf-ffb06e380167" containerID="87089f3bdf4a3e5160f5c4d843a921358a74db2b8f331d03a4c324d4a16be3bd" exitCode=0
Feb 26 17:58:17 crc kubenswrapper[5033]: I0226 17:58:17.279666 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-7ncz9" event={"ID":"8aa037b5-14a7-41e1-81cf-ffb06e380167","Type":"ContainerDied","Data":"87089f3bdf4a3e5160f5c4d843a921358a74db2b8f331d03a4c324d4a16be3bd"}
Feb 26 17:58:18 crc kubenswrapper[5033]: I0226 17:58:18.799320 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:58:18 crc kubenswrapper[5033]: I0226 17:58:18.970071 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ceph\") pod \"8aa037b5-14a7-41e1-81cf-ffb06e380167\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") "
Feb 26 17:58:18 crc kubenswrapper[5033]: I0226 17:58:18.970168 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ssh-key-openstack-cell1\") pod \"8aa037b5-14a7-41e1-81cf-ffb06e380167\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") "
Feb 26 17:58:18 crc kubenswrapper[5033]: I0226 17:58:18.970230 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-combined-ca-bundle\") pod \"8aa037b5-14a7-41e1-81cf-ffb06e380167\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") "
Feb 26 17:58:18 crc kubenswrapper[5033]: I0226 17:58:18.970396 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-inventory\") pod \"8aa037b5-14a7-41e1-81cf-ffb06e380167\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") "
Feb 26 17:58:18 crc kubenswrapper[5033]: I0226 17:58:18.970555 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-secret-0\") pod \"8aa037b5-14a7-41e1-81cf-ffb06e380167\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") "
Feb 26 17:58:18 crc kubenswrapper[5033]: I0226 17:58:18.970621 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gng9m\" (UniqueName: \"kubernetes.io/projected/8aa037b5-14a7-41e1-81cf-ffb06e380167-kube-api-access-gng9m\") pod \"8aa037b5-14a7-41e1-81cf-ffb06e380167\" (UID: \"8aa037b5-14a7-41e1-81cf-ffb06e380167\") "
Feb 26 17:58:18 crc kubenswrapper[5033]: I0226 17:58:18.981401 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ceph" (OuterVolumeSpecName: "ceph") pod "8aa037b5-14a7-41e1-81cf-ffb06e380167" (UID: "8aa037b5-14a7-41e1-81cf-ffb06e380167"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:58:18 crc kubenswrapper[5033]: I0226 17:58:18.981463 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8aa037b5-14a7-41e1-81cf-ffb06e380167" (UID: "8aa037b5-14a7-41e1-81cf-ffb06e380167"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:58:18 crc kubenswrapper[5033]: I0226 17:58:18.981480 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa037b5-14a7-41e1-81cf-ffb06e380167-kube-api-access-gng9m" (OuterVolumeSpecName: "kube-api-access-gng9m") pod "8aa037b5-14a7-41e1-81cf-ffb06e380167" (UID: "8aa037b5-14a7-41e1-81cf-ffb06e380167"). InnerVolumeSpecName "kube-api-access-gng9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.015315 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "8aa037b5-14a7-41e1-81cf-ffb06e380167" (UID: "8aa037b5-14a7-41e1-81cf-ffb06e380167"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.030080 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-inventory" (OuterVolumeSpecName: "inventory") pod "8aa037b5-14a7-41e1-81cf-ffb06e380167" (UID: "8aa037b5-14a7-41e1-81cf-ffb06e380167"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.033318 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8aa037b5-14a7-41e1-81cf-ffb06e380167" (UID: "8aa037b5-14a7-41e1-81cf-ffb06e380167"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.073425 5033 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.073472 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.073490 5033 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.073507 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gng9m\" (UniqueName: \"kubernetes.io/projected/8aa037b5-14a7-41e1-81cf-ffb06e380167-kube-api-access-gng9m\") on node \"crc\" DevicePath \"\""
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.073523 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ceph\") on node \"crc\" DevicePath \"\""
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.073540 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8aa037b5-14a7-41e1-81cf-ffb06e380167-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.320573 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-7ncz9" event={"ID":"8aa037b5-14a7-41e1-81cf-ffb06e380167","Type":"ContainerDied","Data":"a449906116b8b58f986cde063e049cfec781ed891b4e3cd3012cf1542cd890cc"}
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.320622 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a449906116b8b58f986cde063e049cfec781ed891b4e3cd3012cf1542cd890cc"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.320684 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-7ncz9"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.457724 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-vvrps"]
Feb 26 17:58:19 crc kubenswrapper[5033]: E0226 17:58:19.458235 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" containerName="registry-server"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.458325 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" containerName="registry-server"
Feb 26 17:58:19 crc kubenswrapper[5033]: E0226 17:58:19.458346 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0" containerName="oc"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.458355 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0" containerName="oc"
Feb 26 17:58:19 crc kubenswrapper[5033]: E0226 17:58:19.458371 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" containerName="extract-content"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.458380 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" containerName="extract-content"
Feb 26 17:58:19 crc kubenswrapper[5033]: E0226 17:58:19.458435 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" containerName="extract-utilities"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.458445 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" containerName="extract-utilities"
Feb 26 17:58:19 crc kubenswrapper[5033]: E0226 17:58:19.458459 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa037b5-14a7-41e1-81cf-ffb06e380167" containerName="libvirt-openstack-openstack-cell1"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.458467 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa037b5-14a7-41e1-81cf-ffb06e380167" containerName="libvirt-openstack-openstack-cell1"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.458740 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0" containerName="oc"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.458764 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="92492fc1-efb9-400c-b8da-d8d02e78ceb6" containerName="registry-server"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.458844 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aa037b5-14a7-41e1-81cf-ffb06e380167" containerName="libvirt-openstack-openstack-cell1"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.459798 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.462137 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.462406 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.462487 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.462904 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.463034 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.463046 5033 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.463294 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.549066 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-vvrps"] Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588489 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588566 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588603 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588628 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588645 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588687 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-inventory\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588712 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588743 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: 
\"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588774 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skfjd\" (UniqueName: \"kubernetes.io/projected/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-kube-api-access-skfjd\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588792 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588816 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ceph\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588859 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.588880 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.690992 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691056 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skfjd\" (UniqueName: \"kubernetes.io/projected/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-kube-api-access-skfjd\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691081 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691105 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ceph\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: 
\"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691162 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691187 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691222 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691285 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691336 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691367 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691393 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691444 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-inventory\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.691468 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.692338 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.693023 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.696666 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-inventory\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.696915 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ceph\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.699323 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-0\") pod 
\"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.699949 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.700197 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.700766 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.701000 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.701725 5033 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.703138 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.706373 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.718913 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skfjd\" (UniqueName: \"kubernetes.io/projected/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-kube-api-access-skfjd\") pod \"nova-cell1-openstack-openstack-cell1-vvrps\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:19 crc kubenswrapper[5033]: I0226 17:58:19.832942 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" Feb 26 17:58:20 crc kubenswrapper[5033]: I0226 17:58:20.401948 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-vvrps"] Feb 26 17:58:21 crc kubenswrapper[5033]: I0226 17:58:21.348116 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" event={"ID":"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7","Type":"ContainerStarted","Data":"f226048fbb540db6adf6258d94a9342c8fc60f6fafc70be521c760536271081a"} Feb 26 17:58:22 crc kubenswrapper[5033]: I0226 17:58:22.363191 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" event={"ID":"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7","Type":"ContainerStarted","Data":"9483ab8cc041d2433a4d176ab51a73412cefef99cba20252170d8321933c0d04"} Feb 26 17:58:43 crc kubenswrapper[5033]: I0226 17:58:43.035699 5033 scope.go:117] "RemoveContainer" containerID="f075004e8c1050e3027740457110c52f4e51399e687aa029858b370706aef397" Feb 26 17:58:50 crc kubenswrapper[5033]: I0226 17:58:50.188777 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:58:50 crc kubenswrapper[5033]: I0226 17:58:50.189624 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:59:20 crc kubenswrapper[5033]: I0226 17:59:20.189443 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:59:20 crc kubenswrapper[5033]: I0226 17:59:20.189889 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.305933 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" podStartSLOduration=87.597755738 podStartE2EDuration="1m28.305910166s" podCreationTimestamp="2026-02-26 17:58:19 +0000 UTC" firstStartedPulling="2026-02-26 17:58:20.417072865 +0000 UTC m=+8227.629671525" lastFinishedPulling="2026-02-26 17:58:21.125227273 +0000 UTC m=+8228.337825953" observedRunningTime="2026-02-26 17:58:22.389909317 +0000 UTC m=+8229.602507987" watchObservedRunningTime="2026-02-26 17:59:47.305910166 +0000 UTC m=+8314.518508826" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.307935 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zpv5d"] Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.310159 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.321324 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zpv5d"] Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.410074 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-utilities\") pod \"certified-operators-zpv5d\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.410370 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-catalog-content\") pod \"certified-operators-zpv5d\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.410436 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g699p\" (UniqueName: \"kubernetes.io/projected/7baf4915-d4d1-4207-b947-2a12fd4248dd-kube-api-access-g699p\") pod \"certified-operators-zpv5d\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.512916 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-utilities\") pod \"certified-operators-zpv5d\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.513000 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-catalog-content\") pod \"certified-operators-zpv5d\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.513023 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g699p\" (UniqueName: \"kubernetes.io/projected/7baf4915-d4d1-4207-b947-2a12fd4248dd-kube-api-access-g699p\") pod \"certified-operators-zpv5d\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.513553 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-catalog-content\") pod \"certified-operators-zpv5d\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.513569 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-utilities\") pod \"certified-operators-zpv5d\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.533583 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g699p\" (UniqueName: \"kubernetes.io/projected/7baf4915-d4d1-4207-b947-2a12fd4248dd-kube-api-access-g699p\") pod \"certified-operators-zpv5d\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:47 crc kubenswrapper[5033]: I0226 17:59:47.653396 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:48 crc kubenswrapper[5033]: I0226 17:59:48.253632 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zpv5d"] Feb 26 17:59:48 crc kubenswrapper[5033]: I0226 17:59:48.684404 5033 generic.go:334] "Generic (PLEG): container finished" podID="7baf4915-d4d1-4207-b947-2a12fd4248dd" containerID="2f9700e2370af62891b708616e2ca94acd8b5b4aefe2a3df60a661d19aa112be" exitCode=0 Feb 26 17:59:48 crc kubenswrapper[5033]: I0226 17:59:48.684460 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpv5d" event={"ID":"7baf4915-d4d1-4207-b947-2a12fd4248dd","Type":"ContainerDied","Data":"2f9700e2370af62891b708616e2ca94acd8b5b4aefe2a3df60a661d19aa112be"} Feb 26 17:59:48 crc kubenswrapper[5033]: I0226 17:59:48.684525 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpv5d" event={"ID":"7baf4915-d4d1-4207-b947-2a12fd4248dd","Type":"ContainerStarted","Data":"11486d77db12d48b67e008513303fe2a64003ba43b413fd11abb56af7a2de5a2"} Feb 26 17:59:49 crc kubenswrapper[5033]: I0226 17:59:49.695928 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpv5d" event={"ID":"7baf4915-d4d1-4207-b947-2a12fd4248dd","Type":"ContainerStarted","Data":"2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7"} Feb 26 17:59:50 crc kubenswrapper[5033]: I0226 17:59:50.188761 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 17:59:50 crc kubenswrapper[5033]: I0226 17:59:50.189112 5033 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 17:59:50 crc kubenswrapper[5033]: I0226 17:59:50.189161 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 17:59:50 crc kubenswrapper[5033]: I0226 17:59:50.190006 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"06a7470efd2f33b06881edc40853b2fd654201f80a9674fabd38b2904cbfea8e"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 17:59:50 crc kubenswrapper[5033]: I0226 17:59:50.190142 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://06a7470efd2f33b06881edc40853b2fd654201f80a9674fabd38b2904cbfea8e" gracePeriod=600 Feb 26 17:59:50 crc kubenswrapper[5033]: I0226 17:59:50.708794 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="06a7470efd2f33b06881edc40853b2fd654201f80a9674fabd38b2904cbfea8e" exitCode=0 Feb 26 17:59:50 crc kubenswrapper[5033]: I0226 17:59:50.708912 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"06a7470efd2f33b06881edc40853b2fd654201f80a9674fabd38b2904cbfea8e"} Feb 26 17:59:50 crc kubenswrapper[5033]: I0226 17:59:50.709081 5033 scope.go:117] "RemoveContainer" 
containerID="135986cfa7a8f6edd9059b726f20a68025679100c0e047ec48077fdcb583e644" Feb 26 17:59:50 crc kubenswrapper[5033]: I0226 17:59:50.713017 5033 generic.go:334] "Generic (PLEG): container finished" podID="7baf4915-d4d1-4207-b947-2a12fd4248dd" containerID="2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7" exitCode=0 Feb 26 17:59:50 crc kubenswrapper[5033]: I0226 17:59:50.713059 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpv5d" event={"ID":"7baf4915-d4d1-4207-b947-2a12fd4248dd","Type":"ContainerDied","Data":"2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7"} Feb 26 17:59:51 crc kubenswrapper[5033]: I0226 17:59:51.723569 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"} Feb 26 17:59:51 crc kubenswrapper[5033]: I0226 17:59:51.726521 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpv5d" event={"ID":"7baf4915-d4d1-4207-b947-2a12fd4248dd","Type":"ContainerStarted","Data":"87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6"} Feb 26 17:59:51 crc kubenswrapper[5033]: I0226 17:59:51.755647 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zpv5d" podStartSLOduration=2.231484457 podStartE2EDuration="4.755626997s" podCreationTimestamp="2026-02-26 17:59:47 +0000 UTC" firstStartedPulling="2026-02-26 17:59:48.686677017 +0000 UTC m=+8315.899275667" lastFinishedPulling="2026-02-26 17:59:51.210819557 +0000 UTC m=+8318.423418207" observedRunningTime="2026-02-26 17:59:51.754390537 +0000 UTC m=+8318.966989187" watchObservedRunningTime="2026-02-26 17:59:51.755626997 +0000 UTC m=+8318.968225647" Feb 26 17:59:57 crc kubenswrapper[5033]: I0226 
17:59:57.654100 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:57 crc kubenswrapper[5033]: I0226 17:59:57.654957 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:57 crc kubenswrapper[5033]: I0226 17:59:57.722528 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:57 crc kubenswrapper[5033]: I0226 17:59:57.831166 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 17:59:57 crc kubenswrapper[5033]: I0226 17:59:57.969068 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zpv5d"] Feb 26 17:59:59 crc kubenswrapper[5033]: I0226 17:59:59.808428 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zpv5d" podUID="7baf4915-d4d1-4207-b947-2a12fd4248dd" containerName="registry-server" containerID="cri-o://87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6" gracePeriod=2 Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.211327 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535480-lkv9r"] Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.213493 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535480-lkv9r" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.215124 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.216156 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.216385 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.222049 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt"] Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.224314 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.228705 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.231141 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.238200 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535480-lkv9r"] Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.251720 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt"] Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.323840 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-config-volume\") pod \"collect-profiles-29535480-5zkrt\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.324017 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cgxh\" (UniqueName: \"kubernetes.io/projected/e38a49f6-054d-4810-ba49-48bc9ad5d5fe-kube-api-access-2cgxh\") pod \"auto-csr-approver-29535480-lkv9r\" (UID: \"e38a49f6-054d-4810-ba49-48bc9ad5d5fe\") " pod="openshift-infra/auto-csr-approver-29535480-lkv9r" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.324052 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-secret-volume\") pod \"collect-profiles-29535480-5zkrt\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.324121 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8kl5\" (UniqueName: \"kubernetes.io/projected/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-kube-api-access-d8kl5\") pod \"collect-profiles-29535480-5zkrt\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.402290 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.427116 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cgxh\" (UniqueName: \"kubernetes.io/projected/e38a49f6-054d-4810-ba49-48bc9ad5d5fe-kube-api-access-2cgxh\") pod \"auto-csr-approver-29535480-lkv9r\" (UID: \"e38a49f6-054d-4810-ba49-48bc9ad5d5fe\") " pod="openshift-infra/auto-csr-approver-29535480-lkv9r" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.427414 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-secret-volume\") pod \"collect-profiles-29535480-5zkrt\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.427556 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8kl5\" (UniqueName: \"kubernetes.io/projected/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-kube-api-access-d8kl5\") pod \"collect-profiles-29535480-5zkrt\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.427772 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-config-volume\") pod \"collect-profiles-29535480-5zkrt\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.428863 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-config-volume\") pod \"collect-profiles-29535480-5zkrt\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.435308 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-secret-volume\") pod \"collect-profiles-29535480-5zkrt\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.446889 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cgxh\" (UniqueName: \"kubernetes.io/projected/e38a49f6-054d-4810-ba49-48bc9ad5d5fe-kube-api-access-2cgxh\") pod \"auto-csr-approver-29535480-lkv9r\" (UID: \"e38a49f6-054d-4810-ba49-48bc9ad5d5fe\") " pod="openshift-infra/auto-csr-approver-29535480-lkv9r" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.448112 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8kl5\" (UniqueName: \"kubernetes.io/projected/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-kube-api-access-d8kl5\") pod \"collect-profiles-29535480-5zkrt\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.529510 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-catalog-content\") pod \"7baf4915-d4d1-4207-b947-2a12fd4248dd\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.529661 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-g699p\" (UniqueName: \"kubernetes.io/projected/7baf4915-d4d1-4207-b947-2a12fd4248dd-kube-api-access-g699p\") pod \"7baf4915-d4d1-4207-b947-2a12fd4248dd\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.529709 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-utilities\") pod \"7baf4915-d4d1-4207-b947-2a12fd4248dd\" (UID: \"7baf4915-d4d1-4207-b947-2a12fd4248dd\") " Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.531439 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-utilities" (OuterVolumeSpecName: "utilities") pod "7baf4915-d4d1-4207-b947-2a12fd4248dd" (UID: "7baf4915-d4d1-4207-b947-2a12fd4248dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.545113 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7baf4915-d4d1-4207-b947-2a12fd4248dd-kube-api-access-g699p" (OuterVolumeSpecName: "kube-api-access-g699p") pod "7baf4915-d4d1-4207-b947-2a12fd4248dd" (UID: "7baf4915-d4d1-4207-b947-2a12fd4248dd"). InnerVolumeSpecName "kube-api-access-g699p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.551978 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535480-lkv9r" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.558110 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.631936 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g699p\" (UniqueName: \"kubernetes.io/projected/7baf4915-d4d1-4207-b947-2a12fd4248dd-kube-api-access-g699p\") on node \"crc\" DevicePath \"\"" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.631972 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.821726 5033 generic.go:334] "Generic (PLEG): container finished" podID="7baf4915-d4d1-4207-b947-2a12fd4248dd" containerID="87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6" exitCode=0 Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.821793 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zpv5d" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.821800 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpv5d" event={"ID":"7baf4915-d4d1-4207-b947-2a12fd4248dd","Type":"ContainerDied","Data":"87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6"} Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.822184 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpv5d" event={"ID":"7baf4915-d4d1-4207-b947-2a12fd4248dd","Type":"ContainerDied","Data":"11486d77db12d48b67e008513303fe2a64003ba43b413fd11abb56af7a2de5a2"} Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.822204 5033 scope.go:117] "RemoveContainer" containerID="87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.847070 5033 scope.go:117] "RemoveContainer" containerID="2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.866686 5033 scope.go:117] "RemoveContainer" containerID="2f9700e2370af62891b708616e2ca94acd8b5b4aefe2a3df60a661d19aa112be" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.901562 5033 scope.go:117] "RemoveContainer" containerID="87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6" Feb 26 18:00:00 crc kubenswrapper[5033]: E0226 18:00:00.902029 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6\": container with ID starting with 87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6 not found: ID does not exist" containerID="87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.902066 5033 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6"} err="failed to get container status \"87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6\": rpc error: code = NotFound desc = could not find container \"87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6\": container with ID starting with 87293e1ef562ad32af027854978b1cc64df7dbe7b3ba0d1b8a8dc5bb291a88c6 not found: ID does not exist" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.902095 5033 scope.go:117] "RemoveContainer" containerID="2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7" Feb 26 18:00:00 crc kubenswrapper[5033]: E0226 18:00:00.902412 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7\": container with ID starting with 2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7 not found: ID does not exist" containerID="2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.902452 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7"} err="failed to get container status \"2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7\": rpc error: code = NotFound desc = could not find container \"2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7\": container with ID starting with 2c759cf3f2f15b2c40944ec7c8d055797d37bfdba7e8053be6b23e8a145639f7 not found: ID does not exist" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.902478 5033 scope.go:117] "RemoveContainer" containerID="2f9700e2370af62891b708616e2ca94acd8b5b4aefe2a3df60a661d19aa112be" Feb 26 18:00:00 crc kubenswrapper[5033]: E0226 18:00:00.902736 5033 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f9700e2370af62891b708616e2ca94acd8b5b4aefe2a3df60a661d19aa112be\": container with ID starting with 2f9700e2370af62891b708616e2ca94acd8b5b4aefe2a3df60a661d19aa112be not found: ID does not exist" containerID="2f9700e2370af62891b708616e2ca94acd8b5b4aefe2a3df60a661d19aa112be" Feb 26 18:00:00 crc kubenswrapper[5033]: I0226 18:00:00.902764 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f9700e2370af62891b708616e2ca94acd8b5b4aefe2a3df60a661d19aa112be"} err="failed to get container status \"2f9700e2370af62891b708616e2ca94acd8b5b4aefe2a3df60a661d19aa112be\": rpc error: code = NotFound desc = could not find container \"2f9700e2370af62891b708616e2ca94acd8b5b4aefe2a3df60a661d19aa112be\": container with ID starting with 2f9700e2370af62891b708616e2ca94acd8b5b4aefe2a3df60a661d19aa112be not found: ID does not exist" Feb 26 18:00:01 crc kubenswrapper[5033]: I0226 18:00:01.059277 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535480-lkv9r"] Feb 26 18:00:01 crc kubenswrapper[5033]: I0226 18:00:01.071248 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt"] Feb 26 18:00:01 crc kubenswrapper[5033]: I0226 18:00:01.148732 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7baf4915-d4d1-4207-b947-2a12fd4248dd" (UID: "7baf4915-d4d1-4207-b947-2a12fd4248dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:00:01 crc kubenswrapper[5033]: I0226 18:00:01.246533 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7baf4915-d4d1-4207-b947-2a12fd4248dd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 18:00:01 crc kubenswrapper[5033]: I0226 18:00:01.475599 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zpv5d"] Feb 26 18:00:01 crc kubenswrapper[5033]: I0226 18:00:01.488017 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zpv5d"] Feb 26 18:00:01 crc kubenswrapper[5033]: I0226 18:00:01.757684 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7baf4915-d4d1-4207-b947-2a12fd4248dd" path="/var/lib/kubelet/pods/7baf4915-d4d1-4207-b947-2a12fd4248dd/volumes" Feb 26 18:00:01 crc kubenswrapper[5033]: I0226 18:00:01.842233 5033 generic.go:334] "Generic (PLEG): container finished" podID="caeb061c-3753-4ffd-bc16-90ca9a8ad0f2" containerID="7b0e962fc6cf134f5c2e8e62013254db5d68638d94a45764c91b676c93e64810" exitCode=0 Feb 26 18:00:01 crc kubenswrapper[5033]: I0226 18:00:01.842527 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" event={"ID":"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2","Type":"ContainerDied","Data":"7b0e962fc6cf134f5c2e8e62013254db5d68638d94a45764c91b676c93e64810"} Feb 26 18:00:01 crc kubenswrapper[5033]: I0226 18:00:01.842581 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" event={"ID":"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2","Type":"ContainerStarted","Data":"4efc47b21704c4bac766b96f08127767dc47dc6c143ec40b4754d73643ce576b"} Feb 26 18:00:01 crc kubenswrapper[5033]: I0226 18:00:01.844426 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535480-lkv9r" event={"ID":"e38a49f6-054d-4810-ba49-48bc9ad5d5fe","Type":"ContainerStarted","Data":"500514b081e46f0d7488d604190883b0079086bd879345dffd99438c9f30c390"} Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.299890 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.394347 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8kl5\" (UniqueName: \"kubernetes.io/projected/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-kube-api-access-d8kl5\") pod \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.394704 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-config-volume\") pod \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.394762 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-secret-volume\") pod \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\" (UID: \"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2\") " Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.395248 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-config-volume" (OuterVolumeSpecName: "config-volume") pod "caeb061c-3753-4ffd-bc16-90ca9a8ad0f2" (UID: "caeb061c-3753-4ffd-bc16-90ca9a8ad0f2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.406459 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-kube-api-access-d8kl5" (OuterVolumeSpecName: "kube-api-access-d8kl5") pod "caeb061c-3753-4ffd-bc16-90ca9a8ad0f2" (UID: "caeb061c-3753-4ffd-bc16-90ca9a8ad0f2"). InnerVolumeSpecName "kube-api-access-d8kl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.407734 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "caeb061c-3753-4ffd-bc16-90ca9a8ad0f2" (UID: "caeb061c-3753-4ffd-bc16-90ca9a8ad0f2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.497387 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8kl5\" (UniqueName: \"kubernetes.io/projected/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-kube-api-access-d8kl5\") on node \"crc\" DevicePath \"\"" Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.497441 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.497454 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/caeb061c-3753-4ffd-bc16-90ca9a8ad0f2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.866462 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" 
event={"ID":"caeb061c-3753-4ffd-bc16-90ca9a8ad0f2","Type":"ContainerDied","Data":"4efc47b21704c4bac766b96f08127767dc47dc6c143ec40b4754d73643ce576b"} Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.866727 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4efc47b21704c4bac766b96f08127767dc47dc6c143ec40b4754d73643ce576b" Feb 26 18:00:03 crc kubenswrapper[5033]: I0226 18:00:03.866542 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535480-5zkrt" Feb 26 18:00:04 crc kubenswrapper[5033]: I0226 18:00:04.406890 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66"] Feb 26 18:00:04 crc kubenswrapper[5033]: I0226 18:00:04.415874 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535435-99h66"] Feb 26 18:00:04 crc kubenswrapper[5033]: I0226 18:00:04.877056 5033 generic.go:334] "Generic (PLEG): container finished" podID="e38a49f6-054d-4810-ba49-48bc9ad5d5fe" containerID="debbf59216d9df949748f823c0eeb4fa587c3004ba88c5b40907c7cb1c7ade50" exitCode=0 Feb 26 18:00:04 crc kubenswrapper[5033]: I0226 18:00:04.877143 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535480-lkv9r" event={"ID":"e38a49f6-054d-4810-ba49-48bc9ad5d5fe","Type":"ContainerDied","Data":"debbf59216d9df949748f823c0eeb4fa587c3004ba88c5b40907c7cb1c7ade50"} Feb 26 18:00:05 crc kubenswrapper[5033]: I0226 18:00:05.641093 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24a7aaa-2809-4c7e-8892-18229730d971" path="/var/lib/kubelet/pods/e24a7aaa-2809-4c7e-8892-18229730d971/volumes" Feb 26 18:00:06 crc kubenswrapper[5033]: I0226 18:00:06.275301 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535480-lkv9r" Feb 26 18:00:06 crc kubenswrapper[5033]: I0226 18:00:06.366632 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cgxh\" (UniqueName: \"kubernetes.io/projected/e38a49f6-054d-4810-ba49-48bc9ad5d5fe-kube-api-access-2cgxh\") pod \"e38a49f6-054d-4810-ba49-48bc9ad5d5fe\" (UID: \"e38a49f6-054d-4810-ba49-48bc9ad5d5fe\") " Feb 26 18:00:06 crc kubenswrapper[5033]: I0226 18:00:06.374539 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e38a49f6-054d-4810-ba49-48bc9ad5d5fe-kube-api-access-2cgxh" (OuterVolumeSpecName: "kube-api-access-2cgxh") pod "e38a49f6-054d-4810-ba49-48bc9ad5d5fe" (UID: "e38a49f6-054d-4810-ba49-48bc9ad5d5fe"). InnerVolumeSpecName "kube-api-access-2cgxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:00:06 crc kubenswrapper[5033]: I0226 18:00:06.468756 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cgxh\" (UniqueName: \"kubernetes.io/projected/e38a49f6-054d-4810-ba49-48bc9ad5d5fe-kube-api-access-2cgxh\") on node \"crc\" DevicePath \"\"" Feb 26 18:00:06 crc kubenswrapper[5033]: I0226 18:00:06.902318 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535480-lkv9r" event={"ID":"e38a49f6-054d-4810-ba49-48bc9ad5d5fe","Type":"ContainerDied","Data":"500514b081e46f0d7488d604190883b0079086bd879345dffd99438c9f30c390"} Feb 26 18:00:06 crc kubenswrapper[5033]: I0226 18:00:06.902642 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="500514b081e46f0d7488d604190883b0079086bd879345dffd99438c9f30c390" Feb 26 18:00:06 crc kubenswrapper[5033]: I0226 18:00:06.902400 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535480-lkv9r"
Feb 26 18:00:07 crc kubenswrapper[5033]: I0226 18:00:07.333355 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535474-vxz4v"]
Feb 26 18:00:07 crc kubenswrapper[5033]: I0226 18:00:07.342242 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535474-vxz4v"]
Feb 26 18:00:07 crc kubenswrapper[5033]: I0226 18:00:07.639517 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="875dfb56-69a9-4423-85ba-a88ba5e4f401" path="/var/lib/kubelet/pods/875dfb56-69a9-4423-85ba-a88ba5e4f401/volumes"
Feb 26 18:00:43 crc kubenswrapper[5033]: I0226 18:00:43.176108 5033 scope.go:117] "RemoveContainer" containerID="2ad5de41b5d7de1b18c5165838b57191146a9199a84945e3061d6b5459d46155"
Feb 26 18:00:43 crc kubenswrapper[5033]: I0226 18:00:43.232302 5033 scope.go:117] "RemoveContainer" containerID="77b4103b1fd596a6df56a0b1438bb60c5cbe857a0b840497c04fec4b2e8706eb"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.156523 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29535481-l2mhq"]
Feb 26 18:01:00 crc kubenswrapper[5033]: E0226 18:01:00.158944 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38a49f6-054d-4810-ba49-48bc9ad5d5fe" containerName="oc"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.158978 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38a49f6-054d-4810-ba49-48bc9ad5d5fe" containerName="oc"
Feb 26 18:01:00 crc kubenswrapper[5033]: E0226 18:01:00.158993 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caeb061c-3753-4ffd-bc16-90ca9a8ad0f2" containerName="collect-profiles"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.159002 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="caeb061c-3753-4ffd-bc16-90ca9a8ad0f2" containerName="collect-profiles"
Feb 26 18:01:00 crc kubenswrapper[5033]: E0226 18:01:00.159024 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7baf4915-d4d1-4207-b947-2a12fd4248dd" containerName="registry-server"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.159033 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7baf4915-d4d1-4207-b947-2a12fd4248dd" containerName="registry-server"
Feb 26 18:01:00 crc kubenswrapper[5033]: E0226 18:01:00.159049 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7baf4915-d4d1-4207-b947-2a12fd4248dd" containerName="extract-utilities"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.159058 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7baf4915-d4d1-4207-b947-2a12fd4248dd" containerName="extract-utilities"
Feb 26 18:01:00 crc kubenswrapper[5033]: E0226 18:01:00.159098 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7baf4915-d4d1-4207-b947-2a12fd4248dd" containerName="extract-content"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.159106 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="7baf4915-d4d1-4207-b947-2a12fd4248dd" containerName="extract-content"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.159416 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="caeb061c-3753-4ffd-bc16-90ca9a8ad0f2" containerName="collect-profiles"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.159453 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="7baf4915-d4d1-4207-b947-2a12fd4248dd" containerName="registry-server"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.159467 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e38a49f6-054d-4810-ba49-48bc9ad5d5fe" containerName="oc"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.160379 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.173218 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535481-l2mhq"]
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.262213 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvj2j\" (UniqueName: \"kubernetes.io/projected/08deaf39-80a4-408d-a07e-63c2428e7b2c-kube-api-access-pvj2j\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.262344 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-fernet-keys\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.262601 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-config-data\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.262736 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-combined-ca-bundle\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.365734 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-fernet-keys\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.365865 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-config-data\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.365924 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-combined-ca-bundle\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.366068 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvj2j\" (UniqueName: \"kubernetes.io/projected/08deaf39-80a4-408d-a07e-63c2428e7b2c-kube-api-access-pvj2j\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.382183 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-fernet-keys\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.382569 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-config-data\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.382950 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-combined-ca-bundle\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.387335 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvj2j\" (UniqueName: \"kubernetes.io/projected/08deaf39-80a4-408d-a07e-63c2428e7b2c-kube-api-access-pvj2j\") pod \"keystone-cron-29535481-l2mhq\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") " pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:00 crc kubenswrapper[5033]: I0226 18:01:00.498923 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:01 crc kubenswrapper[5033]: I0226 18:01:01.002322 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535481-l2mhq"]
Feb 26 18:01:01 crc kubenswrapper[5033]: I0226 18:01:01.578202 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535481-l2mhq" event={"ID":"08deaf39-80a4-408d-a07e-63c2428e7b2c","Type":"ContainerStarted","Data":"979f0aef360fec7a9097682fe17655141b69084b0731d3c0c95adc0a62e37a7c"}
Feb 26 18:01:01 crc kubenswrapper[5033]: I0226 18:01:01.578730 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535481-l2mhq" event={"ID":"08deaf39-80a4-408d-a07e-63c2428e7b2c","Type":"ContainerStarted","Data":"b740e99a72053a0d44134360ed99c475595271338f5df36db462fe58e2b7928f"}
Feb 26 18:01:04 crc kubenswrapper[5033]: I0226 18:01:04.609427 5033 generic.go:334] "Generic (PLEG): container finished" podID="08deaf39-80a4-408d-a07e-63c2428e7b2c" containerID="979f0aef360fec7a9097682fe17655141b69084b0731d3c0c95adc0a62e37a7c" exitCode=0
Feb 26 18:01:04 crc kubenswrapper[5033]: I0226 18:01:04.609549 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535481-l2mhq" event={"ID":"08deaf39-80a4-408d-a07e-63c2428e7b2c","Type":"ContainerDied","Data":"979f0aef360fec7a9097682fe17655141b69084b0731d3c0c95adc0a62e37a7c"}
Feb 26 18:01:05 crc kubenswrapper[5033]: I0226 18:01:05.622546 5033 generic.go:334] "Generic (PLEG): container finished" podID="abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" containerID="9483ab8cc041d2433a4d176ab51a73412cefef99cba20252170d8321933c0d04" exitCode=0
Feb 26 18:01:05 crc kubenswrapper[5033]: I0226 18:01:05.622651 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" event={"ID":"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7","Type":"ContainerDied","Data":"9483ab8cc041d2433a4d176ab51a73412cefef99cba20252170d8321933c0d04"}
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.007069 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.104112 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-combined-ca-bundle\") pod \"08deaf39-80a4-408d-a07e-63c2428e7b2c\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") "
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.104318 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvj2j\" (UniqueName: \"kubernetes.io/projected/08deaf39-80a4-408d-a07e-63c2428e7b2c-kube-api-access-pvj2j\") pod \"08deaf39-80a4-408d-a07e-63c2428e7b2c\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") "
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.104959 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-fernet-keys\") pod \"08deaf39-80a4-408d-a07e-63c2428e7b2c\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") "
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.105228 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-config-data\") pod \"08deaf39-80a4-408d-a07e-63c2428e7b2c\" (UID: \"08deaf39-80a4-408d-a07e-63c2428e7b2c\") "
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.110351 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "08deaf39-80a4-408d-a07e-63c2428e7b2c" (UID: "08deaf39-80a4-408d-a07e-63c2428e7b2c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.111355 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08deaf39-80a4-408d-a07e-63c2428e7b2c-kube-api-access-pvj2j" (OuterVolumeSpecName: "kube-api-access-pvj2j") pod "08deaf39-80a4-408d-a07e-63c2428e7b2c" (UID: "08deaf39-80a4-408d-a07e-63c2428e7b2c"). InnerVolumeSpecName "kube-api-access-pvj2j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.137418 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08deaf39-80a4-408d-a07e-63c2428e7b2c" (UID: "08deaf39-80a4-408d-a07e-63c2428e7b2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.161358 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-config-data" (OuterVolumeSpecName: "config-data") pod "08deaf39-80a4-408d-a07e-63c2428e7b2c" (UID: "08deaf39-80a4-408d-a07e-63c2428e7b2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.208843 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.208895 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvj2j\" (UniqueName: \"kubernetes.io/projected/08deaf39-80a4-408d-a07e-63c2428e7b2c-kube-api-access-pvj2j\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.208913 5033 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.208929 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08deaf39-80a4-408d-a07e-63c2428e7b2c-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.635180 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535481-l2mhq"
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.635190 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535481-l2mhq" event={"ID":"08deaf39-80a4-408d-a07e-63c2428e7b2c","Type":"ContainerDied","Data":"b740e99a72053a0d44134360ed99c475595271338f5df36db462fe58e2b7928f"}
Feb 26 18:01:06 crc kubenswrapper[5033]: I0226 18:01:06.635247 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b740e99a72053a0d44134360ed99c475595271338f5df36db462fe58e2b7928f"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.184105 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.341448 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-combined-ca-bundle\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.341555 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-3\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.341598 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ssh-key-openstack-cell1\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.341736 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-1\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.341792 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-0\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.341847 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-1\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.341929 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-1\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.341966 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-2\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.341996 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-0\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.342019 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ceph\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.342064 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-inventory\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.342133 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-0\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.342180 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skfjd\" (UniqueName: \"kubernetes.io/projected/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-kube-api-access-skfjd\") pod \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\" (UID: \"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7\") "
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.349198 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.351677 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ceph" (OuterVolumeSpecName: "ceph") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.356148 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-kube-api-access-skfjd" (OuterVolumeSpecName: "kube-api-access-skfjd") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "kube-api-access-skfjd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.378455 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.382314 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.383430 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.385360 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.399315 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-inventory" (OuterVolumeSpecName: "inventory") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.399799 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.401109 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.403411 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.403871 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.405933 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" (UID: "abd39c5a-ea6d-4d76-8370-d3179f6ae5c7"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.444923 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.444965 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.444978 5033 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.444990 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ceph\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.445007 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.445019 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.445028 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skfjd\" (UniqueName: \"kubernetes.io/projected/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-kube-api-access-skfjd\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.445038 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.445047 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.445056 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.445067 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.445078 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.445089 5033 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/abd39c5a-ea6d-4d76-8370-d3179f6ae5c7-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.649229 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps" event={"ID":"abd39c5a-ea6d-4d76-8370-d3179f6ae5c7","Type":"ContainerDied","Data":"f226048fbb540db6adf6258d94a9342c8fc60f6fafc70be521c760536271081a"}
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.649309 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vvrps"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.649328 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f226048fbb540db6adf6258d94a9342c8fc60f6fafc70be521c760536271081a"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.731533 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-rswv4"]
Feb 26 18:01:07 crc kubenswrapper[5033]: E0226 18:01:07.731925 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" containerName="nova-cell1-openstack-openstack-cell1"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.731940 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" containerName="nova-cell1-openstack-openstack-cell1"
Feb 26 18:01:07 crc kubenswrapper[5033]: E0226 18:01:07.731959 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08deaf39-80a4-408d-a07e-63c2428e7b2c" containerName="keystone-cron"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.731966 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="08deaf39-80a4-408d-a07e-63c2428e7b2c" containerName="keystone-cron"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.732182 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="08deaf39-80a4-408d-a07e-63c2428e7b2c" containerName="keystone-cron"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.732206 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="abd39c5a-ea6d-4d76-8370-d3179f6ae5c7" containerName="nova-cell1-openstack-openstack-cell1"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.732898 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.734984 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.735148 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.735372 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.737895 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.738322 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.754847 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-rswv4"]
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.853538 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceph\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.853587 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-inventory\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.853628 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5hsc\" (UniqueName: \"kubernetes.io/projected/cc11177a-f224-4e9d-9b44-92dd6741bb1c-kube-api-access-n5hsc\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.853647 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.853673 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.853739 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.853784 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.853805 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.956005 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceph\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.956222 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-inventory\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.956362 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5hsc\" (UniqueName: \"kubernetes.io/projected/cc11177a-f224-4e9d-9b44-92dd6741bb1c-kube-api-access-n5hsc\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.956532 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.956622 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.956809 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.956925 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.957024 5033
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4" Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.959891 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceph\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4" Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.960347 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4" Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.960479 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-inventory\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4" Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.961413 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " 
pod="openstack/telemetry-openstack-openstack-cell1-rswv4" Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.962409 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4" Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.962956 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4" Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.963085 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4" Feb 26 18:01:07 crc kubenswrapper[5033]: I0226 18:01:07.976041 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5hsc\" (UniqueName: \"kubernetes.io/projected/cc11177a-f224-4e9d-9b44-92dd6741bb1c-kube-api-access-n5hsc\") pod \"telemetry-openstack-openstack-cell1-rswv4\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") " pod="openstack/telemetry-openstack-openstack-cell1-rswv4" Feb 26 18:01:08 crc kubenswrapper[5033]: I0226 18:01:08.053474 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-rswv4" Feb 26 18:01:08 crc kubenswrapper[5033]: I0226 18:01:08.595484 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-rswv4"] Feb 26 18:01:08 crc kubenswrapper[5033]: I0226 18:01:08.603066 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 18:01:08 crc kubenswrapper[5033]: I0226 18:01:08.658554 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-rswv4" event={"ID":"cc11177a-f224-4e9d-9b44-92dd6741bb1c","Type":"ContainerStarted","Data":"2938aa7b3cebf4cde44f4c98ee230a3caf1331489b0fd1d8de762b3465c17301"} Feb 26 18:01:09 crc kubenswrapper[5033]: I0226 18:01:09.669014 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-rswv4" event={"ID":"cc11177a-f224-4e9d-9b44-92dd6741bb1c","Type":"ContainerStarted","Data":"740af80ca85f3c98265489fea39ea35bc955675beba015182fb8014e79278c67"} Feb 26 18:01:09 crc kubenswrapper[5033]: I0226 18:01:09.695155 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-rswv4" podStartSLOduration=2.284284217 podStartE2EDuration="2.695138054s" podCreationTimestamp="2026-02-26 18:01:07 +0000 UTC" firstStartedPulling="2026-02-26 18:01:08.60284861 +0000 UTC m=+8395.815447250" lastFinishedPulling="2026-02-26 18:01:09.013702427 +0000 UTC m=+8396.226301087" observedRunningTime="2026-02-26 18:01:09.685482439 +0000 UTC m=+8396.898081089" watchObservedRunningTime="2026-02-26 18:01:09.695138054 +0000 UTC m=+8396.907736704" Feb 26 18:01:50 crc kubenswrapper[5033]: I0226 18:01:50.189191 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:01:50 crc kubenswrapper[5033]: I0226 18:01:50.189810 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:02:00 crc kubenswrapper[5033]: I0226 18:02:00.168266 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535482-xlf2d"] Feb 26 18:02:00 crc kubenswrapper[5033]: I0226 18:02:00.170639 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535482-xlf2d" Feb 26 18:02:00 crc kubenswrapper[5033]: I0226 18:02:00.173342 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:02:00 crc kubenswrapper[5033]: I0226 18:02:00.174007 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:02:00 crc kubenswrapper[5033]: I0226 18:02:00.174792 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:02:00 crc kubenswrapper[5033]: I0226 18:02:00.178565 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535482-xlf2d"] Feb 26 18:02:00 crc kubenswrapper[5033]: I0226 18:02:00.317813 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfv5d\" (UniqueName: \"kubernetes.io/projected/74182ef9-becf-4b65-9931-90e2712d8750-kube-api-access-tfv5d\") pod \"auto-csr-approver-29535482-xlf2d\" (UID: \"74182ef9-becf-4b65-9931-90e2712d8750\") " 
pod="openshift-infra/auto-csr-approver-29535482-xlf2d" Feb 26 18:02:00 crc kubenswrapper[5033]: I0226 18:02:00.420484 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfv5d\" (UniqueName: \"kubernetes.io/projected/74182ef9-becf-4b65-9931-90e2712d8750-kube-api-access-tfv5d\") pod \"auto-csr-approver-29535482-xlf2d\" (UID: \"74182ef9-becf-4b65-9931-90e2712d8750\") " pod="openshift-infra/auto-csr-approver-29535482-xlf2d" Feb 26 18:02:00 crc kubenswrapper[5033]: I0226 18:02:00.448001 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfv5d\" (UniqueName: \"kubernetes.io/projected/74182ef9-becf-4b65-9931-90e2712d8750-kube-api-access-tfv5d\") pod \"auto-csr-approver-29535482-xlf2d\" (UID: \"74182ef9-becf-4b65-9931-90e2712d8750\") " pod="openshift-infra/auto-csr-approver-29535482-xlf2d" Feb 26 18:02:00 crc kubenswrapper[5033]: I0226 18:02:00.497023 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535482-xlf2d" Feb 26 18:02:01 crc kubenswrapper[5033]: I0226 18:02:01.003074 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535482-xlf2d"] Feb 26 18:02:01 crc kubenswrapper[5033]: I0226 18:02:01.281709 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535482-xlf2d" event={"ID":"74182ef9-becf-4b65-9931-90e2712d8750","Type":"ContainerStarted","Data":"97126f42631f009698e8ce424d9d6ff07eca7c0d44aad6d474672ab8cf81e9f7"} Feb 26 18:02:03 crc kubenswrapper[5033]: I0226 18:02:03.313452 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535482-xlf2d" event={"ID":"74182ef9-becf-4b65-9931-90e2712d8750","Type":"ContainerStarted","Data":"9fb3fa763e4343b434b4254a95bd31c5d1616ae4c31490d00f9efaa1a5111d3e"} Feb 26 18:02:03 crc kubenswrapper[5033]: I0226 18:02:03.342247 5033 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535482-xlf2d" podStartSLOduration=1.462772749 podStartE2EDuration="3.342225947s" podCreationTimestamp="2026-02-26 18:02:00 +0000 UTC" firstStartedPulling="2026-02-26 18:02:01.009485708 +0000 UTC m=+8448.222084398" lastFinishedPulling="2026-02-26 18:02:02.888938936 +0000 UTC m=+8450.101537596" observedRunningTime="2026-02-26 18:02:03.330313696 +0000 UTC m=+8450.542912346" watchObservedRunningTime="2026-02-26 18:02:03.342225947 +0000 UTC m=+8450.554824607" Feb 26 18:02:04 crc kubenswrapper[5033]: I0226 18:02:04.327194 5033 generic.go:334] "Generic (PLEG): container finished" podID="74182ef9-becf-4b65-9931-90e2712d8750" containerID="9fb3fa763e4343b434b4254a95bd31c5d1616ae4c31490d00f9efaa1a5111d3e" exitCode=0 Feb 26 18:02:04 crc kubenswrapper[5033]: I0226 18:02:04.327319 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535482-xlf2d" event={"ID":"74182ef9-becf-4b65-9931-90e2712d8750","Type":"ContainerDied","Data":"9fb3fa763e4343b434b4254a95bd31c5d1616ae4c31490d00f9efaa1a5111d3e"} Feb 26 18:02:05 crc kubenswrapper[5033]: I0226 18:02:05.757723 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535482-xlf2d" Feb 26 18:02:05 crc kubenswrapper[5033]: I0226 18:02:05.854421 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfv5d\" (UniqueName: \"kubernetes.io/projected/74182ef9-becf-4b65-9931-90e2712d8750-kube-api-access-tfv5d\") pod \"74182ef9-becf-4b65-9931-90e2712d8750\" (UID: \"74182ef9-becf-4b65-9931-90e2712d8750\") " Feb 26 18:02:05 crc kubenswrapper[5033]: I0226 18:02:05.861367 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74182ef9-becf-4b65-9931-90e2712d8750-kube-api-access-tfv5d" (OuterVolumeSpecName: "kube-api-access-tfv5d") pod "74182ef9-becf-4b65-9931-90e2712d8750" (UID: "74182ef9-becf-4b65-9931-90e2712d8750"). InnerVolumeSpecName "kube-api-access-tfv5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:02:05 crc kubenswrapper[5033]: I0226 18:02:05.957734 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfv5d\" (UniqueName: \"kubernetes.io/projected/74182ef9-becf-4b65-9931-90e2712d8750-kube-api-access-tfv5d\") on node \"crc\" DevicePath \"\"" Feb 26 18:02:06 crc kubenswrapper[5033]: I0226 18:02:06.348969 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535482-xlf2d" event={"ID":"74182ef9-becf-4b65-9931-90e2712d8750","Type":"ContainerDied","Data":"97126f42631f009698e8ce424d9d6ff07eca7c0d44aad6d474672ab8cf81e9f7"} Feb 26 18:02:06 crc kubenswrapper[5033]: I0226 18:02:06.349020 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97126f42631f009698e8ce424d9d6ff07eca7c0d44aad6d474672ab8cf81e9f7" Feb 26 18:02:06 crc kubenswrapper[5033]: I0226 18:02:06.349099 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535482-xlf2d" Feb 26 18:02:06 crc kubenswrapper[5033]: I0226 18:02:06.399382 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535476-dtzld"] Feb 26 18:02:06 crc kubenswrapper[5033]: I0226 18:02:06.408115 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535476-dtzld"] Feb 26 18:02:07 crc kubenswrapper[5033]: I0226 18:02:07.644416 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c394577-5481-48c5-ab68-20d54e193f60" path="/var/lib/kubelet/pods/0c394577-5481-48c5-ab68-20d54e193f60/volumes" Feb 26 18:02:20 crc kubenswrapper[5033]: I0226 18:02:20.188728 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:02:20 crc kubenswrapper[5033]: I0226 18:02:20.189436 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:02:43 crc kubenswrapper[5033]: I0226 18:02:43.368982 5033 scope.go:117] "RemoveContainer" containerID="5477b15b341f3eb1dbda1a3001893363cfbadeda78214432114af9ce9fefcc56" Feb 26 18:02:50 crc kubenswrapper[5033]: I0226 18:02:50.189167 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:02:50 crc kubenswrapper[5033]: 
I0226 18:02:50.189778 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:02:50 crc kubenswrapper[5033]: I0226 18:02:50.189824 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 18:02:50 crc kubenswrapper[5033]: I0226 18:02:50.190469 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 18:02:50 crc kubenswrapper[5033]: I0226 18:02:50.190521 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" gracePeriod=600 Feb 26 18:02:50 crc kubenswrapper[5033]: E0226 18:02:50.316109 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:02:50 crc kubenswrapper[5033]: I0226 18:02:50.855346 5033 generic.go:334] "Generic (PLEG): container finished" 
podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" exitCode=0 Feb 26 18:02:50 crc kubenswrapper[5033]: I0226 18:02:50.855437 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"} Feb 26 18:02:50 crc kubenswrapper[5033]: I0226 18:02:50.855752 5033 scope.go:117] "RemoveContainer" containerID="06a7470efd2f33b06881edc40853b2fd654201f80a9674fabd38b2904cbfea8e" Feb 26 18:02:50 crc kubenswrapper[5033]: I0226 18:02:50.856898 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:02:50 crc kubenswrapper[5033]: E0226 18:02:50.857488 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:03:05 crc kubenswrapper[5033]: I0226 18:03:05.629380 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:03:05 crc kubenswrapper[5033]: E0226 18:03:05.631036 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 
18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.139890 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x9jlv"] Feb 26 18:03:08 crc kubenswrapper[5033]: E0226 18:03:08.140990 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74182ef9-becf-4b65-9931-90e2712d8750" containerName="oc" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.141006 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="74182ef9-becf-4b65-9931-90e2712d8750" containerName="oc" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.141328 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="74182ef9-becf-4b65-9931-90e2712d8750" containerName="oc" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.143417 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x9jlv" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.157903 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x9jlv"] Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.238419 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9tgl\" (UniqueName: \"kubernetes.io/projected/ec2997a2-842b-4090-a444-48f355bf2249-kube-api-access-c9tgl\") pod \"redhat-operators-x9jlv\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") " pod="openshift-marketplace/redhat-operators-x9jlv" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.238533 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-utilities\") pod \"redhat-operators-x9jlv\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") " pod="openshift-marketplace/redhat-operators-x9jlv" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.238597 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-catalog-content\") pod \"redhat-operators-x9jlv\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") " pod="openshift-marketplace/redhat-operators-x9jlv" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.340787 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9tgl\" (UniqueName: \"kubernetes.io/projected/ec2997a2-842b-4090-a444-48f355bf2249-kube-api-access-c9tgl\") pod \"redhat-operators-x9jlv\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") " pod="openshift-marketplace/redhat-operators-x9jlv" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.340874 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-utilities\") pod \"redhat-operators-x9jlv\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") " pod="openshift-marketplace/redhat-operators-x9jlv" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.340921 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-catalog-content\") pod \"redhat-operators-x9jlv\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") " pod="openshift-marketplace/redhat-operators-x9jlv" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.341535 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-catalog-content\") pod \"redhat-operators-x9jlv\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") " pod="openshift-marketplace/redhat-operators-x9jlv" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.341577 5033 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-utilities\") pod \"redhat-operators-x9jlv\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") " pod="openshift-marketplace/redhat-operators-x9jlv" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.362866 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9tgl\" (UniqueName: \"kubernetes.io/projected/ec2997a2-842b-4090-a444-48f355bf2249-kube-api-access-c9tgl\") pod \"redhat-operators-x9jlv\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") " pod="openshift-marketplace/redhat-operators-x9jlv" Feb 26 18:03:08 crc kubenswrapper[5033]: I0226 18:03:08.476022 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x9jlv" Feb 26 18:03:09 crc kubenswrapper[5033]: I0226 18:03:09.027964 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x9jlv"] Feb 26 18:03:09 crc kubenswrapper[5033]: I0226 18:03:09.058907 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x9jlv" event={"ID":"ec2997a2-842b-4090-a444-48f355bf2249","Type":"ContainerStarted","Data":"8a4a6407c1377d85d571d0bedd503dfdc147809b26cc440c6bfb7070c1648c7f"} Feb 26 18:03:10 crc kubenswrapper[5033]: I0226 18:03:10.070573 5033 generic.go:334] "Generic (PLEG): container finished" podID="ec2997a2-842b-4090-a444-48f355bf2249" containerID="f5140d0700d34e8518daa3a31102c69524152039c5e8c20a84c3caa3cf52c2d8" exitCode=0 Feb 26 18:03:10 crc kubenswrapper[5033]: I0226 18:03:10.070669 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x9jlv" event={"ID":"ec2997a2-842b-4090-a444-48f355bf2249","Type":"ContainerDied","Data":"f5140d0700d34e8518daa3a31102c69524152039c5e8c20a84c3caa3cf52c2d8"} Feb 26 18:03:11 crc kubenswrapper[5033]: E0226 18:03:11.993912 5033 cadvisor_stats_provider.go:516] 
"Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec2997a2_842b_4090_a444_48f355bf2249.slice/crio-conmon-48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789.scope\": RecentStats: unable to find data in memory cache]"
Feb 26 18:03:12 crc kubenswrapper[5033]: I0226 18:03:12.094729 5033 generic.go:334] "Generic (PLEG): container finished" podID="ec2997a2-842b-4090-a444-48f355bf2249" containerID="48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789" exitCode=0
Feb 26 18:03:12 crc kubenswrapper[5033]: I0226 18:03:12.094804 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x9jlv" event={"ID":"ec2997a2-842b-4090-a444-48f355bf2249","Type":"ContainerDied","Data":"48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789"}
Feb 26 18:03:14 crc kubenswrapper[5033]: I0226 18:03:14.115907 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x9jlv" event={"ID":"ec2997a2-842b-4090-a444-48f355bf2249","Type":"ContainerStarted","Data":"80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a"}
Feb 26 18:03:14 crc kubenswrapper[5033]: I0226 18:03:14.149004 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x9jlv" podStartSLOduration=3.6766684549999997 podStartE2EDuration="6.148983013s" podCreationTimestamp="2026-02-26 18:03:08 +0000 UTC" firstStartedPulling="2026-02-26 18:03:10.0731991 +0000 UTC m=+8517.285797750" lastFinishedPulling="2026-02-26 18:03:12.545513668 +0000 UTC m=+8519.758112308" observedRunningTime="2026-02-26 18:03:14.139874391 +0000 UTC m=+8521.352473051" watchObservedRunningTime="2026-02-26 18:03:14.148983013 +0000 UTC m=+8521.361581663"
Feb 26 18:03:18 crc kubenswrapper[5033]: I0226 18:03:18.477728 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x9jlv"
Feb 26 18:03:18 crc kubenswrapper[5033]: I0226 18:03:18.478161 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x9jlv"
Feb 26 18:03:18 crc kubenswrapper[5033]: I0226 18:03:18.628796 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"
Feb 26 18:03:18 crc kubenswrapper[5033]: E0226 18:03:18.629290 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 18:03:19 crc kubenswrapper[5033]: I0226 18:03:19.537331 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x9jlv" podUID="ec2997a2-842b-4090-a444-48f355bf2249" containerName="registry-server" probeResult="failure" output=<
Feb 26 18:03:19 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s
Feb 26 18:03:19 crc kubenswrapper[5033]: >
Feb 26 18:03:28 crc kubenswrapper[5033]: I0226 18:03:28.530224 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x9jlv"
Feb 26 18:03:28 crc kubenswrapper[5033]: I0226 18:03:28.582976 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x9jlv"
Feb 26 18:03:28 crc kubenswrapper[5033]: I0226 18:03:28.777439 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x9jlv"]
Feb 26 18:03:30 crc kubenswrapper[5033]: I0226 18:03:30.285688 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x9jlv" podUID="ec2997a2-842b-4090-a444-48f355bf2249" containerName="registry-server" containerID="cri-o://80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a" gracePeriod=2
Feb 26 18:03:30 crc kubenswrapper[5033]: I0226 18:03:30.846363 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x9jlv"
Feb 26 18:03:30 crc kubenswrapper[5033]: I0226 18:03:30.972036 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9tgl\" (UniqueName: \"kubernetes.io/projected/ec2997a2-842b-4090-a444-48f355bf2249-kube-api-access-c9tgl\") pod \"ec2997a2-842b-4090-a444-48f355bf2249\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") "
Feb 26 18:03:30 crc kubenswrapper[5033]: I0226 18:03:30.972117 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-catalog-content\") pod \"ec2997a2-842b-4090-a444-48f355bf2249\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") "
Feb 26 18:03:30 crc kubenswrapper[5033]: I0226 18:03:30.972315 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-utilities\") pod \"ec2997a2-842b-4090-a444-48f355bf2249\" (UID: \"ec2997a2-842b-4090-a444-48f355bf2249\") "
Feb 26 18:03:30 crc kubenswrapper[5033]: I0226 18:03:30.974542 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-utilities" (OuterVolumeSpecName: "utilities") pod "ec2997a2-842b-4090-a444-48f355bf2249" (UID: "ec2997a2-842b-4090-a444-48f355bf2249"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 18:03:30 crc kubenswrapper[5033]: I0226 18:03:30.977970 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec2997a2-842b-4090-a444-48f355bf2249-kube-api-access-c9tgl" (OuterVolumeSpecName: "kube-api-access-c9tgl") pod "ec2997a2-842b-4090-a444-48f355bf2249" (UID: "ec2997a2-842b-4090-a444-48f355bf2249"). InnerVolumeSpecName "kube-api-access-c9tgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.074955 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9tgl\" (UniqueName: \"kubernetes.io/projected/ec2997a2-842b-4090-a444-48f355bf2249-kube-api-access-c9tgl\") on node \"crc\" DevicePath \"\""
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.075216 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.123313 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec2997a2-842b-4090-a444-48f355bf2249" (UID: "ec2997a2-842b-4090-a444-48f355bf2249"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.176923 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec2997a2-842b-4090-a444-48f355bf2249-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.296585 5033 generic.go:334] "Generic (PLEG): container finished" podID="ec2997a2-842b-4090-a444-48f355bf2249" containerID="80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a" exitCode=0
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.296630 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x9jlv"
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.296640 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x9jlv" event={"ID":"ec2997a2-842b-4090-a444-48f355bf2249","Type":"ContainerDied","Data":"80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a"}
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.296664 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x9jlv" event={"ID":"ec2997a2-842b-4090-a444-48f355bf2249","Type":"ContainerDied","Data":"8a4a6407c1377d85d571d0bedd503dfdc147809b26cc440c6bfb7070c1648c7f"}
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.296681 5033 scope.go:117] "RemoveContainer" containerID="80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a"
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.334597 5033 scope.go:117] "RemoveContainer" containerID="48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789"
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.341785 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x9jlv"]
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.350583 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x9jlv"]
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.362341 5033 scope.go:117] "RemoveContainer" containerID="f5140d0700d34e8518daa3a31102c69524152039c5e8c20a84c3caa3cf52c2d8"
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.412801 5033 scope.go:117] "RemoveContainer" containerID="80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a"
Feb 26 18:03:31 crc kubenswrapper[5033]: E0226 18:03:31.413888 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a\": container with ID starting with 80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a not found: ID does not exist" containerID="80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a"
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.413930 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a"} err="failed to get container status \"80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a\": rpc error: code = NotFound desc = could not find container \"80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a\": container with ID starting with 80bb91b4edbd1139879d02e6db926026fdd3c3caf7c1138bed8086875b31354a not found: ID does not exist"
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.413955 5033 scope.go:117] "RemoveContainer" containerID="48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789"
Feb 26 18:03:31 crc kubenswrapper[5033]: E0226 18:03:31.414481 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789\": container with ID starting with 48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789 not found: ID does not exist" containerID="48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789"
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.414502 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789"} err="failed to get container status \"48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789\": rpc error: code = NotFound desc = could not find container \"48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789\": container with ID starting with 48c366cdc8b4fc02a75abb5f603632fa45f73d52d33cfb462a31a0e493739789 not found: ID does not exist"
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.414521 5033 scope.go:117] "RemoveContainer" containerID="f5140d0700d34e8518daa3a31102c69524152039c5e8c20a84c3caa3cf52c2d8"
Feb 26 18:03:31 crc kubenswrapper[5033]: E0226 18:03:31.414785 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5140d0700d34e8518daa3a31102c69524152039c5e8c20a84c3caa3cf52c2d8\": container with ID starting with f5140d0700d34e8518daa3a31102c69524152039c5e8c20a84c3caa3cf52c2d8 not found: ID does not exist" containerID="f5140d0700d34e8518daa3a31102c69524152039c5e8c20a84c3caa3cf52c2d8"
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.414805 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5140d0700d34e8518daa3a31102c69524152039c5e8c20a84c3caa3cf52c2d8"} err="failed to get container status \"f5140d0700d34e8518daa3a31102c69524152039c5e8c20a84c3caa3cf52c2d8\": rpc error: code = NotFound desc = could not find container \"f5140d0700d34e8518daa3a31102c69524152039c5e8c20a84c3caa3cf52c2d8\": container with ID starting with f5140d0700d34e8518daa3a31102c69524152039c5e8c20a84c3caa3cf52c2d8 not found: ID does not exist"
Feb 26 18:03:31 crc kubenswrapper[5033]: I0226 18:03:31.649437 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec2997a2-842b-4090-a444-48f355bf2249" path="/var/lib/kubelet/pods/ec2997a2-842b-4090-a444-48f355bf2249/volumes"
Feb 26 18:03:33 crc kubenswrapper[5033]: I0226 18:03:33.636432 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"
Feb 26 18:03:33 crc kubenswrapper[5033]: E0226 18:03:33.637357 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 18:03:46 crc kubenswrapper[5033]: I0226 18:03:46.628630 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"
Feb 26 18:03:46 crc kubenswrapper[5033]: E0226 18:03:46.629442 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 18:03:58 crc kubenswrapper[5033]: I0226 18:03:58.628631 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"
Feb 26 18:03:58 crc kubenswrapper[5033]: E0226 18:03:58.629458 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.147462 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535484-xbpz9"]
Feb 26 18:04:00 crc kubenswrapper[5033]: E0226 18:04:00.148216 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec2997a2-842b-4090-a444-48f355bf2249" containerName="extract-content"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.148231 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec2997a2-842b-4090-a444-48f355bf2249" containerName="extract-content"
Feb 26 18:04:00 crc kubenswrapper[5033]: E0226 18:04:00.148277 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec2997a2-842b-4090-a444-48f355bf2249" containerName="registry-server"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.148283 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec2997a2-842b-4090-a444-48f355bf2249" containerName="registry-server"
Feb 26 18:04:00 crc kubenswrapper[5033]: E0226 18:04:00.148298 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec2997a2-842b-4090-a444-48f355bf2249" containerName="extract-utilities"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.148307 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec2997a2-842b-4090-a444-48f355bf2249" containerName="extract-utilities"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.148495 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec2997a2-842b-4090-a444-48f355bf2249" containerName="registry-server"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.149349 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535484-xbpz9"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.160742 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.160949 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.161150 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.165358 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535484-xbpz9"]
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.264750 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6pxk\" (UniqueName: \"kubernetes.io/projected/189975a7-197f-47c4-8cab-6d320c957d27-kube-api-access-j6pxk\") pod \"auto-csr-approver-29535484-xbpz9\" (UID: \"189975a7-197f-47c4-8cab-6d320c957d27\") " pod="openshift-infra/auto-csr-approver-29535484-xbpz9"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.366828 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6pxk\" (UniqueName: \"kubernetes.io/projected/189975a7-197f-47c4-8cab-6d320c957d27-kube-api-access-j6pxk\") pod \"auto-csr-approver-29535484-xbpz9\" (UID: \"189975a7-197f-47c4-8cab-6d320c957d27\") " pod="openshift-infra/auto-csr-approver-29535484-xbpz9"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.390521 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6pxk\" (UniqueName: \"kubernetes.io/projected/189975a7-197f-47c4-8cab-6d320c957d27-kube-api-access-j6pxk\") pod \"auto-csr-approver-29535484-xbpz9\" (UID: \"189975a7-197f-47c4-8cab-6d320c957d27\") " pod="openshift-infra/auto-csr-approver-29535484-xbpz9"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.477902 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535484-xbpz9"
Feb 26 18:04:00 crc kubenswrapper[5033]: I0226 18:04:00.967226 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535484-xbpz9"]
Feb 26 18:04:01 crc kubenswrapper[5033]: I0226 18:04:01.654652 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535484-xbpz9" event={"ID":"189975a7-197f-47c4-8cab-6d320c957d27","Type":"ContainerStarted","Data":"e96a77ebd947be63b8b32e9fb39f38865d83d9e5205be286dca0ad728d6ea405"}
Feb 26 18:04:02 crc kubenswrapper[5033]: I0226 18:04:02.667852 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535484-xbpz9" event={"ID":"189975a7-197f-47c4-8cab-6d320c957d27","Type":"ContainerStarted","Data":"e632efbbdc67d8f7f239b28828370012b78308bc07f58c2e3672fddccf8498ae"}
Feb 26 18:04:02 crc kubenswrapper[5033]: I0226 18:04:02.686323 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535484-xbpz9" podStartSLOduration=1.795325931 podStartE2EDuration="2.686295402s" podCreationTimestamp="2026-02-26 18:04:00 +0000 UTC" firstStartedPulling="2026-02-26 18:04:00.979737356 +0000 UTC m=+8568.192336006" lastFinishedPulling="2026-02-26 18:04:01.870706797 +0000 UTC m=+8569.083305477" observedRunningTime="2026-02-26 18:04:02.685465172 +0000 UTC m=+8569.898063822" watchObservedRunningTime="2026-02-26 18:04:02.686295402 +0000 UTC m=+8569.898894092"
Feb 26 18:04:03 crc kubenswrapper[5033]: I0226 18:04:03.683927 5033 generic.go:334] "Generic (PLEG): container finished" podID="189975a7-197f-47c4-8cab-6d320c957d27" containerID="e632efbbdc67d8f7f239b28828370012b78308bc07f58c2e3672fddccf8498ae" exitCode=0
Feb 26 18:04:03 crc kubenswrapper[5033]: I0226 18:04:03.684034 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535484-xbpz9" event={"ID":"189975a7-197f-47c4-8cab-6d320c957d27","Type":"ContainerDied","Data":"e632efbbdc67d8f7f239b28828370012b78308bc07f58c2e3672fddccf8498ae"}
Feb 26 18:04:05 crc kubenswrapper[5033]: I0226 18:04:05.213355 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535484-xbpz9"
Feb 26 18:04:05 crc kubenswrapper[5033]: I0226 18:04:05.291506 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6pxk\" (UniqueName: \"kubernetes.io/projected/189975a7-197f-47c4-8cab-6d320c957d27-kube-api-access-j6pxk\") pod \"189975a7-197f-47c4-8cab-6d320c957d27\" (UID: \"189975a7-197f-47c4-8cab-6d320c957d27\") "
Feb 26 18:04:05 crc kubenswrapper[5033]: I0226 18:04:05.298590 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189975a7-197f-47c4-8cab-6d320c957d27-kube-api-access-j6pxk" (OuterVolumeSpecName: "kube-api-access-j6pxk") pod "189975a7-197f-47c4-8cab-6d320c957d27" (UID: "189975a7-197f-47c4-8cab-6d320c957d27"). InnerVolumeSpecName "kube-api-access-j6pxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 18:04:05 crc kubenswrapper[5033]: I0226 18:04:05.394333 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6pxk\" (UniqueName: \"kubernetes.io/projected/189975a7-197f-47c4-8cab-6d320c957d27-kube-api-access-j6pxk\") on node \"crc\" DevicePath \"\""
Feb 26 18:04:05 crc kubenswrapper[5033]: I0226 18:04:05.713395 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535484-xbpz9" event={"ID":"189975a7-197f-47c4-8cab-6d320c957d27","Type":"ContainerDied","Data":"e96a77ebd947be63b8b32e9fb39f38865d83d9e5205be286dca0ad728d6ea405"}
Feb 26 18:04:05 crc kubenswrapper[5033]: I0226 18:04:05.713467 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e96a77ebd947be63b8b32e9fb39f38865d83d9e5205be286dca0ad728d6ea405"
Feb 26 18:04:05 crc kubenswrapper[5033]: I0226 18:04:05.713424 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535484-xbpz9"
Feb 26 18:04:05 crc kubenswrapper[5033]: I0226 18:04:05.767812 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535478-4trzm"]
Feb 26 18:04:05 crc kubenswrapper[5033]: I0226 18:04:05.776574 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535478-4trzm"]
Feb 26 18:04:07 crc kubenswrapper[5033]: I0226 18:04:07.651579 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0" path="/var/lib/kubelet/pods/455c7377-ca4a-4b0d-8d83-c3f40c9ee3b0/volumes"
Feb 26 18:04:09 crc kubenswrapper[5033]: I0226 18:04:09.633698 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"
Feb 26 18:04:09 crc kubenswrapper[5033]: E0226 18:04:09.634311 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 18:04:20 crc kubenswrapper[5033]: I0226 18:04:20.628178 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"
Feb 26 18:04:20 crc kubenswrapper[5033]: E0226 18:04:20.629102 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 18:04:25 crc kubenswrapper[5033]: I0226 18:04:25.949062 5033 generic.go:334] "Generic (PLEG): container finished" podID="cc11177a-f224-4e9d-9b44-92dd6741bb1c" containerID="740af80ca85f3c98265489fea39ea35bc955675beba015182fb8014e79278c67" exitCode=0
Feb 26 18:04:25 crc kubenswrapper[5033]: I0226 18:04:25.949159 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-rswv4" event={"ID":"cc11177a-f224-4e9d-9b44-92dd6741bb1c","Type":"ContainerDied","Data":"740af80ca85f3c98265489fea39ea35bc955675beba015182fb8014e79278c67"}
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.478192 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.633574 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-2\") pod \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") "
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.633697 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-telemetry-combined-ca-bundle\") pod \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") "
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.633809 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-1\") pod \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") "
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.633890 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5hsc\" (UniqueName: \"kubernetes.io/projected/cc11177a-f224-4e9d-9b44-92dd6741bb1c-kube-api-access-n5hsc\") pod \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") "
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.633939 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ssh-key-openstack-cell1\") pod \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") "
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.634049 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceph\") pod \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") "
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.634122 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-inventory\") pod \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") "
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.634198 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-0\") pod \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\" (UID: \"cc11177a-f224-4e9d-9b44-92dd6741bb1c\") "
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.645835 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "cc11177a-f224-4e9d-9b44-92dd6741bb1c" (UID: "cc11177a-f224-4e9d-9b44-92dd6741bb1c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.645870 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceph" (OuterVolumeSpecName: "ceph") pod "cc11177a-f224-4e9d-9b44-92dd6741bb1c" (UID: "cc11177a-f224-4e9d-9b44-92dd6741bb1c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.646146 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc11177a-f224-4e9d-9b44-92dd6741bb1c-kube-api-access-n5hsc" (OuterVolumeSpecName: "kube-api-access-n5hsc") pod "cc11177a-f224-4e9d-9b44-92dd6741bb1c" (UID: "cc11177a-f224-4e9d-9b44-92dd6741bb1c"). InnerVolumeSpecName "kube-api-access-n5hsc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.673653 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-inventory" (OuterVolumeSpecName: "inventory") pod "cc11177a-f224-4e9d-9b44-92dd6741bb1c" (UID: "cc11177a-f224-4e9d-9b44-92dd6741bb1c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.678137 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "cc11177a-f224-4e9d-9b44-92dd6741bb1c" (UID: "cc11177a-f224-4e9d-9b44-92dd6741bb1c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.693637 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "cc11177a-f224-4e9d-9b44-92dd6741bb1c" (UID: "cc11177a-f224-4e9d-9b44-92dd6741bb1c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.698636 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "cc11177a-f224-4e9d-9b44-92dd6741bb1c" (UID: "cc11177a-f224-4e9d-9b44-92dd6741bb1c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.713770 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cc11177a-f224-4e9d-9b44-92dd6741bb1c" (UID: "cc11177a-f224-4e9d-9b44-92dd6741bb1c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.737622 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5hsc\" (UniqueName: \"kubernetes.io/projected/cc11177a-f224-4e9d-9b44-92dd6741bb1c-kube-api-access-n5hsc\") on node \"crc\" DevicePath \"\""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.737658 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.737670 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceph\") on node \"crc\" DevicePath \"\""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.737681 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.737690 5033 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.737700 5033 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.737711 5033 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.737720 5033 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cc11177a-f224-4e9d-9b44-92dd6741bb1c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.984173 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-rswv4" event={"ID":"cc11177a-f224-4e9d-9b44-92dd6741bb1c","Type":"ContainerDied","Data":"2938aa7b3cebf4cde44f4c98ee230a3caf1331489b0fd1d8de762b3465c17301"}
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.984643 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2938aa7b3cebf4cde44f4c98ee230a3caf1331489b0fd1d8de762b3465c17301"
Feb 26 18:04:27 crc kubenswrapper[5033]: I0226 18:04:27.984286 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-rswv4"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.120093 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-6txrj"]
Feb 26 18:04:28 crc kubenswrapper[5033]: E0226 18:04:28.120849 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc11177a-f224-4e9d-9b44-92dd6741bb1c" containerName="telemetry-openstack-openstack-cell1"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.120886 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc11177a-f224-4e9d-9b44-92dd6741bb1c" containerName="telemetry-openstack-openstack-cell1"
Feb 26 18:04:28 crc kubenswrapper[5033]: E0226 18:04:28.120917 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189975a7-197f-47c4-8cab-6d320c957d27" containerName="oc"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.120930 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="189975a7-197f-47c4-8cab-6d320c957d27" containerName="oc"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.121245 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc11177a-f224-4e9d-9b44-92dd6741bb1c" containerName="telemetry-openstack-openstack-cell1"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.121328 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="189975a7-197f-47c4-8cab-6d320c957d27" containerName="oc"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.125419 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.129002 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.129396 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.129718 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.129830 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.130080 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.157129 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-6txrj"]
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.252463 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.252534 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq9hn\" (UniqueName: \"kubernetes.io/projected/0aece9f3-88d3-458e-8587-fbb1090e0f29-kube-api-access-jq9hn\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.252565 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.252585 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.252641 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.252786 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj"
Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.354158 5033 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.354205 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.354242 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq9hn\" (UniqueName: \"kubernetes.io/projected/0aece9f3-88d3-458e-8587-fbb1090e0f29-kube-api-access-jq9hn\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.354285 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.354307 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" 
(UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.354352 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.358079 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.358131 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.358987 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.359657 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.360083 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.372488 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq9hn\" (UniqueName: \"kubernetes.io/projected/0aece9f3-88d3-458e-8587-fbb1090e0f29-kube-api-access-jq9hn\") pod \"neutron-sriov-openstack-openstack-cell1-6txrj\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:28 crc kubenswrapper[5033]: I0226 18:04:28.462670 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:04:29 crc kubenswrapper[5033]: W0226 18:04:29.047794 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aece9f3_88d3_458e_8587_fbb1090e0f29.slice/crio-70d9ed5a26eccfdbce0e20d74571e0c730df72ea3742390ee363eca4bfae7017 WatchSource:0}: Error finding container 70d9ed5a26eccfdbce0e20d74571e0c730df72ea3742390ee363eca4bfae7017: Status 404 returned error can't find the container with id 70d9ed5a26eccfdbce0e20d74571e0c730df72ea3742390ee363eca4bfae7017 Feb 26 18:04:29 crc kubenswrapper[5033]: I0226 18:04:29.047894 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-6txrj"] Feb 26 18:04:30 crc kubenswrapper[5033]: I0226 18:04:30.004855 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" event={"ID":"0aece9f3-88d3-458e-8587-fbb1090e0f29","Type":"ContainerStarted","Data":"f4e67be4aaf9fce1d044a7549392ecc28f85f4f12866431773b07610c1ff8cd2"} Feb 26 18:04:30 crc kubenswrapper[5033]: I0226 18:04:30.005395 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" event={"ID":"0aece9f3-88d3-458e-8587-fbb1090e0f29","Type":"ContainerStarted","Data":"70d9ed5a26eccfdbce0e20d74571e0c730df72ea3742390ee363eca4bfae7017"} Feb 26 18:04:30 crc kubenswrapper[5033]: I0226 18:04:30.033242 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" podStartSLOduration=1.5170571160000001 podStartE2EDuration="2.033225439s" podCreationTimestamp="2026-02-26 18:04:28 +0000 UTC" firstStartedPulling="2026-02-26 18:04:29.050873482 +0000 UTC m=+8596.263472132" lastFinishedPulling="2026-02-26 18:04:29.567041805 +0000 UTC m=+8596.779640455" observedRunningTime="2026-02-26 
18:04:30.021025182 +0000 UTC m=+8597.233623832" watchObservedRunningTime="2026-02-26 18:04:30.033225439 +0000 UTC m=+8597.245824089" Feb 26 18:04:31 crc kubenswrapper[5033]: I0226 18:04:31.629147 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:04:31 crc kubenswrapper[5033]: E0226 18:04:31.629964 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:04:43 crc kubenswrapper[5033]: I0226 18:04:43.547556 5033 scope.go:117] "RemoveContainer" containerID="c77d6edeaaa2618f97aaf6742ce48adaade5220adf6f02a639819c2d7d1f4b8b" Feb 26 18:04:44 crc kubenswrapper[5033]: I0226 18:04:44.629080 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:04:44 crc kubenswrapper[5033]: E0226 18:04:44.631080 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:04:59 crc kubenswrapper[5033]: I0226 18:04:59.628283 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:04:59 crc kubenswrapper[5033]: E0226 18:04:59.629355 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:05:11 crc kubenswrapper[5033]: I0226 18:05:11.628241 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:05:11 crc kubenswrapper[5033]: E0226 18:05:11.629523 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:05:25 crc kubenswrapper[5033]: I0226 18:05:25.632893 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:05:25 crc kubenswrapper[5033]: E0226 18:05:25.633771 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:05:32 crc kubenswrapper[5033]: I0226 18:05:32.073024 5033 generic.go:334] "Generic (PLEG): container finished" podID="0aece9f3-88d3-458e-8587-fbb1090e0f29" containerID="f4e67be4aaf9fce1d044a7549392ecc28f85f4f12866431773b07610c1ff8cd2" exitCode=0 Feb 26 18:05:32 crc kubenswrapper[5033]: I0226 18:05:32.073113 
5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" event={"ID":"0aece9f3-88d3-458e-8587-fbb1090e0f29","Type":"ContainerDied","Data":"f4e67be4aaf9fce1d044a7549392ecc28f85f4f12866431773b07610c1ff8cd2"} Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.621761 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.791180 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ssh-key-openstack-cell1\") pod \"0aece9f3-88d3-458e-8587-fbb1090e0f29\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.791391 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ceph\") pod \"0aece9f3-88d3-458e-8587-fbb1090e0f29\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.791492 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-inventory\") pod \"0aece9f3-88d3-458e-8587-fbb1090e0f29\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.791671 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-combined-ca-bundle\") pod \"0aece9f3-88d3-458e-8587-fbb1090e0f29\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.791759 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-agent-neutron-config-0\") pod \"0aece9f3-88d3-458e-8587-fbb1090e0f29\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.791876 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq9hn\" (UniqueName: \"kubernetes.io/projected/0aece9f3-88d3-458e-8587-fbb1090e0f29-kube-api-access-jq9hn\") pod \"0aece9f3-88d3-458e-8587-fbb1090e0f29\" (UID: \"0aece9f3-88d3-458e-8587-fbb1090e0f29\") " Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.798529 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ceph" (OuterVolumeSpecName: "ceph") pod "0aece9f3-88d3-458e-8587-fbb1090e0f29" (UID: "0aece9f3-88d3-458e-8587-fbb1090e0f29"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.803472 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aece9f3-88d3-458e-8587-fbb1090e0f29-kube-api-access-jq9hn" (OuterVolumeSpecName: "kube-api-access-jq9hn") pod "0aece9f3-88d3-458e-8587-fbb1090e0f29" (UID: "0aece9f3-88d3-458e-8587-fbb1090e0f29"). InnerVolumeSpecName "kube-api-access-jq9hn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.804390 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "0aece9f3-88d3-458e-8587-fbb1090e0f29" (UID: "0aece9f3-88d3-458e-8587-fbb1090e0f29"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.821448 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-inventory" (OuterVolumeSpecName: "inventory") pod "0aece9f3-88d3-458e-8587-fbb1090e0f29" (UID: "0aece9f3-88d3-458e-8587-fbb1090e0f29"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.822962 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "0aece9f3-88d3-458e-8587-fbb1090e0f29" (UID: "0aece9f3-88d3-458e-8587-fbb1090e0f29"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.827800 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0aece9f3-88d3-458e-8587-fbb1090e0f29" (UID: "0aece9f3-88d3-458e-8587-fbb1090e0f29"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.895698 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.895734 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.895749 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq9hn\" (UniqueName: \"kubernetes.io/projected/0aece9f3-88d3-458e-8587-fbb1090e0f29-kube-api-access-jq9hn\") on node \"crc\" DevicePath \"\"" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.895761 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.895770 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 18:05:33 crc kubenswrapper[5033]: I0226 18:05:33.895800 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0aece9f3-88d3-458e-8587-fbb1090e0f29-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.092622 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" 
event={"ID":"0aece9f3-88d3-458e-8587-fbb1090e0f29","Type":"ContainerDied","Data":"70d9ed5a26eccfdbce0e20d74571e0c730df72ea3742390ee363eca4bfae7017"} Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.092669 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70d9ed5a26eccfdbce0e20d74571e0c730df72ea3742390ee363eca4bfae7017" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.092701 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6txrj" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.188724 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-b6psf"] Feb 26 18:05:34 crc kubenswrapper[5033]: E0226 18:05:34.189197 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aece9f3-88d3-458e-8587-fbb1090e0f29" containerName="neutron-sriov-openstack-openstack-cell1" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.189216 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aece9f3-88d3-458e-8587-fbb1090e0f29" containerName="neutron-sriov-openstack-openstack-cell1" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.189415 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aece9f3-88d3-458e-8587-fbb1090e0f29" containerName="neutron-sriov-openstack-openstack-cell1" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.190163 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.199443 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.200495 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.200565 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.200660 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.200775 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: 
\"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.200850 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zfbp\" (UniqueName: \"kubernetes.io/projected/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-kube-api-access-4zfbp\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.201124 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.202388 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.202543 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.202557 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.202926 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.227083 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-b6psf"] Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.302268 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.302365 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.302400 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zfbp\" (UniqueName: \"kubernetes.io/projected/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-kube-api-access-4zfbp\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.302436 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.302463 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 
18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.302485 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.307310 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.307700 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.308094 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.308235 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-combined-ca-bundle\") pod 
\"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.309381 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.325615 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zfbp\" (UniqueName: \"kubernetes.io/projected/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-kube-api-access-4zfbp\") pod \"neutron-dhcp-openstack-openstack-cell1-b6psf\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:34 crc kubenswrapper[5033]: I0226 18:05:34.511625 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:05:35 crc kubenswrapper[5033]: I0226 18:05:35.055519 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-b6psf"] Feb 26 18:05:35 crc kubenswrapper[5033]: I0226 18:05:35.104249 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" event={"ID":"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397","Type":"ContainerStarted","Data":"04bda00d1f3af0905b1b542851ee0574abbb21c8c075f19f0cb6a762de2d2ff8"} Feb 26 18:05:36 crc kubenswrapper[5033]: I0226 18:05:36.116368 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" event={"ID":"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397","Type":"ContainerStarted","Data":"037f9ec2cc0de9b552f7ccc35983e8b8eaac1fa546b690c8744623f8df5ec208"} Feb 26 18:05:36 crc kubenswrapper[5033]: I0226 18:05:36.144334 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" podStartSLOduration=1.6601395079999999 podStartE2EDuration="2.144317421s" podCreationTimestamp="2026-02-26 18:05:34 +0000 UTC" firstStartedPulling="2026-02-26 18:05:35.06613448 +0000 UTC m=+8662.278733130" lastFinishedPulling="2026-02-26 18:05:35.550312383 +0000 UTC m=+8662.762911043" observedRunningTime="2026-02-26 18:05:36.132301818 +0000 UTC m=+8663.344900518" watchObservedRunningTime="2026-02-26 18:05:36.144317421 +0000 UTC m=+8663.356916071" Feb 26 18:05:36 crc kubenswrapper[5033]: I0226 18:05:36.630126 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:05:36 crc kubenswrapper[5033]: E0226 18:05:36.632610 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:05:51 crc kubenswrapper[5033]: I0226 18:05:51.627692 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:05:51 crc kubenswrapper[5033]: E0226 18:05:51.628444 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:06:00 crc kubenswrapper[5033]: I0226 18:06:00.156073 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535486-6npdb"] Feb 26 18:06:00 crc kubenswrapper[5033]: I0226 18:06:00.158424 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535486-6npdb" Feb 26 18:06:00 crc kubenswrapper[5033]: I0226 18:06:00.161505 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:06:00 crc kubenswrapper[5033]: I0226 18:06:00.161618 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:06:00 crc kubenswrapper[5033]: I0226 18:06:00.162530 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:06:00 crc kubenswrapper[5033]: I0226 18:06:00.176396 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535486-6npdb"] Feb 26 18:06:00 crc kubenswrapper[5033]: I0226 18:06:00.310741 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxs26\" (UniqueName: \"kubernetes.io/projected/eb559c1b-f454-48a4-9afa-1b52c3a221b2-kube-api-access-mxs26\") pod \"auto-csr-approver-29535486-6npdb\" (UID: \"eb559c1b-f454-48a4-9afa-1b52c3a221b2\") " pod="openshift-infra/auto-csr-approver-29535486-6npdb" Feb 26 18:06:00 crc kubenswrapper[5033]: I0226 18:06:00.413151 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxs26\" (UniqueName: \"kubernetes.io/projected/eb559c1b-f454-48a4-9afa-1b52c3a221b2-kube-api-access-mxs26\") pod \"auto-csr-approver-29535486-6npdb\" (UID: \"eb559c1b-f454-48a4-9afa-1b52c3a221b2\") " pod="openshift-infra/auto-csr-approver-29535486-6npdb" Feb 26 18:06:00 crc kubenswrapper[5033]: I0226 18:06:00.432313 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxs26\" (UniqueName: \"kubernetes.io/projected/eb559c1b-f454-48a4-9afa-1b52c3a221b2-kube-api-access-mxs26\") pod \"auto-csr-approver-29535486-6npdb\" (UID: \"eb559c1b-f454-48a4-9afa-1b52c3a221b2\") " 
pod="openshift-infra/auto-csr-approver-29535486-6npdb" Feb 26 18:06:00 crc kubenswrapper[5033]: I0226 18:06:00.482843 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535486-6npdb" Feb 26 18:06:01 crc kubenswrapper[5033]: I0226 18:06:01.024612 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535486-6npdb"] Feb 26 18:06:01 crc kubenswrapper[5033]: I0226 18:06:01.653943 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535486-6npdb" event={"ID":"eb559c1b-f454-48a4-9afa-1b52c3a221b2","Type":"ContainerStarted","Data":"3ddd4ed931011a91fe1bb2754e0c13442e4869e8175faf4ae853178278ebdbb9"} Feb 26 18:06:02 crc kubenswrapper[5033]: I0226 18:06:02.665184 5033 generic.go:334] "Generic (PLEG): container finished" podID="eb559c1b-f454-48a4-9afa-1b52c3a221b2" containerID="3eead5fbf24f5f600fc14ce245509c7ebb8364b66a6b3f1d533c274e5b84dbd6" exitCode=0 Feb 26 18:06:02 crc kubenswrapper[5033]: I0226 18:06:02.665293 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535486-6npdb" event={"ID":"eb559c1b-f454-48a4-9afa-1b52c3a221b2","Type":"ContainerDied","Data":"3eead5fbf24f5f600fc14ce245509c7ebb8364b66a6b3f1d533c274e5b84dbd6"} Feb 26 18:06:04 crc kubenswrapper[5033]: I0226 18:06:04.063201 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535486-6npdb" Feb 26 18:06:04 crc kubenswrapper[5033]: I0226 18:06:04.202764 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxs26\" (UniqueName: \"kubernetes.io/projected/eb559c1b-f454-48a4-9afa-1b52c3a221b2-kube-api-access-mxs26\") pod \"eb559c1b-f454-48a4-9afa-1b52c3a221b2\" (UID: \"eb559c1b-f454-48a4-9afa-1b52c3a221b2\") " Feb 26 18:06:04 crc kubenswrapper[5033]: I0226 18:06:04.207903 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb559c1b-f454-48a4-9afa-1b52c3a221b2-kube-api-access-mxs26" (OuterVolumeSpecName: "kube-api-access-mxs26") pod "eb559c1b-f454-48a4-9afa-1b52c3a221b2" (UID: "eb559c1b-f454-48a4-9afa-1b52c3a221b2"). InnerVolumeSpecName "kube-api-access-mxs26". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:06:04 crc kubenswrapper[5033]: I0226 18:06:04.305486 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxs26\" (UniqueName: \"kubernetes.io/projected/eb559c1b-f454-48a4-9afa-1b52c3a221b2-kube-api-access-mxs26\") on node \"crc\" DevicePath \"\"" Feb 26 18:06:04 crc kubenswrapper[5033]: I0226 18:06:04.686729 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535486-6npdb" event={"ID":"eb559c1b-f454-48a4-9afa-1b52c3a221b2","Type":"ContainerDied","Data":"3ddd4ed931011a91fe1bb2754e0c13442e4869e8175faf4ae853178278ebdbb9"} Feb 26 18:06:04 crc kubenswrapper[5033]: I0226 18:06:04.686768 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ddd4ed931011a91fe1bb2754e0c13442e4869e8175faf4ae853178278ebdbb9" Feb 26 18:06:04 crc kubenswrapper[5033]: I0226 18:06:04.686788 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535486-6npdb" Feb 26 18:06:05 crc kubenswrapper[5033]: I0226 18:06:05.147831 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535480-lkv9r"] Feb 26 18:06:05 crc kubenswrapper[5033]: I0226 18:06:05.159857 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535480-lkv9r"] Feb 26 18:06:05 crc kubenswrapper[5033]: I0226 18:06:05.630416 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:06:05 crc kubenswrapper[5033]: E0226 18:06:05.630769 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:06:05 crc kubenswrapper[5033]: I0226 18:06:05.643137 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e38a49f6-054d-4810-ba49-48bc9ad5d5fe" path="/var/lib/kubelet/pods/e38a49f6-054d-4810-ba49-48bc9ad5d5fe/volumes" Feb 26 18:06:17 crc kubenswrapper[5033]: I0226 18:06:17.628873 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:06:17 crc kubenswrapper[5033]: E0226 18:06:17.629818 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:06:31 crc kubenswrapper[5033]: I0226 18:06:31.629470 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:06:31 crc kubenswrapper[5033]: E0226 18:06:31.631006 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:06:43 crc kubenswrapper[5033]: I0226 18:06:43.662049 5033 scope.go:117] "RemoveContainer" containerID="debbf59216d9df949748f823c0eeb4fa587c3004ba88c5b40907c7cb1c7ade50" Feb 26 18:06:45 crc kubenswrapper[5033]: I0226 18:06:45.629104 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:06:45 crc kubenswrapper[5033]: E0226 18:06:45.629753 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:06:48 crc kubenswrapper[5033]: I0226 18:06:48.145112 5033 generic.go:334] "Generic (PLEG): container finished" podID="8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397" containerID="037f9ec2cc0de9b552f7ccc35983e8b8eaac1fa546b690c8744623f8df5ec208" exitCode=0 Feb 26 18:06:48 crc kubenswrapper[5033]: I0226 18:06:48.145209 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" event={"ID":"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397","Type":"ContainerDied","Data":"037f9ec2cc0de9b552f7ccc35983e8b8eaac1fa546b690c8744623f8df5ec208"} Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.690195 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.873159 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-inventory\") pod \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.873390 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zfbp\" (UniqueName: \"kubernetes.io/projected/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-kube-api-access-4zfbp\") pod \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.873518 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-agent-neutron-config-0\") pod \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.873545 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ssh-key-openstack-cell1\") pod \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.873603 5033 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ceph\") pod \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.873693 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-combined-ca-bundle\") pod \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\" (UID: \"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397\") " Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.884417 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ceph" (OuterVolumeSpecName: "ceph") pod "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397" (UID: "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.884465 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397" (UID: "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.888823 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-kube-api-access-4zfbp" (OuterVolumeSpecName: "kube-api-access-4zfbp") pod "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397" (UID: "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397"). InnerVolumeSpecName "kube-api-access-4zfbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.905430 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-inventory" (OuterVolumeSpecName: "inventory") pod "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397" (UID: "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.905793 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397" (UID: "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.907687 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397" (UID: "8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.980039 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zfbp\" (UniqueName: \"kubernetes.io/projected/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-kube-api-access-4zfbp\") on node \"crc\" DevicePath \"\"" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.980080 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.980093 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.980103 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-ceph\") on node \"crc\" DevicePath \"\"" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.980115 5033 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 18:06:49 crc kubenswrapper[5033]: I0226 18:06:49.980127 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 18:06:50 crc kubenswrapper[5033]: I0226 18:06:50.165085 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" 
event={"ID":"8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397","Type":"ContainerDied","Data":"04bda00d1f3af0905b1b542851ee0574abbb21c8c075f19f0cb6a762de2d2ff8"} Feb 26 18:06:50 crc kubenswrapper[5033]: I0226 18:06:50.165398 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04bda00d1f3af0905b1b542851ee0574abbb21c8c075f19f0cb6a762de2d2ff8" Feb 26 18:06:50 crc kubenswrapper[5033]: I0226 18:06:50.165492 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-b6psf" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.128614 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5rwwc"] Feb 26 18:06:54 crc kubenswrapper[5033]: E0226 18:06:54.129684 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb559c1b-f454-48a4-9afa-1b52c3a221b2" containerName="oc" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.129699 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb559c1b-f454-48a4-9afa-1b52c3a221b2" containerName="oc" Feb 26 18:06:54 crc kubenswrapper[5033]: E0226 18:06:54.129720 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.129726 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.129935 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.129960 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb559c1b-f454-48a4-9afa-1b52c3a221b2" containerName="oc" Feb 26 18:06:54 crc 
kubenswrapper[5033]: I0226 18:06:54.133981 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.150002 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rwwc"] Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.176819 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-utilities\") pod \"redhat-marketplace-5rwwc\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.177042 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-catalog-content\") pod \"redhat-marketplace-5rwwc\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.177118 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsl66\" (UniqueName: \"kubernetes.io/projected/f6f61039-d98f-477f-abc4-685db9784bb3-kube-api-access-dsl66\") pod \"redhat-marketplace-5rwwc\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.279613 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-utilities\") pod \"redhat-marketplace-5rwwc\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:06:54 crc 
kubenswrapper[5033]: I0226 18:06:54.279962 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-catalog-content\") pod \"redhat-marketplace-5rwwc\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.279981 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsl66\" (UniqueName: \"kubernetes.io/projected/f6f61039-d98f-477f-abc4-685db9784bb3-kube-api-access-dsl66\") pod \"redhat-marketplace-5rwwc\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.280393 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-utilities\") pod \"redhat-marketplace-5rwwc\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.280601 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-catalog-content\") pod \"redhat-marketplace-5rwwc\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.299008 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsl66\" (UniqueName: \"kubernetes.io/projected/f6f61039-d98f-477f-abc4-685db9784bb3-kube-api-access-dsl66\") pod \"redhat-marketplace-5rwwc\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 
18:06:54.466887 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:06:54 crc kubenswrapper[5033]: I0226 18:06:54.961976 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rwwc"] Feb 26 18:06:55 crc kubenswrapper[5033]: I0226 18:06:55.223114 5033 generic.go:334] "Generic (PLEG): container finished" podID="f6f61039-d98f-477f-abc4-685db9784bb3" containerID="7a8907b1299c9d83be6017d9ca20ae322cf002c8c257a060cc1c617439e2d867" exitCode=0 Feb 26 18:06:55 crc kubenswrapper[5033]: I0226 18:06:55.223306 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwwc" event={"ID":"f6f61039-d98f-477f-abc4-685db9784bb3","Type":"ContainerDied","Data":"7a8907b1299c9d83be6017d9ca20ae322cf002c8c257a060cc1c617439e2d867"} Feb 26 18:06:55 crc kubenswrapper[5033]: I0226 18:06:55.223417 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwwc" event={"ID":"f6f61039-d98f-477f-abc4-685db9784bb3","Type":"ContainerStarted","Data":"44d9754ecca0a288e484c93e73da49bda0e363300872109bd75fc5d660c8cab4"} Feb 26 18:06:55 crc kubenswrapper[5033]: I0226 18:06:55.226599 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 18:06:56 crc kubenswrapper[5033]: I0226 18:06:56.243147 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwwc" event={"ID":"f6f61039-d98f-477f-abc4-685db9784bb3","Type":"ContainerStarted","Data":"68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e"} Feb 26 18:06:57 crc kubenswrapper[5033]: I0226 18:06:57.260321 5033 generic.go:334] "Generic (PLEG): container finished" podID="f6f61039-d98f-477f-abc4-685db9784bb3" containerID="68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e" exitCode=0 Feb 26 18:06:57 crc kubenswrapper[5033]: 
I0226 18:06:57.260375 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwwc" event={"ID":"f6f61039-d98f-477f-abc4-685db9784bb3","Type":"ContainerDied","Data":"68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e"} Feb 26 18:06:58 crc kubenswrapper[5033]: I0226 18:06:58.271533 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwwc" event={"ID":"f6f61039-d98f-477f-abc4-685db9784bb3","Type":"ContainerStarted","Data":"4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8"} Feb 26 18:06:58 crc kubenswrapper[5033]: I0226 18:06:58.295947 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5rwwc" podStartSLOduration=1.752162229 podStartE2EDuration="4.29592668s" podCreationTimestamp="2026-02-26 18:06:54 +0000 UTC" firstStartedPulling="2026-02-26 18:06:55.226361312 +0000 UTC m=+8742.438959962" lastFinishedPulling="2026-02-26 18:06:57.770125723 +0000 UTC m=+8744.982724413" observedRunningTime="2026-02-26 18:06:58.288556311 +0000 UTC m=+8745.501154971" watchObservedRunningTime="2026-02-26 18:06:58.29592668 +0000 UTC m=+8745.508525340" Feb 26 18:07:00 crc kubenswrapper[5033]: I0226 18:07:00.629518 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:07:00 crc kubenswrapper[5033]: E0226 18:07:00.630126 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:07:03 crc kubenswrapper[5033]: I0226 18:07:03.320499 5033 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 18:07:03 crc kubenswrapper[5033]: I0226 18:07:03.321295 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="229c2ab4-2387-4058-b422-f10ad64325bf" containerName="nova-cell0-conductor-conductor" containerID="cri-o://27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3" gracePeriod=30 Feb 26 18:07:03 crc kubenswrapper[5033]: I0226 18:07:03.819772 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 18:07:03 crc kubenswrapper[5033]: I0226 18:07:03.820071 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="98ee9213-7874-4f09-8b6f-5b4361f24147" containerName="nova-cell1-conductor-conductor" containerID="cri-o://68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc" gracePeriod=30 Feb 26 18:07:03 crc kubenswrapper[5033]: I0226 18:07:03.962526 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 18:07:03 crc kubenswrapper[5033]: I0226 18:07:03.962966 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerName="nova-api-api" containerID="cri-o://cd5bb9ba709f7c053bfba60d1708532c94c2591e5dcb65ee3855c293b02abb70" gracePeriod=30 Feb 26 18:07:03 crc kubenswrapper[5033]: I0226 18:07:03.962982 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerName="nova-api-log" containerID="cri-o://c2479315d0e5d67819319cc183d9efdf68960a1b9cf36223c878fd2494275e5a" gracePeriod=30 Feb 26 18:07:03 crc kubenswrapper[5033]: I0226 18:07:03.981444 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 18:07:03 crc kubenswrapper[5033]: 
I0226 18:07:03.981676 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6dbd74aa-2791-4724-946a-21b175fdedd6" containerName="nova-scheduler-scheduler" containerID="cri-o://28cb606fda414c24a338470fdf12da2bbb4e68d9c9794d8b44e088c1b467cb96" gracePeriod=30 Feb 26 18:07:04 crc kubenswrapper[5033]: I0226 18:07:04.029242 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 18:07:04 crc kubenswrapper[5033]: I0226 18:07:04.030335 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-log" containerID="cri-o://c1ebd84ed8a59c6a3d9052f764e3fd0bfea732cc448fc55d840de3366cf78e85" gracePeriod=30 Feb 26 18:07:04 crc kubenswrapper[5033]: I0226 18:07:04.030688 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-metadata" containerID="cri-o://44eda056281b096957a0d89db6856d952952d71fddfa9946b5125358e77a3722" gracePeriod=30 Feb 26 18:07:04 crc kubenswrapper[5033]: I0226 18:07:04.342313 5033 generic.go:334] "Generic (PLEG): container finished" podID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerID="c1ebd84ed8a59c6a3d9052f764e3fd0bfea732cc448fc55d840de3366cf78e85" exitCode=143 Feb 26 18:07:04 crc kubenswrapper[5033]: I0226 18:07:04.342377 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbb98341-a8eb-4bed-94ed-c70daa525734","Type":"ContainerDied","Data":"c1ebd84ed8a59c6a3d9052f764e3fd0bfea732cc448fc55d840de3366cf78e85"} Feb 26 18:07:04 crc kubenswrapper[5033]: I0226 18:07:04.344647 5033 generic.go:334] "Generic (PLEG): container finished" podID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerID="c2479315d0e5d67819319cc183d9efdf68960a1b9cf36223c878fd2494275e5a" exitCode=143 Feb 
26 18:07:04 crc kubenswrapper[5033]: I0226 18:07:04.344671 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b24d004-b104-4214-a60d-dfc3a02c0110","Type":"ContainerDied","Data":"c2479315d0e5d67819319cc183d9efdf68960a1b9cf36223c878fd2494275e5a"} Feb 26 18:07:04 crc kubenswrapper[5033]: E0226 18:07:04.447831 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 26 18:07:04 crc kubenswrapper[5033]: E0226 18:07:04.449153 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 26 18:07:04 crc kubenswrapper[5033]: E0226 18:07:04.450188 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 26 18:07:04 crc kubenswrapper[5033]: E0226 18:07:04.450217 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="229c2ab4-2387-4058-b422-f10ad64325bf" containerName="nova-cell0-conductor-conductor" Feb 26 18:07:04 crc kubenswrapper[5033]: I0226 18:07:04.467366 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:07:04 crc kubenswrapper[5033]: I0226 18:07:04.467466 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:07:04 crc kubenswrapper[5033]: I0226 18:07:04.516671 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:07:05 crc kubenswrapper[5033]: I0226 18:07:05.408847 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:07:05 crc kubenswrapper[5033]: I0226 18:07:05.473287 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rwwc"] Feb 26 18:07:05 crc kubenswrapper[5033]: E0226 18:07:05.513951 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28cb606fda414c24a338470fdf12da2bbb4e68d9c9794d8b44e088c1b467cb96" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 18:07:05 crc kubenswrapper[5033]: E0226 18:07:05.517391 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28cb606fda414c24a338470fdf12da2bbb4e68d9c9794d8b44e088c1b467cb96" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 18:07:05 crc kubenswrapper[5033]: E0226 18:07:05.519456 5033 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28cb606fda414c24a338470fdf12da2bbb4e68d9c9794d8b44e088c1b467cb96" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 18:07:05 crc 
kubenswrapper[5033]: E0226 18:07:05.519510 5033 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6dbd74aa-2791-4724-946a-21b175fdedd6" containerName="nova-scheduler-scheduler" Feb 26 18:07:05 crc kubenswrapper[5033]: I0226 18:07:05.908652 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.045030 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc95t\" (UniqueName: \"kubernetes.io/projected/98ee9213-7874-4f09-8b6f-5b4361f24147-kube-api-access-mc95t\") pod \"98ee9213-7874-4f09-8b6f-5b4361f24147\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.045169 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-config-data\") pod \"98ee9213-7874-4f09-8b6f-5b4361f24147\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.045332 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-combined-ca-bundle\") pod \"98ee9213-7874-4f09-8b6f-5b4361f24147\" (UID: \"98ee9213-7874-4f09-8b6f-5b4361f24147\") " Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.061246 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98ee9213-7874-4f09-8b6f-5b4361f24147-kube-api-access-mc95t" (OuterVolumeSpecName: "kube-api-access-mc95t") pod "98ee9213-7874-4f09-8b6f-5b4361f24147" (UID: "98ee9213-7874-4f09-8b6f-5b4361f24147"). 
InnerVolumeSpecName "kube-api-access-mc95t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.091606 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98ee9213-7874-4f09-8b6f-5b4361f24147" (UID: "98ee9213-7874-4f09-8b6f-5b4361f24147"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.092083 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-config-data" (OuterVolumeSpecName: "config-data") pod "98ee9213-7874-4f09-8b6f-5b4361f24147" (UID: "98ee9213-7874-4f09-8b6f-5b4361f24147"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.148325 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc95t\" (UniqueName: \"kubernetes.io/projected/98ee9213-7874-4f09-8b6f-5b4361f24147-kube-api-access-mc95t\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.148371 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.148384 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ee9213-7874-4f09-8b6f-5b4361f24147-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.371568 5033 generic.go:334] "Generic (PLEG): container finished" podID="98ee9213-7874-4f09-8b6f-5b4361f24147" 
containerID="68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc" exitCode=0 Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.371632 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.371682 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"98ee9213-7874-4f09-8b6f-5b4361f24147","Type":"ContainerDied","Data":"68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc"} Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.371748 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"98ee9213-7874-4f09-8b6f-5b4361f24147","Type":"ContainerDied","Data":"53d8d45194b746ee54964b2bb8180e54fd080427c2b78e4144e4aac6b9c6078c"} Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.371774 5033 scope.go:117] "RemoveContainer" containerID="68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.421393 5033 scope.go:117] "RemoveContainer" containerID="68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc" Feb 26 18:07:06 crc kubenswrapper[5033]: E0226 18:07:06.421876 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc\": container with ID starting with 68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc not found: ID does not exist" containerID="68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.421908 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc"} err="failed to get container status 
\"68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc\": rpc error: code = NotFound desc = could not find container \"68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc\": container with ID starting with 68d3abea13374a7445a73486b84601f4b99bcdc9e40c482f2cc8cbf585dc94cc not found: ID does not exist" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.445776 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.461314 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.473534 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 18:07:06 crc kubenswrapper[5033]: E0226 18:07:06.474213 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98ee9213-7874-4f09-8b6f-5b4361f24147" containerName="nova-cell1-conductor-conductor" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.474242 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ee9213-7874-4f09-8b6f-5b4361f24147" containerName="nova-cell1-conductor-conductor" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.474642 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="98ee9213-7874-4f09-8b6f-5b4361f24147" containerName="nova-cell1-conductor-conductor" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.475850 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.478437 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.483380 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.661724 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e\") " pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.663038 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn6wj\" (UniqueName: \"kubernetes.io/projected/b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e-kube-api-access-sn6wj\") pod \"nova-cell1-conductor-0\" (UID: \"b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e\") " pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.664773 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e\") " pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.768439 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6wj\" (UniqueName: \"kubernetes.io/projected/b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e-kube-api-access-sn6wj\") pod \"nova-cell1-conductor-0\" (UID: \"b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e\") " pod="openstack/nova-cell1-conductor-0" Feb 26 
18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.768539 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e\") " pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.768742 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e\") " pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.774619 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e\") " pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.774971 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e\") " pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.788086 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn6wj\" (UniqueName: \"kubernetes.io/projected/b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e-kube-api-access-sn6wj\") pod \"nova-cell1-conductor-0\" (UID: \"b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e\") " pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:06 crc kubenswrapper[5033]: I0226 18:07:06.800779 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.151016 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.120:8775/\": read tcp 10.217.0.2:33428->10.217.1.120:8775: read: connection reset by peer" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.151065 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.120:8775/\": read tcp 10.217.0.2:33440->10.217.1.120:8775: read: connection reset by peer" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.311557 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.389117 5033 generic.go:334] "Generic (PLEG): container finished" podID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerID="cd5bb9ba709f7c053bfba60d1708532c94c2591e5dcb65ee3855c293b02abb70" exitCode=0 Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.389192 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b24d004-b104-4214-a60d-dfc3a02c0110","Type":"ContainerDied","Data":"cd5bb9ba709f7c053bfba60d1708532c94c2591e5dcb65ee3855c293b02abb70"} Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.392991 5033 generic.go:334] "Generic (PLEG): container finished" podID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerID="44eda056281b096957a0d89db6856d952952d71fddfa9946b5125358e77a3722" exitCode=0 Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.393175 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5rwwc" 
podUID="f6f61039-d98f-477f-abc4-685db9784bb3" containerName="registry-server" containerID="cri-o://4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8" gracePeriod=2 Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.393456 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbb98341-a8eb-4bed-94ed-c70daa525734","Type":"ContainerDied","Data":"44eda056281b096957a0d89db6856d952952d71fddfa9946b5125358e77a3722"} Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.575574 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.587451 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbb98341-a8eb-4bed-94ed-c70daa525734-logs\") pod \"dbb98341-a8eb-4bed-94ed-c70daa525734\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.587579 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-combined-ca-bundle\") pod \"dbb98341-a8eb-4bed-94ed-c70daa525734\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.587603 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78wzc\" (UniqueName: \"kubernetes.io/projected/dbb98341-a8eb-4bed-94ed-c70daa525734-kube-api-access-78wzc\") pod \"dbb98341-a8eb-4bed-94ed-c70daa525734\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.587643 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-config-data\") pod 
\"dbb98341-a8eb-4bed-94ed-c70daa525734\" (UID: \"dbb98341-a8eb-4bed-94ed-c70daa525734\") " Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.588713 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbb98341-a8eb-4bed-94ed-c70daa525734-logs" (OuterVolumeSpecName: "logs") pod "dbb98341-a8eb-4bed-94ed-c70daa525734" (UID: "dbb98341-a8eb-4bed-94ed-c70daa525734"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.591736 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb98341-a8eb-4bed-94ed-c70daa525734-kube-api-access-78wzc" (OuterVolumeSpecName: "kube-api-access-78wzc") pod "dbb98341-a8eb-4bed-94ed-c70daa525734" (UID: "dbb98341-a8eb-4bed-94ed-c70daa525734"). InnerVolumeSpecName "kube-api-access-78wzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.637284 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbb98341-a8eb-4bed-94ed-c70daa525734" (UID: "dbb98341-a8eb-4bed-94ed-c70daa525734"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.651041 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98ee9213-7874-4f09-8b6f-5b4361f24147" path="/var/lib/kubelet/pods/98ee9213-7874-4f09-8b6f-5b4361f24147/volumes" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.691922 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbb98341-a8eb-4bed-94ed-c70daa525734-logs\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.691959 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.691972 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78wzc\" (UniqueName: \"kubernetes.io/projected/dbb98341-a8eb-4bed-94ed-c70daa525734-kube-api-access-78wzc\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.699853 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-config-data" (OuterVolumeSpecName: "config-data") pod "dbb98341-a8eb-4bed-94ed-c70daa525734" (UID: "dbb98341-a8eb-4bed-94ed-c70daa525734"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.793810 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb98341-a8eb-4bed-94ed-c70daa525734-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.956044 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rwwc" Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.997023 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-utilities\") pod \"f6f61039-d98f-477f-abc4-685db9784bb3\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.997300 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-catalog-content\") pod \"f6f61039-d98f-477f-abc4-685db9784bb3\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.997339 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsl66\" (UniqueName: \"kubernetes.io/projected/f6f61039-d98f-477f-abc4-685db9784bb3-kube-api-access-dsl66\") pod \"f6f61039-d98f-477f-abc4-685db9784bb3\" (UID: \"f6f61039-d98f-477f-abc4-685db9784bb3\") " Feb 26 18:07:07 crc kubenswrapper[5033]: I0226 18:07:07.999137 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-utilities" (OuterVolumeSpecName: "utilities") pod "f6f61039-d98f-477f-abc4-685db9784bb3" (UID: "f6f61039-d98f-477f-abc4-685db9784bb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.003404 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f61039-d98f-477f-abc4-685db9784bb3-kube-api-access-dsl66" (OuterVolumeSpecName: "kube-api-access-dsl66") pod "f6f61039-d98f-477f-abc4-685db9784bb3" (UID: "f6f61039-d98f-477f-abc4-685db9784bb3"). InnerVolumeSpecName "kube-api-access-dsl66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.024486 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6f61039-d98f-477f-abc4-685db9784bb3" (UID: "f6f61039-d98f-477f-abc4-685db9784bb3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.099496 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.099530 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsl66\" (UniqueName: \"kubernetes.io/projected/f6f61039-d98f-477f-abc4-685db9784bb3-kube-api-access-dsl66\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.099539 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f61039-d98f-477f-abc4-685db9784bb3-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.208126 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.303608 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfgx4\" (UniqueName: \"kubernetes.io/projected/8b24d004-b104-4214-a60d-dfc3a02c0110-kube-api-access-xfgx4\") pod \"8b24d004-b104-4214-a60d-dfc3a02c0110\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.304177 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b24d004-b104-4214-a60d-dfc3a02c0110-logs\") pod \"8b24d004-b104-4214-a60d-dfc3a02c0110\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.304467 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-combined-ca-bundle\") pod \"8b24d004-b104-4214-a60d-dfc3a02c0110\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.304874 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-config-data\") pod \"8b24d004-b104-4214-a60d-dfc3a02c0110\" (UID: \"8b24d004-b104-4214-a60d-dfc3a02c0110\") " Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.305915 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b24d004-b104-4214-a60d-dfc3a02c0110-logs" (OuterVolumeSpecName: "logs") pod "8b24d004-b104-4214-a60d-dfc3a02c0110" (UID: "8b24d004-b104-4214-a60d-dfc3a02c0110"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.306812 5033 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b24d004-b104-4214-a60d-dfc3a02c0110-logs\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.310643 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b24d004-b104-4214-a60d-dfc3a02c0110-kube-api-access-xfgx4" (OuterVolumeSpecName: "kube-api-access-xfgx4") pod "8b24d004-b104-4214-a60d-dfc3a02c0110" (UID: "8b24d004-b104-4214-a60d-dfc3a02c0110"). InnerVolumeSpecName "kube-api-access-xfgx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.337377 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b24d004-b104-4214-a60d-dfc3a02c0110" (UID: "8b24d004-b104-4214-a60d-dfc3a02c0110"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.346334 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-config-data" (OuterVolumeSpecName: "config-data") pod "8b24d004-b104-4214-a60d-dfc3a02c0110" (UID: "8b24d004-b104-4214-a60d-dfc3a02c0110"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.406837 5033 generic.go:334] "Generic (PLEG): container finished" podID="f6f61039-d98f-477f-abc4-685db9784bb3" containerID="4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8" exitCode=0
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.406923 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5rwwc"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.406932 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwwc" event={"ID":"f6f61039-d98f-477f-abc4-685db9784bb3","Type":"ContainerDied","Data":"4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8"}
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.406967 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5rwwc" event={"ID":"f6f61039-d98f-477f-abc4-685db9784bb3","Type":"ContainerDied","Data":"44d9754ecca0a288e484c93e73da49bda0e363300872109bd75fc5d660c8cab4"}
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.406986 5033 scope.go:117] "RemoveContainer" containerID="4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.410391 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dbb98341-a8eb-4bed-94ed-c70daa525734","Type":"ContainerDied","Data":"3c70690149d1608f112373bfa8efdb0d39482a2ce8a0ea5d097ce22b05dc385e"}
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.410736 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.412417 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e","Type":"ContainerStarted","Data":"768beb62e9e2258cccc334de5a00c6e1ec849d03df405bdb9581d75be90c4e3e"}
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.412456 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e","Type":"ContainerStarted","Data":"52326a124d3d62ee58e34732f479410b3c061da61b0e150f39bbb28038ecff90"}
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.412859 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.414139 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfgx4\" (UniqueName: \"kubernetes.io/projected/8b24d004-b104-4214-a60d-dfc3a02c0110-kube-api-access-xfgx4\") on node \"crc\" DevicePath \"\""
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.414776 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.416104 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b24d004-b104-4214-a60d-dfc3a02c0110-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.417865 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8b24d004-b104-4214-a60d-dfc3a02c0110","Type":"ContainerDied","Data":"0f6732edcd752b96edec5a6c3937b80948f621316ae9626386b4670628986b4f"}
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.418006 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.433867 5033 scope.go:117] "RemoveContainer" containerID="68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.445911 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.445893821 podStartE2EDuration="2.445893821s" podCreationTimestamp="2026-02-26 18:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 18:07:08.435013986 +0000 UTC m=+8755.647612636" watchObservedRunningTime="2026-02-26 18:07:08.445893821 +0000 UTC m=+8755.658492471"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.474484 5033 scope.go:117] "RemoveContainer" containerID="7a8907b1299c9d83be6017d9ca20ae322cf002c8c257a060cc1c617439e2d867"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.490125 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rwwc"]
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.524653 5033 scope.go:117] "RemoveContainer" containerID="4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8"
Feb 26 18:07:08 crc kubenswrapper[5033]: E0226 18:07:08.529787 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8\": container with ID starting with 4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8 not found: ID does not exist" containerID="4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.529845 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8"} err="failed to get container status \"4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8\": rpc error: code = NotFound desc = could not find container \"4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8\": container with ID starting with 4149f56072fa2eae7af3661a1fcf6ea72444439836504bceb166ea8cf208a0c8 not found: ID does not exist"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.529876 5033 scope.go:117] "RemoveContainer" containerID="68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e"
Feb 26 18:07:08 crc kubenswrapper[5033]: E0226 18:07:08.530619 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e\": container with ID starting with 68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e not found: ID does not exist" containerID="68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.530651 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e"} err="failed to get container status \"68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e\": rpc error: code = NotFound desc = could not find container \"68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e\": container with ID starting with 68c968a8f2cd73a045c13fafcdf890d64965174101d022e17890f94a7211f06e not found: ID does not exist"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.530670 5033 scope.go:117] "RemoveContainer" containerID="7a8907b1299c9d83be6017d9ca20ae322cf002c8c257a060cc1c617439e2d867"
Feb 26 18:07:08 crc kubenswrapper[5033]: E0226 18:07:08.530839 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a8907b1299c9d83be6017d9ca20ae322cf002c8c257a060cc1c617439e2d867\": container with ID starting with 7a8907b1299c9d83be6017d9ca20ae322cf002c8c257a060cc1c617439e2d867 not found: ID does not exist" containerID="7a8907b1299c9d83be6017d9ca20ae322cf002c8c257a060cc1c617439e2d867"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.530861 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a8907b1299c9d83be6017d9ca20ae322cf002c8c257a060cc1c617439e2d867"} err="failed to get container status \"7a8907b1299c9d83be6017d9ca20ae322cf002c8c257a060cc1c617439e2d867\": rpc error: code = NotFound desc = could not find container \"7a8907b1299c9d83be6017d9ca20ae322cf002c8c257a060cc1c617439e2d867\": container with ID starting with 7a8907b1299c9d83be6017d9ca20ae322cf002c8c257a060cc1c617439e2d867 not found: ID does not exist"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.530876 5033 scope.go:117] "RemoveContainer" containerID="44eda056281b096957a0d89db6856d952952d71fddfa9946b5125358e77a3722"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.552302 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5rwwc"]
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.566988 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.578512 5033 scope.go:117] "RemoveContainer" containerID="c1ebd84ed8a59c6a3d9052f764e3fd0bfea732cc448fc55d840de3366cf78e85"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.583721 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 18:07:08 crc kubenswrapper[5033]: E0226 18:07:08.589100 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f61039_d98f_477f_abc4_685db9784bb3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbb98341_a8eb_4bed_94ed_c70daa525734.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbb98341_a8eb_4bed_94ed_c70daa525734.slice/crio-3c70690149d1608f112373bfa8efdb0d39482a2ce8a0ea5d097ce22b05dc385e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b24d004_b104_4214_a60d_dfc3a02c0110.slice/crio-0f6732edcd752b96edec5a6c3937b80948f621316ae9626386b4670628986b4f\": RecentStats: unable to find data in memory cache]"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.595757 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.609046 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.611840 5033 scope.go:117] "RemoveContainer" containerID="cd5bb9ba709f7c053bfba60d1708532c94c2591e5dcb65ee3855c293b02abb70"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.625358 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 18:07:08 crc kubenswrapper[5033]: E0226 18:07:08.625936 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-log"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.625951 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-log"
Feb 26 18:07:08 crc kubenswrapper[5033]: E0226 18:07:08.625959 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-metadata"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.625966 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-metadata"
Feb 26 18:07:08 crc kubenswrapper[5033]: E0226 18:07:08.625996 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f61039-d98f-477f-abc4-685db9784bb3" containerName="extract-content"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.626004 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f61039-d98f-477f-abc4-685db9784bb3" containerName="extract-content"
Feb 26 18:07:08 crc kubenswrapper[5033]: E0226 18:07:08.626012 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f61039-d98f-477f-abc4-685db9784bb3" containerName="registry-server"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.626018 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f61039-d98f-477f-abc4-685db9784bb3" containerName="registry-server"
Feb 26 18:07:08 crc kubenswrapper[5033]: E0226 18:07:08.626042 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerName="nova-api-log"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.626048 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerName="nova-api-log"
Feb 26 18:07:08 crc kubenswrapper[5033]: E0226 18:07:08.626062 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerName="nova-api-api"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.626069 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerName="nova-api-api"
Feb 26 18:07:08 crc kubenswrapper[5033]: E0226 18:07:08.626082 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f61039-d98f-477f-abc4-685db9784bb3" containerName="extract-utilities"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.626088 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f61039-d98f-477f-abc4-685db9784bb3" containerName="extract-utilities"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.626361 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-log"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.626386 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f61039-d98f-477f-abc4-685db9784bb3" containerName="registry-server"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.626407 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerName="nova-api-api"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.626431 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" containerName="nova-metadata-metadata"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.626443 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b24d004-b104-4214-a60d-dfc3a02c0110" containerName="nova-api-log"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.627930 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.630598 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.640826 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.656124 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.658196 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.661125 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.661781 5033 scope.go:117] "RemoveContainer" containerID="c2479315d0e5d67819319cc183d9efdf68960a1b9cf36223c878fd2494275e5a"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.670893 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.721459 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c6a87a-6888-4658-883f-bc4b1e2b469a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.721582 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa57611-c06c-499a-ac11-e1c080227fc9-config-data\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.721640 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfrlc\" (UniqueName: \"kubernetes.io/projected/daa57611-c06c-499a-ac11-e1c080227fc9-kube-api-access-vfrlc\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.721668 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c69b\" (UniqueName: \"kubernetes.io/projected/27c6a87a-6888-4658-883f-bc4b1e2b469a-kube-api-access-4c69b\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.721698 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daa57611-c06c-499a-ac11-e1c080227fc9-logs\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.721733 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c6a87a-6888-4658-883f-bc4b1e2b469a-config-data\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.721792 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27c6a87a-6888-4658-883f-bc4b1e2b469a-logs\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.721869 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa57611-c06c-499a-ac11-e1c080227fc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.823753 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c6a87a-6888-4658-883f-bc4b1e2b469a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.826245 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa57611-c06c-499a-ac11-e1c080227fc9-config-data\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.826505 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfrlc\" (UniqueName: \"kubernetes.io/projected/daa57611-c06c-499a-ac11-e1c080227fc9-kube-api-access-vfrlc\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.826672 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c69b\" (UniqueName: \"kubernetes.io/projected/27c6a87a-6888-4658-883f-bc4b1e2b469a-kube-api-access-4c69b\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.826803 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daa57611-c06c-499a-ac11-e1c080227fc9-logs\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.826944 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c6a87a-6888-4658-883f-bc4b1e2b469a-config-data\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.827118 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27c6a87a-6888-4658-883f-bc4b1e2b469a-logs\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.827433 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa57611-c06c-499a-ac11-e1c080227fc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.828710 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27c6a87a-6888-4658-883f-bc4b1e2b469a-logs\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.828717 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daa57611-c06c-499a-ac11-e1c080227fc9-logs\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.835167 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c6a87a-6888-4658-883f-bc4b1e2b469a-config-data\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.837273 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa57611-c06c-499a-ac11-e1c080227fc9-config-data\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.837500 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c6a87a-6888-4658-883f-bc4b1e2b469a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.838816 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa57611-c06c-499a-ac11-e1c080227fc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.844651 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfrlc\" (UniqueName: \"kubernetes.io/projected/daa57611-c06c-499a-ac11-e1c080227fc9-kube-api-access-vfrlc\") pod \"nova-api-0\" (UID: \"daa57611-c06c-499a-ac11-e1c080227fc9\") " pod="openstack/nova-api-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.848097 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c69b\" (UniqueName: \"kubernetes.io/projected/27c6a87a-6888-4658-883f-bc4b1e2b469a-kube-api-access-4c69b\") pod \"nova-metadata-0\" (UID: \"27c6a87a-6888-4658-883f-bc4b1e2b469a\") " pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.948442 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.950594 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 18:07:08 crc kubenswrapper[5033]: I0226 18:07:08.990813 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.030773 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxvmb\" (UniqueName: \"kubernetes.io/projected/229c2ab4-2387-4058-b422-f10ad64325bf-kube-api-access-lxvmb\") pod \"229c2ab4-2387-4058-b422-f10ad64325bf\" (UID: \"229c2ab4-2387-4058-b422-f10ad64325bf\") "
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.031365 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-combined-ca-bundle\") pod \"229c2ab4-2387-4058-b422-f10ad64325bf\" (UID: \"229c2ab4-2387-4058-b422-f10ad64325bf\") "
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.031428 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-config-data\") pod \"229c2ab4-2387-4058-b422-f10ad64325bf\" (UID: \"229c2ab4-2387-4058-b422-f10ad64325bf\") "
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.041529 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/229c2ab4-2387-4058-b422-f10ad64325bf-kube-api-access-lxvmb" (OuterVolumeSpecName: "kube-api-access-lxvmb") pod "229c2ab4-2387-4058-b422-f10ad64325bf" (UID: "229c2ab4-2387-4058-b422-f10ad64325bf"). InnerVolumeSpecName "kube-api-access-lxvmb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.112861 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-config-data" (OuterVolumeSpecName: "config-data") pod "229c2ab4-2387-4058-b422-f10ad64325bf" (UID: "229c2ab4-2387-4058-b422-f10ad64325bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.121082 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "229c2ab4-2387-4058-b422-f10ad64325bf" (UID: "229c2ab4-2387-4058-b422-f10ad64325bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.136781 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.137052 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229c2ab4-2387-4058-b422-f10ad64325bf-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.137061 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxvmb\" (UniqueName: \"kubernetes.io/projected/229c2ab4-2387-4058-b422-f10ad64325bf-kube-api-access-lxvmb\") on node \"crc\" DevicePath \"\""
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.437466 5033 generic.go:334] "Generic (PLEG): container finished" podID="229c2ab4-2387-4058-b422-f10ad64325bf" containerID="27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3" exitCode=0
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.437558 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.438111 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"229c2ab4-2387-4058-b422-f10ad64325bf","Type":"ContainerDied","Data":"27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3"}
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.438163 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"229c2ab4-2387-4058-b422-f10ad64325bf","Type":"ContainerDied","Data":"987c509b3c299ff2a56c2c6bb40ecdbdca2dc1e510ebbfb138a2c8d7f6c0a382"}
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.438183 5033 scope.go:117] "RemoveContainer" containerID="27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.443680 5033 generic.go:334] "Generic (PLEG): container finished" podID="6dbd74aa-2791-4724-946a-21b175fdedd6" containerID="28cb606fda414c24a338470fdf12da2bbb4e68d9c9794d8b44e088c1b467cb96" exitCode=0
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.444864 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6dbd74aa-2791-4724-946a-21b175fdedd6","Type":"ContainerDied","Data":"28cb606fda414c24a338470fdf12da2bbb4e68d9c9794d8b44e088c1b467cb96"}
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.481300 5033 scope.go:117] "RemoveContainer" containerID="27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3"
Feb 26 18:07:09 crc kubenswrapper[5033]: E0226 18:07:09.484050 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3\": container with ID starting with 27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3 not found: ID does not exist" containerID="27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.484090 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3"} err="failed to get container status \"27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3\": rpc error: code = NotFound desc = could not find container \"27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3\": container with ID starting with 27869a2fa7bc42680f1a02baf61927b78bb0b74c6ba4e03c63591f4bfa7644d3 not found: ID does not exist"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.501496 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.534795 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 26 18:07:09 crc kubenswrapper[5033]: W0226 18:07:09.553925 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27c6a87a_6888_4658_883f_bc4b1e2b469a.slice/crio-b4b584116c69625d26a20e32c2f2bbcbce791963e339195a740bf5afb1736416 WatchSource:0}: Error finding container b4b584116c69625d26a20e32c2f2bbcbce791963e339195a740bf5afb1736416: Status 404 returned error can't find the container with id b4b584116c69625d26a20e32c2f2bbcbce791963e339195a740bf5afb1736416
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.585549 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 26 18:07:09 crc kubenswrapper[5033]: E0226 18:07:09.588518 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229c2ab4-2387-4058-b422-f10ad64325bf" containerName="nova-cell0-conductor-conductor"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.588545 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="229c2ab4-2387-4058-b422-f10ad64325bf" containerName="nova-cell0-conductor-conductor"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.589132 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="229c2ab4-2387-4058-b422-f10ad64325bf" containerName="nova-cell0-conductor-conductor"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.591915 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.595100 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.611562 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.645448 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="229c2ab4-2387-4058-b422-f10ad64325bf" path="/var/lib/kubelet/pods/229c2ab4-2387-4058-b422-f10ad64325bf/volumes"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.646263 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b24d004-b104-4214-a60d-dfc3a02c0110" path="/var/lib/kubelet/pods/8b24d004-b104-4214-a60d-dfc3a02c0110/volumes"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.647623 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbb98341-a8eb-4bed-94ed-c70daa525734" path="/var/lib/kubelet/pods/dbb98341-a8eb-4bed-94ed-c70daa525734/volumes"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.648844 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f61039-d98f-477f-abc4-685db9784bb3" path="/var/lib/kubelet/pods/f6f61039-d98f-477f-abc4-685db9784bb3/volumes"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.649841 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 18:07:09 crc kubenswrapper[5033]: W0226 18:07:09.655197 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaa57611_c06c_499a_ac11_e1c080227fc9.slice/crio-b018b03e29e1859956e2b5dbe5fe9b53c41360909d201eae9b6ef12ffb7bc307 WatchSource:0}: Error finding container b018b03e29e1859956e2b5dbe5fe9b53c41360909d201eae9b6ef12ffb7bc307: Status 404 returned error can't find the container with id b018b03e29e1859956e2b5dbe5fe9b53c41360909d201eae9b6ef12ffb7bc307
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.668939 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.676958 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9snhx\" (UniqueName: \"kubernetes.io/projected/b97279be-904e-4e29-86d1-42feab6dad6a-kube-api-access-9snhx\") pod \"nova-cell0-conductor-0\" (UID: \"b97279be-904e-4e29-86d1-42feab6dad6a\") " pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.677119 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97279be-904e-4e29-86d1-42feab6dad6a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b97279be-904e-4e29-86d1-42feab6dad6a\") " pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.677142 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97279be-904e-4e29-86d1-42feab6dad6a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b97279be-904e-4e29-86d1-42feab6dad6a\") " pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.779217 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9snhx\" (UniqueName: \"kubernetes.io/projected/b97279be-904e-4e29-86d1-42feab6dad6a-kube-api-access-9snhx\") pod \"nova-cell0-conductor-0\" (UID: \"b97279be-904e-4e29-86d1-42feab6dad6a\") " pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.779483 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97279be-904e-4e29-86d1-42feab6dad6a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b97279be-904e-4e29-86d1-42feab6dad6a\") " pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.779516 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97279be-904e-4e29-86d1-42feab6dad6a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b97279be-904e-4e29-86d1-42feab6dad6a\") " pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.783555 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97279be-904e-4e29-86d1-42feab6dad6a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b97279be-904e-4e29-86d1-42feab6dad6a\") " pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.785615 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97279be-904e-4e29-86d1-42feab6dad6a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b97279be-904e-4e29-86d1-42feab6dad6a\") " pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.802090 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9snhx\" (UniqueName: \"kubernetes.io/projected/b97279be-904e-4e29-86d1-42feab6dad6a-kube-api-access-9snhx\") pod \"nova-cell0-conductor-0\" (UID: \"b97279be-904e-4e29-86d1-42feab6dad6a\") " pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:09 crc kubenswrapper[5033]: I0226 18:07:09.943792 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.285020 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.392444 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-combined-ca-bundle\") pod \"6dbd74aa-2791-4724-946a-21b175fdedd6\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") "
Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.392779 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-config-data\") pod \"6dbd74aa-2791-4724-946a-21b175fdedd6\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") "
Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.392998 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhz5v\" (UniqueName: \"kubernetes.io/projected/6dbd74aa-2791-4724-946a-21b175fdedd6-kube-api-access-dhz5v\") pod \"6dbd74aa-2791-4724-946a-21b175fdedd6\" (UID: \"6dbd74aa-2791-4724-946a-21b175fdedd6\") "
Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.398208 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dbd74aa-2791-4724-946a-21b175fdedd6-kube-api-access-dhz5v" (OuterVolumeSpecName: "kube-api-access-dhz5v") pod "6dbd74aa-2791-4724-946a-21b175fdedd6" (UID: "6dbd74aa-2791-4724-946a-21b175fdedd6").
InnerVolumeSpecName "kube-api-access-dhz5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.428288 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dbd74aa-2791-4724-946a-21b175fdedd6" (UID: "6dbd74aa-2791-4724-946a-21b175fdedd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.435374 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-config-data" (OuterVolumeSpecName: "config-data") pod "6dbd74aa-2791-4724-946a-21b175fdedd6" (UID: "6dbd74aa-2791-4724-946a-21b175fdedd6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.473694 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"27c6a87a-6888-4658-883f-bc4b1e2b469a","Type":"ContainerStarted","Data":"811910397ea6af8cb3612454c92297029958644507d7d1fe88f4fbd575def936"} Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.473740 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"27c6a87a-6888-4658-883f-bc4b1e2b469a","Type":"ContainerStarted","Data":"4d05aeda0aa24f87ef4501faa6e0bcb1fc806a7d1df8f8c3d2b7a618e202f3f5"} Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.473752 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"27c6a87a-6888-4658-883f-bc4b1e2b469a","Type":"ContainerStarted","Data":"b4b584116c69625d26a20e32c2f2bbcbce791963e339195a740bf5afb1736416"} Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.486775 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"6dbd74aa-2791-4724-946a-21b175fdedd6","Type":"ContainerDied","Data":"741aedc47f559a1d5ef80384c91df0e3e604c64417bf1dd1c18a38c0a9d328d8"} Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.486827 5033 scope.go:117] "RemoveContainer" containerID="28cb606fda414c24a338470fdf12da2bbb4e68d9c9794d8b44e088c1b467cb96" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.486936 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.493814 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.496426 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhz5v\" (UniqueName: \"kubernetes.io/projected/6dbd74aa-2791-4724-946a-21b175fdedd6-kube-api-access-dhz5v\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.496467 5033 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.496481 5033 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbd74aa-2791-4724-946a-21b175fdedd6-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.502194 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"daa57611-c06c-499a-ac11-e1c080227fc9","Type":"ContainerStarted","Data":"d9b4496701704c6c3bf2a5fb5e17f129fd2eac754c660bd1c283ec4f8d47eb45"} Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.502275 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"daa57611-c06c-499a-ac11-e1c080227fc9","Type":"ContainerStarted","Data":"0e26bbf333933bedfb7927e42be02985893b0a0f1e89155d3930e152da82012e"} Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.502291 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"daa57611-c06c-499a-ac11-e1c080227fc9","Type":"ContainerStarted","Data":"b018b03e29e1859956e2b5dbe5fe9b53c41360909d201eae9b6ef12ffb7bc307"} Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.507924 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.507905316 podStartE2EDuration="2.507905316s" podCreationTimestamp="2026-02-26 18:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 18:07:10.501992352 +0000 UTC m=+8757.714591002" watchObservedRunningTime="2026-02-26 18:07:10.507905316 +0000 UTC m=+8757.720503966" Feb 26 18:07:10 crc kubenswrapper[5033]: W0226 18:07:10.525430 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb97279be_904e_4e29_86d1_42feab6dad6a.slice/crio-cf8653b27c632bde7e7840b3576017f433eff7f4f55842fda3f18f96f32ba1f1 WatchSource:0}: Error finding container cf8653b27c632bde7e7840b3576017f433eff7f4f55842fda3f18f96f32ba1f1: Status 404 returned error can't find the container with id cf8653b27c632bde7e7840b3576017f433eff7f4f55842fda3f18f96f32ba1f1 Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.527963 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.527940363 podStartE2EDuration="2.527940363s" podCreationTimestamp="2026-02-26 18:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 18:07:10.525702129 +0000 UTC 
m=+8757.738300779" watchObservedRunningTime="2026-02-26 18:07:10.527940363 +0000 UTC m=+8757.740539013" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.565233 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.584377 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.596248 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 18:07:10 crc kubenswrapper[5033]: E0226 18:07:10.596820 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbd74aa-2791-4724-946a-21b175fdedd6" containerName="nova-scheduler-scheduler" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.596833 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbd74aa-2791-4724-946a-21b175fdedd6" containerName="nova-scheduler-scheduler" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.597064 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dbd74aa-2791-4724-946a-21b175fdedd6" containerName="nova-scheduler-scheduler" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.598023 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.601143 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.609237 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.700207 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab\") " pod="openstack/nova-scheduler-0" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.700279 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvc62\" (UniqueName: \"kubernetes.io/projected/b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab-kube-api-access-hvc62\") pod \"nova-scheduler-0\" (UID: \"b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab\") " pod="openstack/nova-scheduler-0" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.700535 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab-config-data\") pod \"nova-scheduler-0\" (UID: \"b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab\") " pod="openstack/nova-scheduler-0" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.802222 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab\") " pod="openstack/nova-scheduler-0" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.802560 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvc62\" (UniqueName: \"kubernetes.io/projected/b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab-kube-api-access-hvc62\") pod \"nova-scheduler-0\" (UID: \"b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab\") " pod="openstack/nova-scheduler-0" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.802625 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab-config-data\") pod \"nova-scheduler-0\" (UID: \"b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab\") " pod="openstack/nova-scheduler-0" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.806007 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab\") " pod="openstack/nova-scheduler-0" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.807321 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab-config-data\") pod \"nova-scheduler-0\" (UID: \"b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab\") " pod="openstack/nova-scheduler-0" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.820271 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvc62\" (UniqueName: \"kubernetes.io/projected/b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab-kube-api-access-hvc62\") pod \"nova-scheduler-0\" (UID: \"b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab\") " pod="openstack/nova-scheduler-0" Feb 26 18:07:10 crc kubenswrapper[5033]: I0226 18:07:10.932474 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 18:07:11 crc kubenswrapper[5033]: W0226 18:07:11.422287 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2b76dc3_ff08_4cd1_8cde_aaa65f14c0ab.slice/crio-4628d209b05e3a7d95072d14715b9dada96f0aa7a77776e7ce54bd890597afd0 WatchSource:0}: Error finding container 4628d209b05e3a7d95072d14715b9dada96f0aa7a77776e7ce54bd890597afd0: Status 404 returned error can't find the container with id 4628d209b05e3a7d95072d14715b9dada96f0aa7a77776e7ce54bd890597afd0 Feb 26 18:07:11 crc kubenswrapper[5033]: I0226 18:07:11.423375 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 18:07:11 crc kubenswrapper[5033]: I0226 18:07:11.512392 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b97279be-904e-4e29-86d1-42feab6dad6a","Type":"ContainerStarted","Data":"3cdd3f154da5724bccfda489c1ae017660acf81b6af209ca41c3382aa3ece59d"} Feb 26 18:07:11 crc kubenswrapper[5033]: I0226 18:07:11.512453 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b97279be-904e-4e29-86d1-42feab6dad6a","Type":"ContainerStarted","Data":"cf8653b27c632bde7e7840b3576017f433eff7f4f55842fda3f18f96f32ba1f1"} Feb 26 18:07:11 crc kubenswrapper[5033]: I0226 18:07:11.513092 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 26 18:07:11 crc kubenswrapper[5033]: I0226 18:07:11.519393 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab","Type":"ContainerStarted","Data":"4628d209b05e3a7d95072d14715b9dada96f0aa7a77776e7ce54bd890597afd0"} Feb 26 18:07:11 crc kubenswrapper[5033]: I0226 18:07:11.535761 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.53573893 podStartE2EDuration="2.53573893s" podCreationTimestamp="2026-02-26 18:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 18:07:11.53083219 +0000 UTC m=+8758.743430870" watchObservedRunningTime="2026-02-26 18:07:11.53573893 +0000 UTC m=+8758.748337580" Feb 26 18:07:11 crc kubenswrapper[5033]: I0226 18:07:11.646298 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dbd74aa-2791-4724-946a-21b175fdedd6" path="/var/lib/kubelet/pods/6dbd74aa-2791-4724-946a-21b175fdedd6/volumes" Feb 26 18:07:12 crc kubenswrapper[5033]: I0226 18:07:12.532237 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab","Type":"ContainerStarted","Data":"8e83e299cc8317fbd88c28ccada4039c6c8eafa777a0044a77865bcdcdce7107"} Feb 26 18:07:12 crc kubenswrapper[5033]: I0226 18:07:12.550617 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.550580289 podStartE2EDuration="2.550580289s" podCreationTimestamp="2026-02-26 18:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 18:07:12.547334779 +0000 UTC m=+8759.759933449" watchObservedRunningTime="2026-02-26 18:07:12.550580289 +0000 UTC m=+8759.763178939" Feb 26 18:07:12 crc kubenswrapper[5033]: I0226 18:07:12.628851 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:07:12 crc kubenswrapper[5033]: E0226 18:07:12.629303 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:07:13 crc kubenswrapper[5033]: I0226 18:07:13.951927 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 18:07:13 crc kubenswrapper[5033]: I0226 18:07:13.952227 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 18:07:15 crc kubenswrapper[5033]: I0226 18:07:15.933148 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 18:07:16 crc kubenswrapper[5033]: I0226 18:07:16.841287 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 26 18:07:18 crc kubenswrapper[5033]: I0226 18:07:18.951657 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 18:07:18 crc kubenswrapper[5033]: I0226 18:07:18.951994 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 18:07:18 crc kubenswrapper[5033]: I0226 18:07:18.991982 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 18:07:18 crc kubenswrapper[5033]: I0226 18:07:18.992133 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 18:07:19 crc kubenswrapper[5033]: I0226 18:07:19.975918 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 26 18:07:20 crc kubenswrapper[5033]: I0226 18:07:20.035840 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="27c6a87a-6888-4658-883f-bc4b1e2b469a" containerName="nova-metadata-metadata" probeResult="failure" 
output="Get \"http://10.217.1.249:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 18:07:20 crc kubenswrapper[5033]: I0226 18:07:20.118434 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="27c6a87a-6888-4658-883f-bc4b1e2b469a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.249:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 18:07:20 crc kubenswrapper[5033]: I0226 18:07:20.118476 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="daa57611-c06c-499a-ac11-e1c080227fc9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.250:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 18:07:20 crc kubenswrapper[5033]: I0226 18:07:20.118578 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="daa57611-c06c-499a-ac11-e1c080227fc9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.250:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 18:07:20 crc kubenswrapper[5033]: I0226 18:07:20.933784 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 18:07:20 crc kubenswrapper[5033]: I0226 18:07:20.968804 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 18:07:21 crc kubenswrapper[5033]: I0226 18:07:21.698457 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 18:07:26 crc kubenswrapper[5033]: I0226 18:07:26.630087 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d" Feb 26 18:07:26 crc kubenswrapper[5033]: E0226 18:07:26.631966 5033 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:07:28 crc kubenswrapper[5033]: I0226 18:07:28.955046 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 18:07:28 crc kubenswrapper[5033]: I0226 18:07:28.956797 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 18:07:28 crc kubenswrapper[5033]: I0226 18:07:28.959384 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 18:07:29 crc kubenswrapper[5033]: I0226 18:07:29.000440 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 18:07:29 crc kubenswrapper[5033]: I0226 18:07:29.000885 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 18:07:29 crc kubenswrapper[5033]: I0226 18:07:29.002984 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 18:07:29 crc kubenswrapper[5033]: I0226 18:07:29.006869 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 18:07:29 crc kubenswrapper[5033]: I0226 18:07:29.747400 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 18:07:29 crc kubenswrapper[5033]: I0226 18:07:29.748950 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 18:07:29 crc kubenswrapper[5033]: I0226 18:07:29.752707 5033 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.225506 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"] Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.228315 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.236374 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.237906 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.238320 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.238339 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-gz5j8" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.238476 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.239879 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.239979 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.242316 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"] Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.325360 5033 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.325441 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.325474 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.325541 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4" Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.325604 5033 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.325681 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f897d\" (UniqueName: \"kubernetes.io/projected/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-kube-api-access-f897d\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.325768 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.325798 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.325897 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.325947 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.325984 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.326034 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.326059 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.427667 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.427712 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.427743 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.427774 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.427805 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.427847 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f897d\" (UniqueName: \"kubernetes.io/projected/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-kube-api-access-f897d\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.427905 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.427932 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.428004 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.428055 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.428090 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.428142 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.428172 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.429709 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.431562 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.434832 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.435858 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.436324 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.436447 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.436506 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.436771 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.437023 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.437047 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.437423 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.448906 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.466223 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f897d\" (UniqueName: \"kubernetes.io/projected/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-kube-api-access-f897d\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:31 crc kubenswrapper[5033]: I0226 18:07:31.562498 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:07:32 crc kubenswrapper[5033]: W0226 18:07:32.143488 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7dc2fb3_80be_4fc0_afe0_25f298176e0e.slice/crio-083071c8666847db409a93c709030469f9002f7db6db2afea9a7fd509645cb0e WatchSource:0}: Error finding container 083071c8666847db409a93c709030469f9002f7db6db2afea9a7fd509645cb0e: Status 404 returned error can't find the container with id 083071c8666847db409a93c709030469f9002f7db6db2afea9a7fd509645cb0e
Feb 26 18:07:32 crc kubenswrapper[5033]: I0226 18:07:32.147037 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"]
Feb 26 18:07:32 crc kubenswrapper[5033]: I0226 18:07:32.785631 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4" event={"ID":"d7dc2fb3-80be-4fc0-afe0-25f298176e0e","Type":"ContainerStarted","Data":"083071c8666847db409a93c709030469f9002f7db6db2afea9a7fd509645cb0e"}
Feb 26 18:07:33 crc kubenswrapper[5033]: I0226 18:07:33.797347 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4" event={"ID":"d7dc2fb3-80be-4fc0-afe0-25f298176e0e","Type":"ContainerStarted","Data":"a5e166099f186b730b81b180e363c0a792aad8197ec84113ba3da5848aa11f82"}
Feb 26 18:07:33 crc kubenswrapper[5033]: I0226 18:07:33.829570 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4" podStartSLOduration=2.173089247 podStartE2EDuration="2.829552957s" podCreationTimestamp="2026-02-26 18:07:31 +0000 UTC" firstStartedPulling="2026-02-26 18:07:32.147815345 +0000 UTC m=+8779.360413995" lastFinishedPulling="2026-02-26 18:07:32.804279055 +0000 UTC m=+8780.016877705" observedRunningTime="2026-02-26 18:07:33.82306437 +0000 UTC m=+8781.035663020" watchObservedRunningTime="2026-02-26 18:07:33.829552957 +0000 UTC m=+8781.042151607"
Feb 26 18:07:38 crc kubenswrapper[5033]: I0226 18:07:38.628146 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"
Feb 26 18:07:38 crc kubenswrapper[5033]: E0226 18:07:38.628793 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd"
Feb 26 18:07:53 crc kubenswrapper[5033]: I0226 18:07:53.635640 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"
Feb 26 18:07:54 crc kubenswrapper[5033]: I0226 18:07:54.015101 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"4e59aefdf6606d8a6c8fb0cbff2ade0955939bcd9d9ba59b237ee2cd1f8472b2"}
Feb 26 18:08:00 crc kubenswrapper[5033]: I0226 18:08:00.137760 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535488-8wks2"]
Feb 26 18:08:00 crc kubenswrapper[5033]: I0226 18:08:00.149671 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535488-8wks2"
Feb 26 18:08:00 crc kubenswrapper[5033]: I0226 18:08:00.166202 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535488-8wks2"]
Feb 26 18:08:00 crc kubenswrapper[5033]: I0226 18:08:00.195413 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 18:08:00 crc kubenswrapper[5033]: I0226 18:08:00.195945 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 18:08:00 crc kubenswrapper[5033]: I0226 18:08:00.195984 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 18:08:00 crc kubenswrapper[5033]: I0226 18:08:00.205441 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvppk\" (UniqueName: \"kubernetes.io/projected/dc73b1db-9563-497f-b1e9-1a6381d79e17-kube-api-access-fvppk\") pod \"auto-csr-approver-29535488-8wks2\" (UID: \"dc73b1db-9563-497f-b1e9-1a6381d79e17\") " pod="openshift-infra/auto-csr-approver-29535488-8wks2"
Feb 26 18:08:00 crc kubenswrapper[5033]: I0226 18:08:00.307583 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvppk\" (UniqueName: \"kubernetes.io/projected/dc73b1db-9563-497f-b1e9-1a6381d79e17-kube-api-access-fvppk\") pod \"auto-csr-approver-29535488-8wks2\" (UID: \"dc73b1db-9563-497f-b1e9-1a6381d79e17\") " pod="openshift-infra/auto-csr-approver-29535488-8wks2"
Feb 26 18:08:00 crc kubenswrapper[5033]: I0226 18:08:00.327052 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvppk\" (UniqueName: \"kubernetes.io/projected/dc73b1db-9563-497f-b1e9-1a6381d79e17-kube-api-access-fvppk\") pod \"auto-csr-approver-29535488-8wks2\" (UID: \"dc73b1db-9563-497f-b1e9-1a6381d79e17\") " pod="openshift-infra/auto-csr-approver-29535488-8wks2"
Feb 26 18:08:00 crc kubenswrapper[5033]: I0226 18:08:00.520804 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535488-8wks2"
Feb 26 18:08:01 crc kubenswrapper[5033]: I0226 18:08:01.005572 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535488-8wks2"]
Feb 26 18:08:01 crc kubenswrapper[5033]: I0226 18:08:01.090096 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535488-8wks2" event={"ID":"dc73b1db-9563-497f-b1e9-1a6381d79e17","Type":"ContainerStarted","Data":"ca7fc570c46071c5dcb750fcad11d0ccc63ba237c98d4738bbeb4be6a1af6dab"}
Feb 26 18:08:03 crc kubenswrapper[5033]: I0226 18:08:03.115725 5033 generic.go:334] "Generic (PLEG): container finished" podID="dc73b1db-9563-497f-b1e9-1a6381d79e17" containerID="b69f25c489f9376c576a4987a573d52239f075a91ce5bcb8ad601c0030bf0c08" exitCode=0
Feb 26 18:08:03 crc kubenswrapper[5033]: I0226 18:08:03.115836 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535488-8wks2" event={"ID":"dc73b1db-9563-497f-b1e9-1a6381d79e17","Type":"ContainerDied","Data":"b69f25c489f9376c576a4987a573d52239f075a91ce5bcb8ad601c0030bf0c08"}
Feb 26 18:08:04 crc kubenswrapper[5033]: I0226 18:08:04.547213 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535488-8wks2"
Feb 26 18:08:04 crc kubenswrapper[5033]: I0226 18:08:04.631731 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvppk\" (UniqueName: \"kubernetes.io/projected/dc73b1db-9563-497f-b1e9-1a6381d79e17-kube-api-access-fvppk\") pod \"dc73b1db-9563-497f-b1e9-1a6381d79e17\" (UID: \"dc73b1db-9563-497f-b1e9-1a6381d79e17\") "
Feb 26 18:08:04 crc kubenswrapper[5033]: I0226 18:08:04.642447 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc73b1db-9563-497f-b1e9-1a6381d79e17-kube-api-access-fvppk" (OuterVolumeSpecName: "kube-api-access-fvppk") pod "dc73b1db-9563-497f-b1e9-1a6381d79e17" (UID: "dc73b1db-9563-497f-b1e9-1a6381d79e17"). InnerVolumeSpecName "kube-api-access-fvppk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 18:08:04 crc kubenswrapper[5033]: I0226 18:08:04.735327 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvppk\" (UniqueName: \"kubernetes.io/projected/dc73b1db-9563-497f-b1e9-1a6381d79e17-kube-api-access-fvppk\") on node \"crc\" DevicePath \"\""
Feb 26 18:08:05 crc kubenswrapper[5033]: I0226 18:08:05.147769 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535488-8wks2" event={"ID":"dc73b1db-9563-497f-b1e9-1a6381d79e17","Type":"ContainerDied","Data":"ca7fc570c46071c5dcb750fcad11d0ccc63ba237c98d4738bbeb4be6a1af6dab"}
Feb 26 18:08:05 crc kubenswrapper[5033]: I0226 18:08:05.148099 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca7fc570c46071c5dcb750fcad11d0ccc63ba237c98d4738bbeb4be6a1af6dab"
Feb 26 18:08:05 crc kubenswrapper[5033]: I0226 18:08:05.148193 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535488-8wks2"
Feb 26 18:08:05 crc kubenswrapper[5033]: I0226 18:08:05.614063 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535482-xlf2d"]
Feb 26 18:08:05 crc kubenswrapper[5033]: I0226 18:08:05.626094 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535482-xlf2d"]
Feb 26 18:08:05 crc kubenswrapper[5033]: I0226 18:08:05.641952 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74182ef9-becf-4b65-9931-90e2712d8750" path="/var/lib/kubelet/pods/74182ef9-becf-4b65-9931-90e2712d8750/volumes"
Feb 26 18:08:43 crc kubenswrapper[5033]: I0226 18:08:43.883215 5033 scope.go:117] "RemoveContainer" containerID="9fb3fa763e4343b434b4254a95bd31c5d1616ae4c31490d00f9efaa1a5111d3e"
Feb 26 18:08:44 crc kubenswrapper[5033]: I0226 18:08:44.932684 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-brxmv"]
Feb 26 18:08:44 crc kubenswrapper[5033]: E0226 18:08:44.934072 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc73b1db-9563-497f-b1e9-1a6381d79e17" containerName="oc"
Feb 26 18:08:44 crc kubenswrapper[5033]: I0226 18:08:44.934094 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc73b1db-9563-497f-b1e9-1a6381d79e17" containerName="oc"
Feb 26 18:08:44 crc kubenswrapper[5033]: I0226 18:08:44.934321 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc73b1db-9563-497f-b1e9-1a6381d79e17" containerName="oc"
Feb 26 18:08:44 crc kubenswrapper[5033]: I0226 18:08:44.935829 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:44 crc kubenswrapper[5033]: I0226 18:08:44.954363 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brxmv"]
Feb 26 18:08:45 crc kubenswrapper[5033]: I0226 18:08:45.048127 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-utilities\") pod \"community-operators-brxmv\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") " pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:45 crc kubenswrapper[5033]: I0226 18:08:45.048766 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrpgv\" (UniqueName: \"kubernetes.io/projected/49727e64-819e-4584-a9a0-d13741f60e28-kube-api-access-hrpgv\") pod \"community-operators-brxmv\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") " pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:45 crc kubenswrapper[5033]: I0226 18:08:45.048816 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-catalog-content\") pod \"community-operators-brxmv\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") " pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:45 crc kubenswrapper[5033]: I0226 18:08:45.150554 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrpgv\" (UniqueName: \"kubernetes.io/projected/49727e64-819e-4584-a9a0-d13741f60e28-kube-api-access-hrpgv\") pod \"community-operators-brxmv\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") " pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:45 crc kubenswrapper[5033]: I0226 18:08:45.150607 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-catalog-content\") pod \"community-operators-brxmv\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") " pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:45 crc kubenswrapper[5033]: I0226 18:08:45.150660 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-utilities\") pod \"community-operators-brxmv\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") " pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:45 crc kubenswrapper[5033]: I0226 18:08:45.151161 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-utilities\") pod \"community-operators-brxmv\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") " pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:45 crc kubenswrapper[5033]: I0226 18:08:45.151281 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-catalog-content\") pod \"community-operators-brxmv\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") " pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:45 crc kubenswrapper[5033]: I0226 18:08:45.171465 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrpgv\" (UniqueName: \"kubernetes.io/projected/49727e64-819e-4584-a9a0-d13741f60e28-kube-api-access-hrpgv\") pod \"community-operators-brxmv\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") " pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:45 crc kubenswrapper[5033]: I0226 18:08:45.258597 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:45 crc kubenswrapper[5033]: I0226 18:08:45.812541 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brxmv"]
Feb 26 18:08:46 crc kubenswrapper[5033]: I0226 18:08:46.632802 5033 generic.go:334] "Generic (PLEG): container finished" podID="49727e64-819e-4584-a9a0-d13741f60e28" containerID="ce54022c41281fcaf6ae8aa1fb88948906a14eaa9ce32f9b6ef4d2f22adf32bc" exitCode=0
Feb 26 18:08:46 crc kubenswrapper[5033]: I0226 18:08:46.632881 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brxmv" event={"ID":"49727e64-819e-4584-a9a0-d13741f60e28","Type":"ContainerDied","Data":"ce54022c41281fcaf6ae8aa1fb88948906a14eaa9ce32f9b6ef4d2f22adf32bc"}
Feb 26 18:08:46 crc kubenswrapper[5033]: I0226 18:08:46.634380 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brxmv" event={"ID":"49727e64-819e-4584-a9a0-d13741f60e28","Type":"ContainerStarted","Data":"aad8ec9e0ec9c35cf27073fb040ebcb1fc71846bdb8b6c032d6c4e0f8325c49c"}
Feb 26 18:08:48 crc kubenswrapper[5033]: I0226 18:08:48.654672 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brxmv" event={"ID":"49727e64-819e-4584-a9a0-d13741f60e28","Type":"ContainerStarted","Data":"ed4c85c1e6d1cd78e76a2f2535dafafaaa5b70ea232b7e982d25b77ad18332f7"}
Feb 26 18:08:51 crc kubenswrapper[5033]: I0226 18:08:51.698923 5033 generic.go:334] "Generic (PLEG): container finished" podID="49727e64-819e-4584-a9a0-d13741f60e28" containerID="ed4c85c1e6d1cd78e76a2f2535dafafaaa5b70ea232b7e982d25b77ad18332f7" exitCode=0
Feb 26 18:08:51 crc kubenswrapper[5033]: I0226 18:08:51.699058 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brxmv" event={"ID":"49727e64-819e-4584-a9a0-d13741f60e28","Type":"ContainerDied","Data":"ed4c85c1e6d1cd78e76a2f2535dafafaaa5b70ea232b7e982d25b77ad18332f7"}
Feb 26 18:08:52 crc kubenswrapper[5033]: I0226 18:08:52.714759 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brxmv" event={"ID":"49727e64-819e-4584-a9a0-d13741f60e28","Type":"ContainerStarted","Data":"f239dffdf8ceaee692d300a6a2c49e0fd933240b0e402f7644c245a8d5fe2d34"}
Feb 26 18:08:52 crc kubenswrapper[5033]: I0226 18:08:52.744744 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-brxmv" podStartSLOduration=3.148386051 podStartE2EDuration="8.74472527s" podCreationTimestamp="2026-02-26 18:08:44 +0000 UTC" firstStartedPulling="2026-02-26 18:08:46.634489214 +0000 UTC m=+8853.847087874" lastFinishedPulling="2026-02-26 18:08:52.230828433 +0000 UTC m=+8859.443427093" observedRunningTime="2026-02-26 18:08:52.734700696 +0000 UTC m=+8859.947299376" watchObservedRunningTime="2026-02-26 18:08:52.74472527 +0000 UTC m=+8859.957323920"
Feb 26 18:08:55 crc kubenswrapper[5033]: I0226 18:08:55.259792 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:55 crc kubenswrapper[5033]: I0226 18:08:55.260673 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:08:55 crc kubenswrapper[5033]: I0226 18:08:55.326787 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:09:05 crc kubenswrapper[5033]: I0226 18:09:05.309298 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:09:05 crc kubenswrapper[5033]: I0226 18:09:05.368271 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brxmv"]
Feb 26 18:09:05 crc kubenswrapper[5033]: I0226 18:09:05.859594 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-brxmv" podUID="49727e64-819e-4584-a9a0-d13741f60e28" containerName="registry-server" containerID="cri-o://f239dffdf8ceaee692d300a6a2c49e0fd933240b0e402f7644c245a8d5fe2d34" gracePeriod=2
Feb 26 18:09:06 crc kubenswrapper[5033]: I0226 18:09:06.870878 5033 generic.go:334] "Generic (PLEG): container finished" podID="49727e64-819e-4584-a9a0-d13741f60e28" containerID="f239dffdf8ceaee692d300a6a2c49e0fd933240b0e402f7644c245a8d5fe2d34" exitCode=0
Feb 26 18:09:06 crc kubenswrapper[5033]: I0226 18:09:06.871005 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brxmv" event={"ID":"49727e64-819e-4584-a9a0-d13741f60e28","Type":"ContainerDied","Data":"f239dffdf8ceaee692d300a6a2c49e0fd933240b0e402f7644c245a8d5fe2d34"}
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.222007 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brxmv"
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.345766 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-utilities\") pod \"49727e64-819e-4584-a9a0-d13741f60e28\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") "
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.345972 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrpgv\" (UniqueName: \"kubernetes.io/projected/49727e64-819e-4584-a9a0-d13741f60e28-kube-api-access-hrpgv\") pod \"49727e64-819e-4584-a9a0-d13741f60e28\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") "
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.346036 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-catalog-content\") pod \"49727e64-819e-4584-a9a0-d13741f60e28\" (UID: \"49727e64-819e-4584-a9a0-d13741f60e28\") "
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.346695 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-utilities" (OuterVolumeSpecName: "utilities") pod "49727e64-819e-4584-a9a0-d13741f60e28" (UID: "49727e64-819e-4584-a9a0-d13741f60e28"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.351806 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49727e64-819e-4584-a9a0-d13741f60e28-kube-api-access-hrpgv" (OuterVolumeSpecName: "kube-api-access-hrpgv") pod "49727e64-819e-4584-a9a0-d13741f60e28" (UID: "49727e64-819e-4584-a9a0-d13741f60e28"). InnerVolumeSpecName "kube-api-access-hrpgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.402136 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49727e64-819e-4584-a9a0-d13741f60e28" (UID: "49727e64-819e-4584-a9a0-d13741f60e28"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.448327 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.448358 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrpgv\" (UniqueName: \"kubernetes.io/projected/49727e64-819e-4584-a9a0-d13741f60e28-kube-api-access-hrpgv\") on node \"crc\" DevicePath \"\""
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.448368 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49727e64-819e-4584-a9a0-d13741f60e28-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.887202 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brxmv" event={"ID":"49727e64-819e-4584-a9a0-d13741f60e28","Type":"ContainerDied","Data":"aad8ec9e0ec9c35cf27073fb040ebcb1fc71846bdb8b6c032d6c4e0f8325c49c"}
Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.887279 5033 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-brxmv" Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.888572 5033 scope.go:117] "RemoveContainer" containerID="f239dffdf8ceaee692d300a6a2c49e0fd933240b0e402f7644c245a8d5fe2d34" Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.920338 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brxmv"] Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.931239 5033 scope.go:117] "RemoveContainer" containerID="ed4c85c1e6d1cd78e76a2f2535dafafaaa5b70ea232b7e982d25b77ad18332f7" Feb 26 18:09:07 crc kubenswrapper[5033]: I0226 18:09:07.933986 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-brxmv"] Feb 26 18:09:08 crc kubenswrapper[5033]: I0226 18:09:08.287714 5033 scope.go:117] "RemoveContainer" containerID="ce54022c41281fcaf6ae8aa1fb88948906a14eaa9ce32f9b6ef4d2f22adf32bc" Feb 26 18:09:09 crc kubenswrapper[5033]: I0226 18:09:09.638843 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49727e64-819e-4584-a9a0-d13741f60e28" path="/var/lib/kubelet/pods/49727e64-819e-4584-a9a0-d13741f60e28/volumes" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.143064 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535490-cg7q5"] Feb 26 18:10:00 crc kubenswrapper[5033]: E0226 18:10:00.144563 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49727e64-819e-4584-a9a0-d13741f60e28" containerName="extract-content" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.144579 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="49727e64-819e-4584-a9a0-d13741f60e28" containerName="extract-content" Feb 26 18:10:00 crc kubenswrapper[5033]: E0226 18:10:00.144595 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49727e64-819e-4584-a9a0-d13741f60e28" containerName="extract-utilities" Feb 26 
18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.144602 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="49727e64-819e-4584-a9a0-d13741f60e28" containerName="extract-utilities" Feb 26 18:10:00 crc kubenswrapper[5033]: E0226 18:10:00.144631 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49727e64-819e-4584-a9a0-d13741f60e28" containerName="registry-server" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.144638 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="49727e64-819e-4584-a9a0-d13741f60e28" containerName="registry-server" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.144880 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="49727e64-819e-4584-a9a0-d13741f60e28" containerName="registry-server" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.146002 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535490-cg7q5" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.148933 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.149117 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.150218 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.160556 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535490-cg7q5"] Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.187775 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmttj\" (UniqueName: \"kubernetes.io/projected/93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de-kube-api-access-fmttj\") pod 
\"auto-csr-approver-29535490-cg7q5\" (UID: \"93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de\") " pod="openshift-infra/auto-csr-approver-29535490-cg7q5" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.289265 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmttj\" (UniqueName: \"kubernetes.io/projected/93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de-kube-api-access-fmttj\") pod \"auto-csr-approver-29535490-cg7q5\" (UID: \"93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de\") " pod="openshift-infra/auto-csr-approver-29535490-cg7q5" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.308949 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmttj\" (UniqueName: \"kubernetes.io/projected/93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de-kube-api-access-fmttj\") pod \"auto-csr-approver-29535490-cg7q5\" (UID: \"93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de\") " pod="openshift-infra/auto-csr-approver-29535490-cg7q5" Feb 26 18:10:00 crc kubenswrapper[5033]: I0226 18:10:00.469476 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535490-cg7q5" Feb 26 18:10:01 crc kubenswrapper[5033]: I0226 18:10:01.022063 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535490-cg7q5"] Feb 26 18:10:01 crc kubenswrapper[5033]: I0226 18:10:01.435133 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535490-cg7q5" event={"ID":"93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de","Type":"ContainerStarted","Data":"89283092fe2eeb7659f8a4b41db734b56a781aa560edef3e91d17589e71329e1"} Feb 26 18:10:03 crc kubenswrapper[5033]: I0226 18:10:03.458587 5033 generic.go:334] "Generic (PLEG): container finished" podID="93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de" containerID="f08303ed23a1bef0ffff0f2607f822a4e96a9fe068a1454b9c1b13add19070ba" exitCode=0 Feb 26 18:10:03 crc kubenswrapper[5033]: I0226 18:10:03.458641 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535490-cg7q5" event={"ID":"93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de","Type":"ContainerDied","Data":"f08303ed23a1bef0ffff0f2607f822a4e96a9fe068a1454b9c1b13add19070ba"} Feb 26 18:10:04 crc kubenswrapper[5033]: I0226 18:10:04.858238 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535490-cg7q5" Feb 26 18:10:04 crc kubenswrapper[5033]: I0226 18:10:04.935119 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmttj\" (UniqueName: \"kubernetes.io/projected/93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de-kube-api-access-fmttj\") pod \"93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de\" (UID: \"93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de\") " Feb 26 18:10:04 crc kubenswrapper[5033]: I0226 18:10:04.941667 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de-kube-api-access-fmttj" (OuterVolumeSpecName: "kube-api-access-fmttj") pod "93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de" (UID: "93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de"). InnerVolumeSpecName "kube-api-access-fmttj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:10:05 crc kubenswrapper[5033]: I0226 18:10:05.038298 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmttj\" (UniqueName: \"kubernetes.io/projected/93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de-kube-api-access-fmttj\") on node \"crc\" DevicePath \"\"" Feb 26 18:10:05 crc kubenswrapper[5033]: I0226 18:10:05.482584 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535490-cg7q5" event={"ID":"93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de","Type":"ContainerDied","Data":"89283092fe2eeb7659f8a4b41db734b56a781aa560edef3e91d17589e71329e1"} Feb 26 18:10:05 crc kubenswrapper[5033]: I0226 18:10:05.482627 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535490-cg7q5" Feb 26 18:10:05 crc kubenswrapper[5033]: I0226 18:10:05.482637 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89283092fe2eeb7659f8a4b41db734b56a781aa560edef3e91d17589e71329e1" Feb 26 18:10:05 crc kubenswrapper[5033]: I0226 18:10:05.925813 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535484-xbpz9"] Feb 26 18:10:05 crc kubenswrapper[5033]: I0226 18:10:05.937881 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535484-xbpz9"] Feb 26 18:10:07 crc kubenswrapper[5033]: I0226 18:10:07.640469 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="189975a7-197f-47c4-8cab-6d320c957d27" path="/var/lib/kubelet/pods/189975a7-197f-47c4-8cab-6d320c957d27/volumes" Feb 26 18:10:20 crc kubenswrapper[5033]: I0226 18:10:20.189570 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:10:20 crc kubenswrapper[5033]: I0226 18:10:20.190123 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:10:33 crc kubenswrapper[5033]: I0226 18:10:33.988044 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mxtg4"] Feb 26 18:10:33 crc kubenswrapper[5033]: E0226 18:10:33.989178 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de" 
containerName="oc" Feb 26 18:10:33 crc kubenswrapper[5033]: I0226 18:10:33.989194 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de" containerName="oc" Feb 26 18:10:33 crc kubenswrapper[5033]: I0226 18:10:33.989519 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de" containerName="oc" Feb 26 18:10:33 crc kubenswrapper[5033]: I0226 18:10:33.991752 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:33 crc kubenswrapper[5033]: I0226 18:10:33.999869 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mxtg4"] Feb 26 18:10:34 crc kubenswrapper[5033]: I0226 18:10:34.011308 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-catalog-content\") pod \"certified-operators-mxtg4\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:34 crc kubenswrapper[5033]: I0226 18:10:34.011408 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlvks\" (UniqueName: \"kubernetes.io/projected/e2ab13ff-9724-4bff-99f1-e943597b9bd6-kube-api-access-nlvks\") pod \"certified-operators-mxtg4\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:34 crc kubenswrapper[5033]: I0226 18:10:34.011583 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-utilities\") pod \"certified-operators-mxtg4\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " 
pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:34 crc kubenswrapper[5033]: I0226 18:10:34.113348 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-catalog-content\") pod \"certified-operators-mxtg4\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:34 crc kubenswrapper[5033]: I0226 18:10:34.113400 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlvks\" (UniqueName: \"kubernetes.io/projected/e2ab13ff-9724-4bff-99f1-e943597b9bd6-kube-api-access-nlvks\") pod \"certified-operators-mxtg4\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:34 crc kubenswrapper[5033]: I0226 18:10:34.113544 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-utilities\") pod \"certified-operators-mxtg4\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:34 crc kubenswrapper[5033]: I0226 18:10:34.114366 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-utilities\") pod \"certified-operators-mxtg4\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:34 crc kubenswrapper[5033]: I0226 18:10:34.114772 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-catalog-content\") pod \"certified-operators-mxtg4\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " 
pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:34 crc kubenswrapper[5033]: I0226 18:10:34.134178 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlvks\" (UniqueName: \"kubernetes.io/projected/e2ab13ff-9724-4bff-99f1-e943597b9bd6-kube-api-access-nlvks\") pod \"certified-operators-mxtg4\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:34 crc kubenswrapper[5033]: I0226 18:10:34.316610 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:34 crc kubenswrapper[5033]: I0226 18:10:34.887236 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mxtg4"] Feb 26 18:10:35 crc kubenswrapper[5033]: I0226 18:10:35.796961 5033 generic.go:334] "Generic (PLEG): container finished" podID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" containerID="61723cbd7bfe2a9da3a84f980397aae1344d61fec4001848948154afe69f9166" exitCode=0 Feb 26 18:10:35 crc kubenswrapper[5033]: I0226 18:10:35.797030 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxtg4" event={"ID":"e2ab13ff-9724-4bff-99f1-e943597b9bd6","Type":"ContainerDied","Data":"61723cbd7bfe2a9da3a84f980397aae1344d61fec4001848948154afe69f9166"} Feb 26 18:10:35 crc kubenswrapper[5033]: I0226 18:10:35.797305 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxtg4" event={"ID":"e2ab13ff-9724-4bff-99f1-e943597b9bd6","Type":"ContainerStarted","Data":"3e2c6a4299cef6d140846c050df306e366a3b5c4ac13b33962a13efc9ab75e0f"} Feb 26 18:10:36 crc kubenswrapper[5033]: I0226 18:10:36.809095 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxtg4" 
event={"ID":"e2ab13ff-9724-4bff-99f1-e943597b9bd6","Type":"ContainerStarted","Data":"3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca"} Feb 26 18:10:38 crc kubenswrapper[5033]: I0226 18:10:38.833238 5033 generic.go:334] "Generic (PLEG): container finished" podID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" containerID="3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca" exitCode=0 Feb 26 18:10:38 crc kubenswrapper[5033]: I0226 18:10:38.833290 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxtg4" event={"ID":"e2ab13ff-9724-4bff-99f1-e943597b9bd6","Type":"ContainerDied","Data":"3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca"} Feb 26 18:10:39 crc kubenswrapper[5033]: I0226 18:10:39.872040 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxtg4" event={"ID":"e2ab13ff-9724-4bff-99f1-e943597b9bd6","Type":"ContainerStarted","Data":"da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915"} Feb 26 18:10:39 crc kubenswrapper[5033]: I0226 18:10:39.896213 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mxtg4" podStartSLOduration=3.386937799 podStartE2EDuration="6.896195994s" podCreationTimestamp="2026-02-26 18:10:33 +0000 UTC" firstStartedPulling="2026-02-26 18:10:35.800053165 +0000 UTC m=+8963.012651815" lastFinishedPulling="2026-02-26 18:10:39.30931133 +0000 UTC m=+8966.521910010" observedRunningTime="2026-02-26 18:10:39.891670834 +0000 UTC m=+8967.104269484" watchObservedRunningTime="2026-02-26 18:10:39.896195994 +0000 UTC m=+8967.108794644" Feb 26 18:10:44 crc kubenswrapper[5033]: I0226 18:10:43.999667 5033 scope.go:117] "RemoveContainer" containerID="e632efbbdc67d8f7f239b28828370012b78308bc07f58c2e3672fddccf8498ae" Feb 26 18:10:44 crc kubenswrapper[5033]: I0226 18:10:44.316940 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:44 crc kubenswrapper[5033]: I0226 18:10:44.318885 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:44 crc kubenswrapper[5033]: I0226 18:10:44.364815 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:44 crc kubenswrapper[5033]: I0226 18:10:44.988146 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:45 crc kubenswrapper[5033]: I0226 18:10:45.061599 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mxtg4"] Feb 26 18:10:46 crc kubenswrapper[5033]: I0226 18:10:46.941202 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mxtg4" podUID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" containerName="registry-server" containerID="cri-o://da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915" gracePeriod=2 Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.525149 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.704449 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-utilities\") pod \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.704536 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlvks\" (UniqueName: \"kubernetes.io/projected/e2ab13ff-9724-4bff-99f1-e943597b9bd6-kube-api-access-nlvks\") pod \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.704769 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-catalog-content\") pod \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\" (UID: \"e2ab13ff-9724-4bff-99f1-e943597b9bd6\") " Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.706329 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-utilities" (OuterVolumeSpecName: "utilities") pod "e2ab13ff-9724-4bff-99f1-e943597b9bd6" (UID: "e2ab13ff-9724-4bff-99f1-e943597b9bd6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.713874 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ab13ff-9724-4bff-99f1-e943597b9bd6-kube-api-access-nlvks" (OuterVolumeSpecName: "kube-api-access-nlvks") pod "e2ab13ff-9724-4bff-99f1-e943597b9bd6" (UID: "e2ab13ff-9724-4bff-99f1-e943597b9bd6"). InnerVolumeSpecName "kube-api-access-nlvks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.764792 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2ab13ff-9724-4bff-99f1-e943597b9bd6" (UID: "e2ab13ff-9724-4bff-99f1-e943597b9bd6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.812329 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.812369 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlvks\" (UniqueName: \"kubernetes.io/projected/e2ab13ff-9724-4bff-99f1-e943597b9bd6-kube-api-access-nlvks\") on node \"crc\" DevicePath \"\"" Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.812380 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2ab13ff-9724-4bff-99f1-e943597b9bd6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.951601 5033 generic.go:334] "Generic (PLEG): container finished" podID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" containerID="da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915" exitCode=0 Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.951643 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxtg4" event={"ID":"e2ab13ff-9724-4bff-99f1-e943597b9bd6","Type":"ContainerDied","Data":"da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915"} Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.951669 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-mxtg4" event={"ID":"e2ab13ff-9724-4bff-99f1-e943597b9bd6","Type":"ContainerDied","Data":"3e2c6a4299cef6d140846c050df306e366a3b5c4ac13b33962a13efc9ab75e0f"} Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.951686 5033 scope.go:117] "RemoveContainer" containerID="da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915" Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.951818 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxtg4" Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.981530 5033 scope.go:117] "RemoveContainer" containerID="3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca" Feb 26 18:10:47 crc kubenswrapper[5033]: I0226 18:10:47.996032 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mxtg4"] Feb 26 18:10:48 crc kubenswrapper[5033]: I0226 18:10:48.003720 5033 scope.go:117] "RemoveContainer" containerID="61723cbd7bfe2a9da3a84f980397aae1344d61fec4001848948154afe69f9166" Feb 26 18:10:48 crc kubenswrapper[5033]: I0226 18:10:48.007075 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mxtg4"] Feb 26 18:10:48 crc kubenswrapper[5033]: I0226 18:10:48.052596 5033 scope.go:117] "RemoveContainer" containerID="da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915" Feb 26 18:10:48 crc kubenswrapper[5033]: E0226 18:10:48.053002 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915\": container with ID starting with da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915 not found: ID does not exist" containerID="da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915" Feb 26 18:10:48 crc kubenswrapper[5033]: I0226 
18:10:48.053043 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915"} err="failed to get container status \"da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915\": rpc error: code = NotFound desc = could not find container \"da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915\": container with ID starting with da7e63443b54db2e36c53fd4e354a453018b7062fb4b1f9ed6bc13ef25951915 not found: ID does not exist"
Feb 26 18:10:48 crc kubenswrapper[5033]: I0226 18:10:48.053064 5033 scope.go:117] "RemoveContainer" containerID="3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca"
Feb 26 18:10:48 crc kubenswrapper[5033]: E0226 18:10:48.053345 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca\": container with ID starting with 3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca not found: ID does not exist" containerID="3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca"
Feb 26 18:10:48 crc kubenswrapper[5033]: I0226 18:10:48.053373 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca"} err="failed to get container status \"3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca\": rpc error: code = NotFound desc = could not find container \"3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca\": container with ID starting with 3d968cfdfa219e028c0e79349ef5521017947279c9c947a0d254f23961fb50ca not found: ID does not exist"
Feb 26 18:10:48 crc kubenswrapper[5033]: I0226 18:10:48.053391 5033 scope.go:117] "RemoveContainer" containerID="61723cbd7bfe2a9da3a84f980397aae1344d61fec4001848948154afe69f9166"
Feb 26 18:10:48 crc kubenswrapper[5033]: E0226 18:10:48.053648 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61723cbd7bfe2a9da3a84f980397aae1344d61fec4001848948154afe69f9166\": container with ID starting with 61723cbd7bfe2a9da3a84f980397aae1344d61fec4001848948154afe69f9166 not found: ID does not exist" containerID="61723cbd7bfe2a9da3a84f980397aae1344d61fec4001848948154afe69f9166"
Feb 26 18:10:48 crc kubenswrapper[5033]: I0226 18:10:48.053679 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61723cbd7bfe2a9da3a84f980397aae1344d61fec4001848948154afe69f9166"} err="failed to get container status \"61723cbd7bfe2a9da3a84f980397aae1344d61fec4001848948154afe69f9166\": rpc error: code = NotFound desc = could not find container \"61723cbd7bfe2a9da3a84f980397aae1344d61fec4001848948154afe69f9166\": container with ID starting with 61723cbd7bfe2a9da3a84f980397aae1344d61fec4001848948154afe69f9166 not found: ID does not exist"
Feb 26 18:10:49 crc kubenswrapper[5033]: I0226 18:10:49.641605 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" path="/var/lib/kubelet/pods/e2ab13ff-9724-4bff-99f1-e943597b9bd6/volumes"
Feb 26 18:10:50 crc kubenswrapper[5033]: I0226 18:10:50.189243 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 18:10:50 crc kubenswrapper[5033]: I0226 18:10:50.189319 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 18:11:13 crc kubenswrapper[5033]: I0226 18:11:13.222444 5033 generic.go:334] "Generic (PLEG): container finished" podID="d7dc2fb3-80be-4fc0-afe0-25f298176e0e" containerID="a5e166099f186b730b81b180e363c0a792aad8197ec84113ba3da5848aa11f82" exitCode=0
Feb 26 18:11:13 crc kubenswrapper[5033]: I0226 18:11:13.222503 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4" event={"ID":"d7dc2fb3-80be-4fc0-afe0-25f298176e0e","Type":"ContainerDied","Data":"a5e166099f186b730b81b180e363c0a792aad8197ec84113ba3da5848aa11f82"}
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.685450 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.789132 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ceph\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.789218 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-inventory\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.789319 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-0\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.789382 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-1\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.789416 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-1\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.789442 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f897d\" (UniqueName: \"kubernetes.io/projected/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-kube-api-access-f897d\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.790087 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-0\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.790166 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-3\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.790236 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ssh-key-openstack-cell1\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.790291 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-0\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.790346 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-1\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.790424 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-combined-ca-bundle\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.790473 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-2\") pod \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\" (UID: \"d7dc2fb3-80be-4fc0-afe0-25f298176e0e\") "
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.796221 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ceph" (OuterVolumeSpecName: "ceph") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.796519 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-kube-api-access-f897d" (OuterVolumeSpecName: "kube-api-access-f897d") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "kube-api-access-f897d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.797093 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.821047 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.824504 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.825375 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.827085 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.827889 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.831089 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-inventory" (OuterVolumeSpecName: "inventory") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.834188 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.834212 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.837508 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.840068 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d7dc2fb3-80be-4fc0-afe0-25f298176e0e" (UID: "d7dc2fb3-80be-4fc0-afe0-25f298176e0e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894419 5033 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894443 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894453 5033 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894461 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894470 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894482 5033 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-ceph\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894490 5033 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894499 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894507 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894515 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894522 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f897d\" (UniqueName: \"kubernetes.io/projected/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-kube-api-access-f897d\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894531 5033 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:14 crc kubenswrapper[5033]: I0226 18:11:14.894540 5033 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7dc2fb3-80be-4fc0-afe0-25f298176e0e-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Feb 26 18:11:15 crc kubenswrapper[5033]: I0226 18:11:15.244330 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4" event={"ID":"d7dc2fb3-80be-4fc0-afe0-25f298176e0e","Type":"ContainerDied","Data":"083071c8666847db409a93c709030469f9002f7db6db2afea9a7fd509645cb0e"}
Feb 26 18:11:15 crc kubenswrapper[5033]: I0226 18:11:15.244942 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="083071c8666847db409a93c709030469f9002f7db6db2afea9a7fd509645cb0e"
Feb 26 18:11:15 crc kubenswrapper[5033]: I0226 18:11:15.245107 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4"
Feb 26 18:11:20 crc kubenswrapper[5033]: I0226 18:11:20.188406 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 18:11:20 crc kubenswrapper[5033]: I0226 18:11:20.188712 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 18:11:20 crc kubenswrapper[5033]: I0226 18:11:20.188752 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5"
Feb 26 18:11:20 crc kubenswrapper[5033]: I0226 18:11:20.189580 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e59aefdf6606d8a6c8fb0cbff2ade0955939bcd9d9ba59b237ee2cd1f8472b2"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 18:11:20 crc kubenswrapper[5033]: I0226 18:11:20.189637 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://4e59aefdf6606d8a6c8fb0cbff2ade0955939bcd9d9ba59b237ee2cd1f8472b2" gracePeriod=600
Feb 26 18:11:21 crc kubenswrapper[5033]: I0226 18:11:21.306627 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="4e59aefdf6606d8a6c8fb0cbff2ade0955939bcd9d9ba59b237ee2cd1f8472b2" exitCode=0
Feb 26 18:11:21 crc kubenswrapper[5033]: I0226 18:11:21.306703 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"4e59aefdf6606d8a6c8fb0cbff2ade0955939bcd9d9ba59b237ee2cd1f8472b2"}
Feb 26 18:11:21 crc kubenswrapper[5033]: I0226 18:11:21.307110 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23"}
Feb 26 18:11:21 crc kubenswrapper[5033]: I0226 18:11:21.307131 5033 scope.go:117] "RemoveContainer" containerID="f7143eb8c45aac4de89650be73bb0d86492707cad609ef1ce79ac3319fe4391d"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.177537 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535492-dfp9s"]
Feb 26 18:12:00 crc kubenswrapper[5033]: E0226 18:12:00.178910 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" containerName="registry-server"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.178929 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" containerName="registry-server"
Feb 26 18:12:00 crc kubenswrapper[5033]: E0226 18:12:00.178960 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7dc2fb3-80be-4fc0-afe0-25f298176e0e" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.178970 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7dc2fb3-80be-4fc0-afe0-25f298176e0e" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Feb 26 18:12:00 crc kubenswrapper[5033]: E0226 18:12:00.178990 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" containerName="extract-content"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.178999 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" containerName="extract-content"
Feb 26 18:12:00 crc kubenswrapper[5033]: E0226 18:12:00.179021 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" containerName="extract-utilities"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.179029 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" containerName="extract-utilities"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.179310 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ab13ff-9724-4bff-99f1-e943597b9bd6" containerName="registry-server"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.179346 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7dc2fb3-80be-4fc0-afe0-25f298176e0e" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.180282 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535492-dfp9s"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.182829 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.183061 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.183213 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.188921 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535492-dfp9s"]
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.277588 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jpz\" (UniqueName: \"kubernetes.io/projected/896e93d2-2565-4c00-8157-31d3ff6767e1-kube-api-access-q6jpz\") pod \"auto-csr-approver-29535492-dfp9s\" (UID: \"896e93d2-2565-4c00-8157-31d3ff6767e1\") " pod="openshift-infra/auto-csr-approver-29535492-dfp9s"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.380081 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6jpz\" (UniqueName: \"kubernetes.io/projected/896e93d2-2565-4c00-8157-31d3ff6767e1-kube-api-access-q6jpz\") pod \"auto-csr-approver-29535492-dfp9s\" (UID: \"896e93d2-2565-4c00-8157-31d3ff6767e1\") " pod="openshift-infra/auto-csr-approver-29535492-dfp9s"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.401890 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6jpz\" (UniqueName: \"kubernetes.io/projected/896e93d2-2565-4c00-8157-31d3ff6767e1-kube-api-access-q6jpz\") pod \"auto-csr-approver-29535492-dfp9s\" (UID: \"896e93d2-2565-4c00-8157-31d3ff6767e1\") " pod="openshift-infra/auto-csr-approver-29535492-dfp9s"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.509656 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535492-dfp9s"
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.987321 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535492-dfp9s"]
Feb 26 18:12:00 crc kubenswrapper[5033]: I0226 18:12:00.998980 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 18:12:01 crc kubenswrapper[5033]: I0226 18:12:01.699621 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535492-dfp9s" event={"ID":"896e93d2-2565-4c00-8157-31d3ff6767e1","Type":"ContainerStarted","Data":"dec86445580dd028106caafe775d2243934764e3d2f426a6f9b69c5b46cc114f"}
Feb 26 18:12:02 crc kubenswrapper[5033]: I0226 18:12:02.709106 5033 generic.go:334] "Generic (PLEG): container finished" podID="896e93d2-2565-4c00-8157-31d3ff6767e1" containerID="8a5291226b6f9dc16e881b91ecc5f8d88330f982bc3943ca44012a7d0b91dd8b" exitCode=0
Feb 26 18:12:02 crc kubenswrapper[5033]: I0226 18:12:02.709148 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535492-dfp9s" event={"ID":"896e93d2-2565-4c00-8157-31d3ff6767e1","Type":"ContainerDied","Data":"8a5291226b6f9dc16e881b91ecc5f8d88330f982bc3943ca44012a7d0b91dd8b"}
Feb 26 18:12:04 crc kubenswrapper[5033]: I0226 18:12:04.117618 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535492-dfp9s"
Feb 26 18:12:04 crc kubenswrapper[5033]: I0226 18:12:04.176954 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6jpz\" (UniqueName: \"kubernetes.io/projected/896e93d2-2565-4c00-8157-31d3ff6767e1-kube-api-access-q6jpz\") pod \"896e93d2-2565-4c00-8157-31d3ff6767e1\" (UID: \"896e93d2-2565-4c00-8157-31d3ff6767e1\") "
Feb 26 18:12:04 crc kubenswrapper[5033]: I0226 18:12:04.183785 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/896e93d2-2565-4c00-8157-31d3ff6767e1-kube-api-access-q6jpz" (OuterVolumeSpecName: "kube-api-access-q6jpz") pod "896e93d2-2565-4c00-8157-31d3ff6767e1" (UID: "896e93d2-2565-4c00-8157-31d3ff6767e1"). InnerVolumeSpecName "kube-api-access-q6jpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 18:12:04 crc kubenswrapper[5033]: I0226 18:12:04.280174 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6jpz\" (UniqueName: \"kubernetes.io/projected/896e93d2-2565-4c00-8157-31d3ff6767e1-kube-api-access-q6jpz\") on node \"crc\" DevicePath \"\""
Feb 26 18:12:04 crc kubenswrapper[5033]: I0226 18:12:04.729397 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535492-dfp9s" event={"ID":"896e93d2-2565-4c00-8157-31d3ff6767e1","Type":"ContainerDied","Data":"dec86445580dd028106caafe775d2243934764e3d2f426a6f9b69c5b46cc114f"}
Feb 26 18:12:04 crc kubenswrapper[5033]: I0226 18:12:04.729442 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dec86445580dd028106caafe775d2243934764e3d2f426a6f9b69c5b46cc114f"
Feb 26 18:12:04 crc kubenswrapper[5033]: I0226 18:12:04.729481 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535492-dfp9s"
Feb 26 18:12:05 crc kubenswrapper[5033]: I0226 18:12:05.188138 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535486-6npdb"]
Feb 26 18:12:05 crc kubenswrapper[5033]: I0226 18:12:05.199230 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535486-6npdb"]
Feb 26 18:12:05 crc kubenswrapper[5033]: I0226 18:12:05.641978 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb559c1b-f454-48a4-9afa-1b52c3a221b2" path="/var/lib/kubelet/pods/eb559c1b-f454-48a4-9afa-1b52c3a221b2/volumes"
Feb 26 18:12:44 crc kubenswrapper[5033]: I0226 18:12:44.128451 5033 scope.go:117] "RemoveContainer" containerID="3eead5fbf24f5f600fc14ce245509c7ebb8364b66a6b3f1d533c274e5b84dbd6"
Feb 26 18:13:20 crc kubenswrapper[5033]: I0226 18:13:20.189101 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 18:13:20 crc kubenswrapper[5033]: I0226 18:13:20.189856 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 18:13:50 crc kubenswrapper[5033]: I0226 18:13:50.189403 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 18:13:50 crc kubenswrapper[5033]: I0226 18:13:50.189985 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.143858 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535494-kxw86"]
Feb 26 18:14:00 crc kubenswrapper[5033]: E0226 18:14:00.145005 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896e93d2-2565-4c00-8157-31d3ff6767e1" containerName="oc"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.145024 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="896e93d2-2565-4c00-8157-31d3ff6767e1" containerName="oc"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.145326 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="896e93d2-2565-4c00-8157-31d3ff6767e1" containerName="oc"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.146247 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535494-kxw86"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.149531 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.149817 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.149940 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.159035 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535494-kxw86"]
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.285316 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgrw6\" (UniqueName: \"kubernetes.io/projected/0c91e220-3586-4685-9eef-c9c8405a00ad-kube-api-access-bgrw6\") pod \"auto-csr-approver-29535494-kxw86\" (UID: \"0c91e220-3586-4685-9eef-c9c8405a00ad\") " pod="openshift-infra/auto-csr-approver-29535494-kxw86"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.387185 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgrw6\" (UniqueName: \"kubernetes.io/projected/0c91e220-3586-4685-9eef-c9c8405a00ad-kube-api-access-bgrw6\") pod \"auto-csr-approver-29535494-kxw86\" (UID: \"0c91e220-3586-4685-9eef-c9c8405a00ad\") " pod="openshift-infra/auto-csr-approver-29535494-kxw86"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.415592 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgrw6\" (UniqueName: \"kubernetes.io/projected/0c91e220-3586-4685-9eef-c9c8405a00ad-kube-api-access-bgrw6\") pod \"auto-csr-approver-29535494-kxw86\" (UID: \"0c91e220-3586-4685-9eef-c9c8405a00ad\") " pod="openshift-infra/auto-csr-approver-29535494-kxw86"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.476205 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535494-kxw86"
Feb 26 18:14:00 crc kubenswrapper[5033]: I0226 18:14:00.987908 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535494-kxw86"]
Feb 26 18:14:01 crc kubenswrapper[5033]: I0226 18:14:01.957203 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535494-kxw86" event={"ID":"0c91e220-3586-4685-9eef-c9c8405a00ad","Type":"ContainerStarted","Data":"0f9d76a29516bd57f229e4d6f42f9a77455b2c7d72d0310be3b4a38006a6deef"}
Feb 26 18:14:03 crc kubenswrapper[5033]: I0226 18:14:03.983226 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535494-kxw86" event={"ID":"0c91e220-3586-4685-9eef-c9c8405a00ad","Type":"ContainerStarted","Data":"50b171ed59f25034ea91a5eae0a9b04ec5de289cdce759dd22e2da2a73f48c39"}
Feb 26 18:14:04 crc kubenswrapper[5033]: I0226 18:14:04.002544 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535494-kxw86" podStartSLOduration=2.050063929 podStartE2EDuration="4.002523544s" podCreationTimestamp="2026-02-26 18:14:00 +0000 UTC" firstStartedPulling="2026-02-26 18:14:01.011056722 +0000 UTC m=+9168.223655382" lastFinishedPulling="2026-02-26 18:14:02.963516337 +0000 UTC m=+9170.176114997" observedRunningTime="2026-02-26 18:14:03.999720636 +0000 UTC m=+9171.212319296" watchObservedRunningTime="2026-02-26 18:14:04.002523544 +0000 UTC m=+9171.215122194"
Feb 26 18:14:04 crc kubenswrapper[5033]: I0226 18:14:04.998832 5033 generic.go:334] "Generic (PLEG): container finished" podID="0c91e220-3586-4685-9eef-c9c8405a00ad" containerID="50b171ed59f25034ea91a5eae0a9b04ec5de289cdce759dd22e2da2a73f48c39" exitCode=0
Feb 26 18:14:04 crc kubenswrapper[5033]: I0226 18:14:04.999167 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535494-kxw86" event={"ID":"0c91e220-3586-4685-9eef-c9c8405a00ad","Type":"ContainerDied","Data":"50b171ed59f25034ea91a5eae0a9b04ec5de289cdce759dd22e2da2a73f48c39"}
Feb 26 18:14:06 crc kubenswrapper[5033]: I0226 18:14:06.426945 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535494-kxw86"
Feb 26 18:14:06 crc kubenswrapper[5033]: I0226 18:14:06.529090 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgrw6\" (UniqueName: \"kubernetes.io/projected/0c91e220-3586-4685-9eef-c9c8405a00ad-kube-api-access-bgrw6\") pod \"0c91e220-3586-4685-9eef-c9c8405a00ad\" (UID: \"0c91e220-3586-4685-9eef-c9c8405a00ad\") "
Feb 26 18:14:06 crc kubenswrapper[5033]: I0226 18:14:06.537805 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c91e220-3586-4685-9eef-c9c8405a00ad-kube-api-access-bgrw6" (OuterVolumeSpecName: "kube-api-access-bgrw6") pod "0c91e220-3586-4685-9eef-c9c8405a00ad" (UID: "0c91e220-3586-4685-9eef-c9c8405a00ad"). InnerVolumeSpecName "kube-api-access-bgrw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:14:06 crc kubenswrapper[5033]: I0226 18:14:06.632223 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgrw6\" (UniqueName: \"kubernetes.io/projected/0c91e220-3586-4685-9eef-c9c8405a00ad-kube-api-access-bgrw6\") on node \"crc\" DevicePath \"\"" Feb 26 18:14:06 crc kubenswrapper[5033]: I0226 18:14:06.734864 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535488-8wks2"] Feb 26 18:14:06 crc kubenswrapper[5033]: I0226 18:14:06.745187 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535488-8wks2"] Feb 26 18:14:07 crc kubenswrapper[5033]: I0226 18:14:07.031809 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535494-kxw86" event={"ID":"0c91e220-3586-4685-9eef-c9c8405a00ad","Type":"ContainerDied","Data":"0f9d76a29516bd57f229e4d6f42f9a77455b2c7d72d0310be3b4a38006a6deef"} Feb 26 18:14:07 crc kubenswrapper[5033]: I0226 18:14:07.031862 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f9d76a29516bd57f229e4d6f42f9a77455b2c7d72d0310be3b4a38006a6deef" Feb 26 18:14:07 crc kubenswrapper[5033]: I0226 18:14:07.031900 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535494-kxw86" Feb 26 18:14:07 crc kubenswrapper[5033]: I0226 18:14:07.827798 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc73b1db-9563-497f-b1e9-1a6381d79e17" path="/var/lib/kubelet/pods/dc73b1db-9563-497f-b1e9-1a6381d79e17/volumes" Feb 26 18:14:11 crc kubenswrapper[5033]: I0226 18:14:11.760122 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 26 18:14:11 crc kubenswrapper[5033]: I0226 18:14:11.760888 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="f59560e4-f5b7-4062-b7ef-c8b265d2dd0b" containerName="adoption" containerID="cri-o://479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178" gracePeriod=30 Feb 26 18:14:20 crc kubenswrapper[5033]: I0226 18:14:20.188533 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:14:20 crc kubenswrapper[5033]: I0226 18:14:20.189225 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:14:20 crc kubenswrapper[5033]: I0226 18:14:20.189488 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 18:14:20 crc kubenswrapper[5033]: I0226 18:14:20.190473 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 18:14:20 crc kubenswrapper[5033]: I0226 18:14:20.190533 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" gracePeriod=600 Feb 26 18:14:20 crc kubenswrapper[5033]: E0226 18:14:20.319313 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:14:20 crc kubenswrapper[5033]: E0226 18:14:20.325475 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b30da39_cf26_480f_82fc_dbb869e9acbd.slice/crio-conmon-51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23.scope\": RecentStats: unable to find data in memory cache]" Feb 26 18:14:21 crc kubenswrapper[5033]: I0226 18:14:21.195123 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" exitCode=0 Feb 26 18:14:21 crc kubenswrapper[5033]: I0226 18:14:21.196035 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23"} Feb 26 18:14:21 crc kubenswrapper[5033]: I0226 18:14:21.196149 5033 scope.go:117] "RemoveContainer" containerID="4e59aefdf6606d8a6c8fb0cbff2ade0955939bcd9d9ba59b237ee2cd1f8472b2" Feb 26 18:14:21 crc kubenswrapper[5033]: I0226 18:14:21.197298 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:14:21 crc kubenswrapper[5033]: E0226 18:14:21.197893 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:14:34 crc kubenswrapper[5033]: I0226 18:14:34.628475 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:14:34 crc kubenswrapper[5033]: E0226 18:14:34.629274 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.296910 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.443120 5033 generic.go:334] "Generic (PLEG): container finished" podID="f59560e4-f5b7-4062-b7ef-c8b265d2dd0b" containerID="479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178" exitCode=137 Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.443160 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b","Type":"ContainerDied","Data":"479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178"} Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.443186 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b","Type":"ContainerDied","Data":"fc2f38cfc09bf62c959173f83ac2406e92c64e9f9245634e2bc00c6f9eb5b2a0"} Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.443202 5033 scope.go:117] "RemoveContainer" containerID="479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178" Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.443196 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.454741 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\") pod \"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b\" (UID: \"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b\") " Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.454893 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nblcx\" (UniqueName: \"kubernetes.io/projected/f59560e4-f5b7-4062-b7ef-c8b265d2dd0b-kube-api-access-nblcx\") pod \"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b\" (UID: \"f59560e4-f5b7-4062-b7ef-c8b265d2dd0b\") " Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.461631 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59560e4-f5b7-4062-b7ef-c8b265d2dd0b-kube-api-access-nblcx" (OuterVolumeSpecName: "kube-api-access-nblcx") pod "f59560e4-f5b7-4062-b7ef-c8b265d2dd0b" (UID: "f59560e4-f5b7-4062-b7ef-c8b265d2dd0b"). InnerVolumeSpecName "kube-api-access-nblcx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.497534 5033 scope.go:117] "RemoveContainer" containerID="479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178" Feb 26 18:14:42 crc kubenswrapper[5033]: E0226 18:14:42.497978 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178\": container with ID starting with 479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178 not found: ID does not exist" containerID="479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178" Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.498013 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178"} err="failed to get container status \"479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178\": rpc error: code = NotFound desc = could not find container \"479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178\": container with ID starting with 479adf24945977fb73a3051190d8b4974c10b57cc4c1119b960d824e45442178 not found: ID does not exist" Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.498332 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a894638-dcd4-4df8-a254-3743eefa6a24" (OuterVolumeSpecName: "mariadb-data") pod "f59560e4-f5b7-4062-b7ef-c8b265d2dd0b" (UID: "f59560e4-f5b7-4062-b7ef-c8b265d2dd0b"). InnerVolumeSpecName "pvc-4a894638-dcd4-4df8-a254-3743eefa6a24". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.557947 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\") on node \"crc\" " Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.558236 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nblcx\" (UniqueName: \"kubernetes.io/projected/f59560e4-f5b7-4062-b7ef-c8b265d2dd0b-kube-api-access-nblcx\") on node \"crc\" DevicePath \"\"" Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.597744 5033 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.597902 5033 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4a894638-dcd4-4df8-a254-3743eefa6a24" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a894638-dcd4-4df8-a254-3743eefa6a24") on node "crc" Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.660399 5033 reconciler_common.go:293] "Volume detached for volume \"pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a894638-dcd4-4df8-a254-3743eefa6a24\") on node \"crc\" DevicePath \"\"" Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.782095 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 26 18:14:42 crc kubenswrapper[5033]: I0226 18:14:42.791270 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Feb 26 18:14:43 crc kubenswrapper[5033]: I0226 18:14:43.554634 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 26 18:14:43 crc kubenswrapper[5033]: I0226 18:14:43.555191 5033 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="682c1a69-619b-4f26-bfdf-ad71f13cecfc" containerName="adoption" containerID="cri-o://833d7444db08b6840f10a7b02e1ed045b641e83d121de9a99e3e283ef03c933c" gracePeriod=30 Feb 26 18:14:43 crc kubenswrapper[5033]: I0226 18:14:43.641701 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f59560e4-f5b7-4062-b7ef-c8b265d2dd0b" path="/var/lib/kubelet/pods/f59560e4-f5b7-4062-b7ef-c8b265d2dd0b/volumes" Feb 26 18:14:44 crc kubenswrapper[5033]: I0226 18:14:44.234627 5033 scope.go:117] "RemoveContainer" containerID="b69f25c489f9376c576a4987a573d52239f075a91ce5bcb8ad601c0030bf0c08" Feb 26 18:14:48 crc kubenswrapper[5033]: I0226 18:14:48.627916 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:14:48 crc kubenswrapper[5033]: E0226 18:14:48.628765 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.155017 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s"] Feb 26 18:15:00 crc kubenswrapper[5033]: E0226 18:15:00.156143 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f59560e4-f5b7-4062-b7ef-c8b265d2dd0b" containerName="adoption" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.156165 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59560e4-f5b7-4062-b7ef-c8b265d2dd0b" containerName="adoption" Feb 26 18:15:00 crc kubenswrapper[5033]: E0226 
18:15:00.156231 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c91e220-3586-4685-9eef-c9c8405a00ad" containerName="oc" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.156238 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c91e220-3586-4685-9eef-c9c8405a00ad" containerName="oc" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.156504 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c91e220-3586-4685-9eef-c9c8405a00ad" containerName="oc" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.156540 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f59560e4-f5b7-4062-b7ef-c8b265d2dd0b" containerName="adoption" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.157546 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.161521 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.161766 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.166971 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s"] Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.257973 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0de6943d-8939-4983-a5a5-3767948cacfd-config-volume\") pod \"collect-profiles-29535495-pdf6s\" (UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:00 crc kubenswrapper[5033]: 
I0226 18:15:00.258389 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8zf8\" (UniqueName: \"kubernetes.io/projected/0de6943d-8939-4983-a5a5-3767948cacfd-kube-api-access-v8zf8\") pod \"collect-profiles-29535495-pdf6s\" (UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.258445 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0de6943d-8939-4983-a5a5-3767948cacfd-secret-volume\") pod \"collect-profiles-29535495-pdf6s\" (UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.362107 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0de6943d-8939-4983-a5a5-3767948cacfd-config-volume\") pod \"collect-profiles-29535495-pdf6s\" (UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.362395 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8zf8\" (UniqueName: \"kubernetes.io/projected/0de6943d-8939-4983-a5a5-3767948cacfd-kube-api-access-v8zf8\") pod \"collect-profiles-29535495-pdf6s\" (UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.362488 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0de6943d-8939-4983-a5a5-3767948cacfd-secret-volume\") pod \"collect-profiles-29535495-pdf6s\" 
(UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.363121 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0de6943d-8939-4983-a5a5-3767948cacfd-config-volume\") pod \"collect-profiles-29535495-pdf6s\" (UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.673315 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0de6943d-8939-4983-a5a5-3767948cacfd-secret-volume\") pod \"collect-profiles-29535495-pdf6s\" (UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.673530 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8zf8\" (UniqueName: \"kubernetes.io/projected/0de6943d-8939-4983-a5a5-3767948cacfd-kube-api-access-v8zf8\") pod \"collect-profiles-29535495-pdf6s\" (UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:00 crc kubenswrapper[5033]: I0226 18:15:00.790983 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:01 crc kubenswrapper[5033]: I0226 18:15:01.286557 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s"] Feb 26 18:15:01 crc kubenswrapper[5033]: I0226 18:15:01.652833 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" event={"ID":"0de6943d-8939-4983-a5a5-3767948cacfd","Type":"ContainerStarted","Data":"4518e65e71d730bdea4059dfe2bbca6275a539ed89da7d5a85f04462f064a0b2"} Feb 26 18:15:01 crc kubenswrapper[5033]: I0226 18:15:01.653173 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" event={"ID":"0de6943d-8939-4983-a5a5-3767948cacfd","Type":"ContainerStarted","Data":"e2d91597b3e4f34edb21b7375b60ffd8db01db559778dee94011cc153995c6d1"} Feb 26 18:15:01 crc kubenswrapper[5033]: I0226 18:15:01.672871 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" podStartSLOduration=1.6728510760000002 podStartE2EDuration="1.672851076s" podCreationTimestamp="2026-02-26 18:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 18:15:01.672123929 +0000 UTC m=+9228.884722579" watchObservedRunningTime="2026-02-26 18:15:01.672851076 +0000 UTC m=+9228.885449736" Feb 26 18:15:02 crc kubenswrapper[5033]: I0226 18:15:02.664123 5033 generic.go:334] "Generic (PLEG): container finished" podID="0de6943d-8939-4983-a5a5-3767948cacfd" containerID="4518e65e71d730bdea4059dfe2bbca6275a539ed89da7d5a85f04462f064a0b2" exitCode=0 Feb 26 18:15:02 crc kubenswrapper[5033]: I0226 18:15:02.664232 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" event={"ID":"0de6943d-8939-4983-a5a5-3767948cacfd","Type":"ContainerDied","Data":"4518e65e71d730bdea4059dfe2bbca6275a539ed89da7d5a85f04462f064a0b2"} Feb 26 18:15:03 crc kubenswrapper[5033]: I0226 18:15:03.636041 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:15:03 crc kubenswrapper[5033]: E0226 18:15:03.636384 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.051664 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.150560 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0de6943d-8939-4983-a5a5-3767948cacfd-config-volume\") pod \"0de6943d-8939-4983-a5a5-3767948cacfd\" (UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.150625 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0de6943d-8939-4983-a5a5-3767948cacfd-secret-volume\") pod \"0de6943d-8939-4983-a5a5-3767948cacfd\" (UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.150649 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8zf8\" (UniqueName: \"kubernetes.io/projected/0de6943d-8939-4983-a5a5-3767948cacfd-kube-api-access-v8zf8\") pod \"0de6943d-8939-4983-a5a5-3767948cacfd\" (UID: \"0de6943d-8939-4983-a5a5-3767948cacfd\") " Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.151362 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0de6943d-8939-4983-a5a5-3767948cacfd-config-volume" (OuterVolumeSpecName: "config-volume") pod "0de6943d-8939-4983-a5a5-3767948cacfd" (UID: "0de6943d-8939-4983-a5a5-3767948cacfd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.158557 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0de6943d-8939-4983-a5a5-3767948cacfd-kube-api-access-v8zf8" (OuterVolumeSpecName: "kube-api-access-v8zf8") pod "0de6943d-8939-4983-a5a5-3767948cacfd" (UID: "0de6943d-8939-4983-a5a5-3767948cacfd"). 
InnerVolumeSpecName "kube-api-access-v8zf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.159659 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0de6943d-8939-4983-a5a5-3767948cacfd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0de6943d-8939-4983-a5a5-3767948cacfd" (UID: "0de6943d-8939-4983-a5a5-3767948cacfd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.253051 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0de6943d-8939-4983-a5a5-3767948cacfd-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.253104 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0de6943d-8939-4983-a5a5-3767948cacfd-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.253119 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8zf8\" (UniqueName: \"kubernetes.io/projected/0de6943d-8939-4983-a5a5-3767948cacfd-kube-api-access-v8zf8\") on node \"crc\" DevicePath \"\"" Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.374661 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk"] Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.390597 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535450-rtxgk"] Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.687638 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" 
event={"ID":"0de6943d-8939-4983-a5a5-3767948cacfd","Type":"ContainerDied","Data":"e2d91597b3e4f34edb21b7375b60ffd8db01db559778dee94011cc153995c6d1"} Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.687686 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2d91597b3e4f34edb21b7375b60ffd8db01db559778dee94011cc153995c6d1" Feb 26 18:15:04 crc kubenswrapper[5033]: I0226 18:15:04.687735 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535495-pdf6s" Feb 26 18:15:05 crc kubenswrapper[5033]: I0226 18:15:05.642514 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc" path="/var/lib/kubelet/pods/4d6a980a-6455-42f0-ad1a-8e1f75b1f0dc/volumes" Feb 26 18:15:13 crc kubenswrapper[5033]: I0226 18:15:13.787158 5033 generic.go:334] "Generic (PLEG): container finished" podID="682c1a69-619b-4f26-bfdf-ad71f13cecfc" containerID="833d7444db08b6840f10a7b02e1ed045b641e83d121de9a99e3e283ef03c933c" exitCode=137 Feb 26 18:15:13 crc kubenswrapper[5033]: I0226 18:15:13.787296 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"682c1a69-619b-4f26-bfdf-ad71f13cecfc","Type":"ContainerDied","Data":"833d7444db08b6840f10a7b02e1ed045b641e83d121de9a99e3e283ef03c933c"} Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.033379 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.193202 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6781f54e-f208-4b55-828c-2ea6c2101239\") pod \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.193283 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkjdz\" (UniqueName: \"kubernetes.io/projected/682c1a69-619b-4f26-bfdf-ad71f13cecfc-kube-api-access-fkjdz\") pod \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.193449 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/682c1a69-619b-4f26-bfdf-ad71f13cecfc-ovn-data-cert\") pod \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\" (UID: \"682c1a69-619b-4f26-bfdf-ad71f13cecfc\") " Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.203814 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/682c1a69-619b-4f26-bfdf-ad71f13cecfc-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "682c1a69-619b-4f26-bfdf-ad71f13cecfc" (UID: "682c1a69-619b-4f26-bfdf-ad71f13cecfc"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.203939 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/682c1a69-619b-4f26-bfdf-ad71f13cecfc-kube-api-access-fkjdz" (OuterVolumeSpecName: "kube-api-access-fkjdz") pod "682c1a69-619b-4f26-bfdf-ad71f13cecfc" (UID: "682c1a69-619b-4f26-bfdf-ad71f13cecfc"). InnerVolumeSpecName "kube-api-access-fkjdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.217002 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6781f54e-f208-4b55-828c-2ea6c2101239" (OuterVolumeSpecName: "ovn-data") pod "682c1a69-619b-4f26-bfdf-ad71f13cecfc" (UID: "682c1a69-619b-4f26-bfdf-ad71f13cecfc"). InnerVolumeSpecName "pvc-6781f54e-f208-4b55-828c-2ea6c2101239". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.296428 5033 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6781f54e-f208-4b55-828c-2ea6c2101239\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6781f54e-f208-4b55-828c-2ea6c2101239\") on node \"crc\" " Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.296479 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkjdz\" (UniqueName: \"kubernetes.io/projected/682c1a69-619b-4f26-bfdf-ad71f13cecfc-kube-api-access-fkjdz\") on node \"crc\" DevicePath \"\"" Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.296494 5033 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/682c1a69-619b-4f26-bfdf-ad71f13cecfc-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.327227 5033 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.327449 5033 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6781f54e-f208-4b55-828c-2ea6c2101239" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6781f54e-f208-4b55-828c-2ea6c2101239") on node "crc" Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.398447 5033 reconciler_common.go:293] "Volume detached for volume \"pvc-6781f54e-f208-4b55-828c-2ea6c2101239\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6781f54e-f208-4b55-828c-2ea6c2101239\") on node \"crc\" DevicePath \"\"" Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.628085 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:15:14 crc kubenswrapper[5033]: E0226 18:15:14.628507 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.796970 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"682c1a69-619b-4f26-bfdf-ad71f13cecfc","Type":"ContainerDied","Data":"4909492c03c7dfa660b98d4596f10df2918238057bd48f185b5daa27b1423c5c"} Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.797026 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.797292 5033 scope.go:117] "RemoveContainer" containerID="833d7444db08b6840f10a7b02e1ed045b641e83d121de9a99e3e283ef03c933c" Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.844090 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 26 18:15:14 crc kubenswrapper[5033]: I0226 18:15:14.855519 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Feb 26 18:15:15 crc kubenswrapper[5033]: I0226 18:15:15.639520 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="682c1a69-619b-4f26-bfdf-ad71f13cecfc" path="/var/lib/kubelet/pods/682c1a69-619b-4f26-bfdf-ad71f13cecfc/volumes" Feb 26 18:15:27 crc kubenswrapper[5033]: I0226 18:15:27.629400 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:15:27 crc kubenswrapper[5033]: E0226 18:15:27.630905 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:15:39 crc kubenswrapper[5033]: I0226 18:15:39.628778 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:15:39 crc kubenswrapper[5033]: E0226 18:15:39.629593 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:15:44 crc kubenswrapper[5033]: I0226 18:15:44.348033 5033 scope.go:117] "RemoveContainer" containerID="2b33d8ab4f61a15acb7884e0ff3a4169ad51e6f4d76821d24773e5345820b1ae" Feb 26 18:15:52 crc kubenswrapper[5033]: I0226 18:15:52.628900 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:15:52 crc kubenswrapper[5033]: E0226 18:15:52.630806 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.147419 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535496-5h9jl"] Feb 26 18:16:00 crc kubenswrapper[5033]: E0226 18:16:00.148586 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="682c1a69-619b-4f26-bfdf-ad71f13cecfc" containerName="adoption" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.148603 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="682c1a69-619b-4f26-bfdf-ad71f13cecfc" containerName="adoption" Feb 26 18:16:00 crc kubenswrapper[5033]: E0226 18:16:00.148637 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de6943d-8939-4983-a5a5-3767948cacfd" containerName="collect-profiles" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.148646 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de6943d-8939-4983-a5a5-3767948cacfd" 
containerName="collect-profiles" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.148934 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0de6943d-8939-4983-a5a5-3767948cacfd" containerName="collect-profiles" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.148965 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="682c1a69-619b-4f26-bfdf-ad71f13cecfc" containerName="adoption" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.149949 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535496-5h9jl" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.152597 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.153682 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.153830 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.158592 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535496-5h9jl"] Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.208994 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n8lm\" (UniqueName: \"kubernetes.io/projected/68494582-a7e8-4967-8e7b-1c03dd5627e2-kube-api-access-8n8lm\") pod \"auto-csr-approver-29535496-5h9jl\" (UID: \"68494582-a7e8-4967-8e7b-1c03dd5627e2\") " pod="openshift-infra/auto-csr-approver-29535496-5h9jl" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.312040 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n8lm\" (UniqueName: 
\"kubernetes.io/projected/68494582-a7e8-4967-8e7b-1c03dd5627e2-kube-api-access-8n8lm\") pod \"auto-csr-approver-29535496-5h9jl\" (UID: \"68494582-a7e8-4967-8e7b-1c03dd5627e2\") " pod="openshift-infra/auto-csr-approver-29535496-5h9jl" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.330636 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n8lm\" (UniqueName: \"kubernetes.io/projected/68494582-a7e8-4967-8e7b-1c03dd5627e2-kube-api-access-8n8lm\") pod \"auto-csr-approver-29535496-5h9jl\" (UID: \"68494582-a7e8-4967-8e7b-1c03dd5627e2\") " pod="openshift-infra/auto-csr-approver-29535496-5h9jl" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.475702 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535496-5h9jl" Feb 26 18:16:00 crc kubenswrapper[5033]: I0226 18:16:00.956098 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535496-5h9jl"] Feb 26 18:16:01 crc kubenswrapper[5033]: I0226 18:16:01.294217 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535496-5h9jl" event={"ID":"68494582-a7e8-4967-8e7b-1c03dd5627e2","Type":"ContainerStarted","Data":"035266a70a15e07ef9def4b7c1d626be199fee0fb1f805c9863aab9d43a5bd5d"} Feb 26 18:16:03 crc kubenswrapper[5033]: I0226 18:16:03.330541 5033 generic.go:334] "Generic (PLEG): container finished" podID="68494582-a7e8-4967-8e7b-1c03dd5627e2" containerID="6bbc5e6602b3e2056537d0a9e57f16af4b1ca4f0debd8012ff615ff587083b73" exitCode=0 Feb 26 18:16:03 crc kubenswrapper[5033]: I0226 18:16:03.330911 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535496-5h9jl" event={"ID":"68494582-a7e8-4967-8e7b-1c03dd5627e2","Type":"ContainerDied","Data":"6bbc5e6602b3e2056537d0a9e57f16af4b1ca4f0debd8012ff615ff587083b73"} Feb 26 18:16:04 crc kubenswrapper[5033]: I0226 18:16:04.720082 5033 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535496-5h9jl" Feb 26 18:16:04 crc kubenswrapper[5033]: I0226 18:16:04.908475 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n8lm\" (UniqueName: \"kubernetes.io/projected/68494582-a7e8-4967-8e7b-1c03dd5627e2-kube-api-access-8n8lm\") pod \"68494582-a7e8-4967-8e7b-1c03dd5627e2\" (UID: \"68494582-a7e8-4967-8e7b-1c03dd5627e2\") " Feb 26 18:16:04 crc kubenswrapper[5033]: I0226 18:16:04.915050 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68494582-a7e8-4967-8e7b-1c03dd5627e2-kube-api-access-8n8lm" (OuterVolumeSpecName: "kube-api-access-8n8lm") pod "68494582-a7e8-4967-8e7b-1c03dd5627e2" (UID: "68494582-a7e8-4967-8e7b-1c03dd5627e2"). InnerVolumeSpecName "kube-api-access-8n8lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:16:05 crc kubenswrapper[5033]: I0226 18:16:05.011164 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n8lm\" (UniqueName: \"kubernetes.io/projected/68494582-a7e8-4967-8e7b-1c03dd5627e2-kube-api-access-8n8lm\") on node \"crc\" DevicePath \"\"" Feb 26 18:16:05 crc kubenswrapper[5033]: I0226 18:16:05.352224 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535496-5h9jl" event={"ID":"68494582-a7e8-4967-8e7b-1c03dd5627e2","Type":"ContainerDied","Data":"035266a70a15e07ef9def4b7c1d626be199fee0fb1f805c9863aab9d43a5bd5d"} Feb 26 18:16:05 crc kubenswrapper[5033]: I0226 18:16:05.352298 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="035266a70a15e07ef9def4b7c1d626be199fee0fb1f805c9863aab9d43a5bd5d" Feb 26 18:16:05 crc kubenswrapper[5033]: I0226 18:16:05.352340 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535496-5h9jl" Feb 26 18:16:05 crc kubenswrapper[5033]: I0226 18:16:05.786495 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535490-cg7q5"] Feb 26 18:16:05 crc kubenswrapper[5033]: I0226 18:16:05.795236 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535490-cg7q5"] Feb 26 18:16:06 crc kubenswrapper[5033]: I0226 18:16:06.628136 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:16:06 crc kubenswrapper[5033]: E0226 18:16:06.628711 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:16:07 crc kubenswrapper[5033]: I0226 18:16:07.643361 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de" path="/var/lib/kubelet/pods/93e69f37-5cb9-4f7c-a4fc-6d658cf8e7de/volumes" Feb 26 18:16:17 crc kubenswrapper[5033]: I0226 18:16:17.628775 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:16:17 crc kubenswrapper[5033]: E0226 18:16:17.629621 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.080752 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5586k/must-gather-hm6vr"] Feb 26 18:16:20 crc kubenswrapper[5033]: E0226 18:16:20.081746 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68494582-a7e8-4967-8e7b-1c03dd5627e2" containerName="oc" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.081759 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="68494582-a7e8-4967-8e7b-1c03dd5627e2" containerName="oc" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.081964 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="68494582-a7e8-4967-8e7b-1c03dd5627e2" containerName="oc" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.083175 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5586k/must-gather-hm6vr" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.085409 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5586k"/"default-dockercfg-rwvpd" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.087604 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5586k"/"kube-root-ca.crt" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.087607 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5586k"/"openshift-service-ca.crt" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.095754 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5586k/must-gather-hm6vr"] Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.232578 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/715a121f-d869-4599-8718-2c62220d5e2b-must-gather-output\") pod 
\"must-gather-hm6vr\" (UID: \"715a121f-d869-4599-8718-2c62220d5e2b\") " pod="openshift-must-gather-5586k/must-gather-hm6vr" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.232650 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s25j\" (UniqueName: \"kubernetes.io/projected/715a121f-d869-4599-8718-2c62220d5e2b-kube-api-access-6s25j\") pod \"must-gather-hm6vr\" (UID: \"715a121f-d869-4599-8718-2c62220d5e2b\") " pod="openshift-must-gather-5586k/must-gather-hm6vr" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.335001 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/715a121f-d869-4599-8718-2c62220d5e2b-must-gather-output\") pod \"must-gather-hm6vr\" (UID: \"715a121f-d869-4599-8718-2c62220d5e2b\") " pod="openshift-must-gather-5586k/must-gather-hm6vr" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.335078 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s25j\" (UniqueName: \"kubernetes.io/projected/715a121f-d869-4599-8718-2c62220d5e2b-kube-api-access-6s25j\") pod \"must-gather-hm6vr\" (UID: \"715a121f-d869-4599-8718-2c62220d5e2b\") " pod="openshift-must-gather-5586k/must-gather-hm6vr" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.335743 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/715a121f-d869-4599-8718-2c62220d5e2b-must-gather-output\") pod \"must-gather-hm6vr\" (UID: \"715a121f-d869-4599-8718-2c62220d5e2b\") " pod="openshift-must-gather-5586k/must-gather-hm6vr" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.358303 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s25j\" (UniqueName: \"kubernetes.io/projected/715a121f-d869-4599-8718-2c62220d5e2b-kube-api-access-6s25j\") pod 
\"must-gather-hm6vr\" (UID: \"715a121f-d869-4599-8718-2c62220d5e2b\") " pod="openshift-must-gather-5586k/must-gather-hm6vr" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.405744 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5586k/must-gather-hm6vr" Feb 26 18:16:20 crc kubenswrapper[5033]: I0226 18:16:20.908590 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5586k/must-gather-hm6vr"] Feb 26 18:16:21 crc kubenswrapper[5033]: I0226 18:16:21.519536 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5586k/must-gather-hm6vr" event={"ID":"715a121f-d869-4599-8718-2c62220d5e2b","Type":"ContainerStarted","Data":"bfe80d96bb2bfe22d81bcae6c1499a513d6c7dd08f4bc57e10bea1b2e718dc2d"} Feb 26 18:16:28 crc kubenswrapper[5033]: I0226 18:16:28.611046 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5586k/must-gather-hm6vr" event={"ID":"715a121f-d869-4599-8718-2c62220d5e2b","Type":"ContainerStarted","Data":"a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9"} Feb 26 18:16:29 crc kubenswrapper[5033]: I0226 18:16:29.644927 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5586k/must-gather-hm6vr" event={"ID":"715a121f-d869-4599-8718-2c62220d5e2b","Type":"ContainerStarted","Data":"ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f"} Feb 26 18:16:29 crc kubenswrapper[5033]: I0226 18:16:29.646782 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5586k/must-gather-hm6vr" podStartSLOduration=2.371554596 podStartE2EDuration="9.646761231s" podCreationTimestamp="2026-02-26 18:16:20 +0000 UTC" firstStartedPulling="2026-02-26 18:16:20.917293416 +0000 UTC m=+9308.129892066" lastFinishedPulling="2026-02-26 18:16:28.192500051 +0000 UTC m=+9315.405098701" observedRunningTime="2026-02-26 18:16:29.641985255 +0000 UTC m=+9316.854583905" 
watchObservedRunningTime="2026-02-26 18:16:29.646761231 +0000 UTC m=+9316.859359881" Feb 26 18:16:30 crc kubenswrapper[5033]: I0226 18:16:30.627842 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:16:30 crc kubenswrapper[5033]: E0226 18:16:30.628439 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:16:33 crc kubenswrapper[5033]: I0226 18:16:33.099863 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5586k/crc-debug-bpznf"] Feb 26 18:16:33 crc kubenswrapper[5033]: I0226 18:16:33.102240 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-bpznf" Feb 26 18:16:33 crc kubenswrapper[5033]: I0226 18:16:33.248116 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6f6m\" (UniqueName: \"kubernetes.io/projected/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-kube-api-access-p6f6m\") pod \"crc-debug-bpznf\" (UID: \"c3ae57bb-c2d4-4249-859a-bd4a21e08e32\") " pod="openshift-must-gather-5586k/crc-debug-bpznf" Feb 26 18:16:33 crc kubenswrapper[5033]: I0226 18:16:33.248170 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-host\") pod \"crc-debug-bpznf\" (UID: \"c3ae57bb-c2d4-4249-859a-bd4a21e08e32\") " pod="openshift-must-gather-5586k/crc-debug-bpznf" Feb 26 18:16:33 crc kubenswrapper[5033]: I0226 18:16:33.350133 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6f6m\" (UniqueName: \"kubernetes.io/projected/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-kube-api-access-p6f6m\") pod \"crc-debug-bpznf\" (UID: \"c3ae57bb-c2d4-4249-859a-bd4a21e08e32\") " pod="openshift-must-gather-5586k/crc-debug-bpznf" Feb 26 18:16:33 crc kubenswrapper[5033]: I0226 18:16:33.350200 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-host\") pod \"crc-debug-bpznf\" (UID: \"c3ae57bb-c2d4-4249-859a-bd4a21e08e32\") " pod="openshift-must-gather-5586k/crc-debug-bpznf" Feb 26 18:16:33 crc kubenswrapper[5033]: I0226 18:16:33.350342 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-host\") pod \"crc-debug-bpznf\" (UID: \"c3ae57bb-c2d4-4249-859a-bd4a21e08e32\") " pod="openshift-must-gather-5586k/crc-debug-bpznf" Feb 26 18:16:33 crc 
kubenswrapper[5033]: I0226 18:16:33.375918 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6f6m\" (UniqueName: \"kubernetes.io/projected/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-kube-api-access-p6f6m\") pod \"crc-debug-bpznf\" (UID: \"c3ae57bb-c2d4-4249-859a-bd4a21e08e32\") " pod="openshift-must-gather-5586k/crc-debug-bpznf" Feb 26 18:16:33 crc kubenswrapper[5033]: I0226 18:16:33.425888 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-bpznf" Feb 26 18:16:33 crc kubenswrapper[5033]: I0226 18:16:33.670049 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5586k/crc-debug-bpznf" event={"ID":"c3ae57bb-c2d4-4249-859a-bd4a21e08e32","Type":"ContainerStarted","Data":"a44e5986950d8f80b2b207a409a897f365acd9928be18438718ef155c9001dd6"} Feb 26 18:16:41 crc kubenswrapper[5033]: I0226 18:16:41.627911 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:16:41 crc kubenswrapper[5033]: E0226 18:16:41.628818 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:16:44 crc kubenswrapper[5033]: I0226 18:16:44.648315 5033 scope.go:117] "RemoveContainer" containerID="f08303ed23a1bef0ffff0f2607f822a4e96a9fe068a1454b9c1b13add19070ba" Feb 26 18:16:47 crc kubenswrapper[5033]: I0226 18:16:47.827807 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5586k/crc-debug-bpznf" 
event={"ID":"c3ae57bb-c2d4-4249-859a-bd4a21e08e32","Type":"ContainerStarted","Data":"3e118299383e0e84ff7ad4294b0211dbe56fb69479fec65f025bd5a3c309fcd7"} Feb 26 18:16:47 crc kubenswrapper[5033]: I0226 18:16:47.863338 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5586k/crc-debug-bpznf" podStartSLOduration=1.6760866970000001 podStartE2EDuration="14.863102778s" podCreationTimestamp="2026-02-26 18:16:33 +0000 UTC" firstStartedPulling="2026-02-26 18:16:33.473421453 +0000 UTC m=+9320.686020103" lastFinishedPulling="2026-02-26 18:16:46.660437534 +0000 UTC m=+9333.873036184" observedRunningTime="2026-02-26 18:16:47.852607033 +0000 UTC m=+9335.065205693" watchObservedRunningTime="2026-02-26 18:16:47.863102778 +0000 UTC m=+9335.075701428" Feb 26 18:16:56 crc kubenswrapper[5033]: I0226 18:16:56.627978 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:16:56 crc kubenswrapper[5033]: E0226 18:16:56.628871 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:17:07 crc kubenswrapper[5033]: I0226 18:17:07.628054 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:17:07 crc kubenswrapper[5033]: E0226 18:17:07.628931 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:17:12 crc kubenswrapper[5033]: I0226 18:17:12.053324 5033 generic.go:334] "Generic (PLEG): container finished" podID="c3ae57bb-c2d4-4249-859a-bd4a21e08e32" containerID="3e118299383e0e84ff7ad4294b0211dbe56fb69479fec65f025bd5a3c309fcd7" exitCode=0 Feb 26 18:17:12 crc kubenswrapper[5033]: I0226 18:17:12.053417 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5586k/crc-debug-bpznf" event={"ID":"c3ae57bb-c2d4-4249-859a-bd4a21e08e32","Type":"ContainerDied","Data":"3e118299383e0e84ff7ad4294b0211dbe56fb69479fec65f025bd5a3c309fcd7"} Feb 26 18:17:13 crc kubenswrapper[5033]: I0226 18:17:13.172121 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-bpznf" Feb 26 18:17:13 crc kubenswrapper[5033]: I0226 18:17:13.215151 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5586k/crc-debug-bpznf"] Feb 26 18:17:13 crc kubenswrapper[5033]: I0226 18:17:13.227011 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5586k/crc-debug-bpznf"] Feb 26 18:17:13 crc kubenswrapper[5033]: I0226 18:17:13.255752 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-host\") pod \"c3ae57bb-c2d4-4249-859a-bd4a21e08e32\" (UID: \"c3ae57bb-c2d4-4249-859a-bd4a21e08e32\") " Feb 26 18:17:13 crc kubenswrapper[5033]: I0226 18:17:13.256113 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6f6m\" (UniqueName: \"kubernetes.io/projected/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-kube-api-access-p6f6m\") pod \"c3ae57bb-c2d4-4249-859a-bd4a21e08e32\" (UID: 
\"c3ae57bb-c2d4-4249-859a-bd4a21e08e32\") " Feb 26 18:17:13 crc kubenswrapper[5033]: I0226 18:17:13.256776 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-host" (OuterVolumeSpecName: "host") pod "c3ae57bb-c2d4-4249-859a-bd4a21e08e32" (UID: "c3ae57bb-c2d4-4249-859a-bd4a21e08e32"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 18:17:13 crc kubenswrapper[5033]: I0226 18:17:13.268552 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-kube-api-access-p6f6m" (OuterVolumeSpecName: "kube-api-access-p6f6m") pod "c3ae57bb-c2d4-4249-859a-bd4a21e08e32" (UID: "c3ae57bb-c2d4-4249-859a-bd4a21e08e32"). InnerVolumeSpecName "kube-api-access-p6f6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:17:13 crc kubenswrapper[5033]: I0226 18:17:13.358418 5033 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-host\") on node \"crc\" DevicePath \"\"" Feb 26 18:17:13 crc kubenswrapper[5033]: I0226 18:17:13.358703 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6f6m\" (UniqueName: \"kubernetes.io/projected/c3ae57bb-c2d4-4249-859a-bd4a21e08e32-kube-api-access-p6f6m\") on node \"crc\" DevicePath \"\"" Feb 26 18:17:13 crc kubenswrapper[5033]: I0226 18:17:13.651881 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ae57bb-c2d4-4249-859a-bd4a21e08e32" path="/var/lib/kubelet/pods/c3ae57bb-c2d4-4249-859a-bd4a21e08e32/volumes" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.072399 5033 scope.go:117] "RemoveContainer" containerID="3e118299383e0e84ff7ad4294b0211dbe56fb69479fec65f025bd5a3c309fcd7" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.072425 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-bpznf" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.373889 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5586k/crc-debug-nd4xh"] Feb 26 18:17:14 crc kubenswrapper[5033]: E0226 18:17:14.375657 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ae57bb-c2d4-4249-859a-bd4a21e08e32" containerName="container-00" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.375750 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ae57bb-c2d4-4249-859a-bd4a21e08e32" containerName="container-00" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.377350 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3ae57bb-c2d4-4249-859a-bd4a21e08e32" containerName="container-00" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.378487 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-nd4xh" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.477999 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/400f9346-1f4f-4a42-a90b-7c0099cdc293-host\") pod \"crc-debug-nd4xh\" (UID: \"400f9346-1f4f-4a42-a90b-7c0099cdc293\") " pod="openshift-must-gather-5586k/crc-debug-nd4xh" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.478247 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm6b2\" (UniqueName: \"kubernetes.io/projected/400f9346-1f4f-4a42-a90b-7c0099cdc293-kube-api-access-xm6b2\") pod \"crc-debug-nd4xh\" (UID: \"400f9346-1f4f-4a42-a90b-7c0099cdc293\") " pod="openshift-must-gather-5586k/crc-debug-nd4xh" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.580769 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm6b2\" (UniqueName: 
\"kubernetes.io/projected/400f9346-1f4f-4a42-a90b-7c0099cdc293-kube-api-access-xm6b2\") pod \"crc-debug-nd4xh\" (UID: \"400f9346-1f4f-4a42-a90b-7c0099cdc293\") " pod="openshift-must-gather-5586k/crc-debug-nd4xh" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.580865 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/400f9346-1f4f-4a42-a90b-7c0099cdc293-host\") pod \"crc-debug-nd4xh\" (UID: \"400f9346-1f4f-4a42-a90b-7c0099cdc293\") " pod="openshift-must-gather-5586k/crc-debug-nd4xh" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.580929 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/400f9346-1f4f-4a42-a90b-7c0099cdc293-host\") pod \"crc-debug-nd4xh\" (UID: \"400f9346-1f4f-4a42-a90b-7c0099cdc293\") " pod="openshift-must-gather-5586k/crc-debug-nd4xh" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.599382 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm6b2\" (UniqueName: \"kubernetes.io/projected/400f9346-1f4f-4a42-a90b-7c0099cdc293-kube-api-access-xm6b2\") pod \"crc-debug-nd4xh\" (UID: \"400f9346-1f4f-4a42-a90b-7c0099cdc293\") " pod="openshift-must-gather-5586k/crc-debug-nd4xh" Feb 26 18:17:14 crc kubenswrapper[5033]: I0226 18:17:14.695497 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-nd4xh" Feb 26 18:17:15 crc kubenswrapper[5033]: I0226 18:17:15.109800 5033 generic.go:334] "Generic (PLEG): container finished" podID="400f9346-1f4f-4a42-a90b-7c0099cdc293" containerID="f596d0675c57e2a76cf3a1b96da531d3aae780a978877aae41bbc0924c24065e" exitCode=0 Feb 26 18:17:15 crc kubenswrapper[5033]: I0226 18:17:15.110029 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5586k/crc-debug-nd4xh" event={"ID":"400f9346-1f4f-4a42-a90b-7c0099cdc293","Type":"ContainerDied","Data":"f596d0675c57e2a76cf3a1b96da531d3aae780a978877aae41bbc0924c24065e"} Feb 26 18:17:15 crc kubenswrapper[5033]: I0226 18:17:15.110636 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5586k/crc-debug-nd4xh" event={"ID":"400f9346-1f4f-4a42-a90b-7c0099cdc293","Type":"ContainerStarted","Data":"0500672d66cf822bd0c0a47634bae0ed4c07c521af5ce83505ae631dc56d9f1e"} Feb 26 18:17:15 crc kubenswrapper[5033]: I0226 18:17:15.272610 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5586k/crc-debug-nd4xh"] Feb 26 18:17:15 crc kubenswrapper[5033]: I0226 18:17:15.284088 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5586k/crc-debug-nd4xh"] Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.234525 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-nd4xh" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.418404 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/400f9346-1f4f-4a42-a90b-7c0099cdc293-host\") pod \"400f9346-1f4f-4a42-a90b-7c0099cdc293\" (UID: \"400f9346-1f4f-4a42-a90b-7c0099cdc293\") " Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.418489 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm6b2\" (UniqueName: \"kubernetes.io/projected/400f9346-1f4f-4a42-a90b-7c0099cdc293-kube-api-access-xm6b2\") pod \"400f9346-1f4f-4a42-a90b-7c0099cdc293\" (UID: \"400f9346-1f4f-4a42-a90b-7c0099cdc293\") " Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.418548 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/400f9346-1f4f-4a42-a90b-7c0099cdc293-host" (OuterVolumeSpecName: "host") pod "400f9346-1f4f-4a42-a90b-7c0099cdc293" (UID: "400f9346-1f4f-4a42-a90b-7c0099cdc293"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.419355 5033 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/400f9346-1f4f-4a42-a90b-7c0099cdc293-host\") on node \"crc\" DevicePath \"\"" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.435339 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/400f9346-1f4f-4a42-a90b-7c0099cdc293-kube-api-access-xm6b2" (OuterVolumeSpecName: "kube-api-access-xm6b2") pod "400f9346-1f4f-4a42-a90b-7c0099cdc293" (UID: "400f9346-1f4f-4a42-a90b-7c0099cdc293"). InnerVolumeSpecName "kube-api-access-xm6b2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.443634 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5586k/crc-debug-wfjwh"] Feb 26 18:17:16 crc kubenswrapper[5033]: E0226 18:17:16.444064 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="400f9346-1f4f-4a42-a90b-7c0099cdc293" containerName="container-00" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.444082 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="400f9346-1f4f-4a42-a90b-7c0099cdc293" containerName="container-00" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.444300 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="400f9346-1f4f-4a42-a90b-7c0099cdc293" containerName="container-00" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.445055 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-wfjwh" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.521197 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxkdl\" (UniqueName: \"kubernetes.io/projected/e6eac72e-0359-45b4-bf86-09e3e8f702b0-kube-api-access-sxkdl\") pod \"crc-debug-wfjwh\" (UID: \"e6eac72e-0359-45b4-bf86-09e3e8f702b0\") " pod="openshift-must-gather-5586k/crc-debug-wfjwh" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.521565 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6eac72e-0359-45b4-bf86-09e3e8f702b0-host\") pod \"crc-debug-wfjwh\" (UID: \"e6eac72e-0359-45b4-bf86-09e3e8f702b0\") " pod="openshift-must-gather-5586k/crc-debug-wfjwh" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.521762 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm6b2\" (UniqueName: 
\"kubernetes.io/projected/400f9346-1f4f-4a42-a90b-7c0099cdc293-kube-api-access-xm6b2\") on node \"crc\" DevicePath \"\"" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.623135 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6eac72e-0359-45b4-bf86-09e3e8f702b0-host\") pod \"crc-debug-wfjwh\" (UID: \"e6eac72e-0359-45b4-bf86-09e3e8f702b0\") " pod="openshift-must-gather-5586k/crc-debug-wfjwh" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.623321 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6eac72e-0359-45b4-bf86-09e3e8f702b0-host\") pod \"crc-debug-wfjwh\" (UID: \"e6eac72e-0359-45b4-bf86-09e3e8f702b0\") " pod="openshift-must-gather-5586k/crc-debug-wfjwh" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.623407 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxkdl\" (UniqueName: \"kubernetes.io/projected/e6eac72e-0359-45b4-bf86-09e3e8f702b0-kube-api-access-sxkdl\") pod \"crc-debug-wfjwh\" (UID: \"e6eac72e-0359-45b4-bf86-09e3e8f702b0\") " pod="openshift-must-gather-5586k/crc-debug-wfjwh" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.638811 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxkdl\" (UniqueName: \"kubernetes.io/projected/e6eac72e-0359-45b4-bf86-09e3e8f702b0-kube-api-access-sxkdl\") pod \"crc-debug-wfjwh\" (UID: \"e6eac72e-0359-45b4-bf86-09e3e8f702b0\") " pod="openshift-must-gather-5586k/crc-debug-wfjwh" Feb 26 18:17:16 crc kubenswrapper[5033]: I0226 18:17:16.786024 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-wfjwh" Feb 26 18:17:16 crc kubenswrapper[5033]: W0226 18:17:16.831080 5033 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6eac72e_0359_45b4_bf86_09e3e8f702b0.slice/crio-ac33a4c7a95c5140c9ff4eed9dd8602f6ea024cae16963f3e522b2ea76ce5aa3 WatchSource:0}: Error finding container ac33a4c7a95c5140c9ff4eed9dd8602f6ea024cae16963f3e522b2ea76ce5aa3: Status 404 returned error can't find the container with id ac33a4c7a95c5140c9ff4eed9dd8602f6ea024cae16963f3e522b2ea76ce5aa3 Feb 26 18:17:17 crc kubenswrapper[5033]: I0226 18:17:17.136295 5033 scope.go:117] "RemoveContainer" containerID="f596d0675c57e2a76cf3a1b96da531d3aae780a978877aae41bbc0924c24065e" Feb 26 18:17:17 crc kubenswrapper[5033]: I0226 18:17:17.136323 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-nd4xh" Feb 26 18:17:17 crc kubenswrapper[5033]: I0226 18:17:17.138034 5033 generic.go:334] "Generic (PLEG): container finished" podID="e6eac72e-0359-45b4-bf86-09e3e8f702b0" containerID="cd2f9e90907e2f6f71b67d913cccf3ebcdac186cfd1ebb26fc1f3cf62b2a3c4d" exitCode=0 Feb 26 18:17:17 crc kubenswrapper[5033]: I0226 18:17:17.138072 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5586k/crc-debug-wfjwh" event={"ID":"e6eac72e-0359-45b4-bf86-09e3e8f702b0","Type":"ContainerDied","Data":"cd2f9e90907e2f6f71b67d913cccf3ebcdac186cfd1ebb26fc1f3cf62b2a3c4d"} Feb 26 18:17:17 crc kubenswrapper[5033]: I0226 18:17:17.138125 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5586k/crc-debug-wfjwh" event={"ID":"e6eac72e-0359-45b4-bf86-09e3e8f702b0","Type":"ContainerStarted","Data":"ac33a4c7a95c5140c9ff4eed9dd8602f6ea024cae16963f3e522b2ea76ce5aa3"} Feb 26 18:17:17 crc kubenswrapper[5033]: I0226 18:17:17.184902 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-5586k/crc-debug-wfjwh"] Feb 26 18:17:17 crc kubenswrapper[5033]: I0226 18:17:17.200061 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5586k/crc-debug-wfjwh"] Feb 26 18:17:17 crc kubenswrapper[5033]: I0226 18:17:17.640546 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="400f9346-1f4f-4a42-a90b-7c0099cdc293" path="/var/lib/kubelet/pods/400f9346-1f4f-4a42-a90b-7c0099cdc293/volumes" Feb 26 18:17:18 crc kubenswrapper[5033]: I0226 18:17:18.269737 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-wfjwh" Feb 26 18:17:18 crc kubenswrapper[5033]: I0226 18:17:18.459136 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxkdl\" (UniqueName: \"kubernetes.io/projected/e6eac72e-0359-45b4-bf86-09e3e8f702b0-kube-api-access-sxkdl\") pod \"e6eac72e-0359-45b4-bf86-09e3e8f702b0\" (UID: \"e6eac72e-0359-45b4-bf86-09e3e8f702b0\") " Feb 26 18:17:18 crc kubenswrapper[5033]: I0226 18:17:18.459590 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6eac72e-0359-45b4-bf86-09e3e8f702b0-host\") pod \"e6eac72e-0359-45b4-bf86-09e3e8f702b0\" (UID: \"e6eac72e-0359-45b4-bf86-09e3e8f702b0\") " Feb 26 18:17:18 crc kubenswrapper[5033]: I0226 18:17:18.459690 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6eac72e-0359-45b4-bf86-09e3e8f702b0-host" (OuterVolumeSpecName: "host") pod "e6eac72e-0359-45b4-bf86-09e3e8f702b0" (UID: "e6eac72e-0359-45b4-bf86-09e3e8f702b0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 18:17:18 crc kubenswrapper[5033]: I0226 18:17:18.460112 5033 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6eac72e-0359-45b4-bf86-09e3e8f702b0-host\") on node \"crc\" DevicePath \"\"" Feb 26 18:17:18 crc kubenswrapper[5033]: I0226 18:17:18.466524 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6eac72e-0359-45b4-bf86-09e3e8f702b0-kube-api-access-sxkdl" (OuterVolumeSpecName: "kube-api-access-sxkdl") pod "e6eac72e-0359-45b4-bf86-09e3e8f702b0" (UID: "e6eac72e-0359-45b4-bf86-09e3e8f702b0"). InnerVolumeSpecName "kube-api-access-sxkdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:17:18 crc kubenswrapper[5033]: I0226 18:17:18.562351 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxkdl\" (UniqueName: \"kubernetes.io/projected/e6eac72e-0359-45b4-bf86-09e3e8f702b0-kube-api-access-sxkdl\") on node \"crc\" DevicePath \"\"" Feb 26 18:17:19 crc kubenswrapper[5033]: I0226 18:17:19.169562 5033 scope.go:117] "RemoveContainer" containerID="cd2f9e90907e2f6f71b67d913cccf3ebcdac186cfd1ebb26fc1f3cf62b2a3c4d" Feb 26 18:17:19 crc kubenswrapper[5033]: I0226 18:17:19.169629 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5586k/crc-debug-wfjwh" Feb 26 18:17:19 crc kubenswrapper[5033]: I0226 18:17:19.643082 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6eac72e-0359-45b4-bf86-09e3e8f702b0" path="/var/lib/kubelet/pods/e6eac72e-0359-45b4-bf86-09e3e8f702b0/volumes" Feb 26 18:17:22 crc kubenswrapper[5033]: I0226 18:17:22.628845 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:17:22 crc kubenswrapper[5033]: E0226 18:17:22.629411 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:17:33 crc kubenswrapper[5033]: I0226 18:17:33.635197 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:17:33 crc kubenswrapper[5033]: E0226 18:17:33.636382 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:17:47 crc kubenswrapper[5033]: I0226 18:17:47.628760 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:17:47 crc kubenswrapper[5033]: E0226 18:17:47.629732 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:17:58 crc kubenswrapper[5033]: I0226 18:17:58.628105 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:17:58 crc kubenswrapper[5033]: E0226 18:17:58.629073 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.141153 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535498-sb2nd"] Feb 26 18:18:00 crc kubenswrapper[5033]: E0226 18:18:00.142050 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6eac72e-0359-45b4-bf86-09e3e8f702b0" containerName="container-00" Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.142066 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6eac72e-0359-45b4-bf86-09e3e8f702b0" containerName="container-00" Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.142293 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6eac72e-0359-45b4-bf86-09e3e8f702b0" containerName="container-00" Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.143239 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535498-sb2nd" Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.146130 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.146448 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.147520 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.154692 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535498-sb2nd"] Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.233615 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-942j5\" (UniqueName: \"kubernetes.io/projected/73909f91-3ba0-44a3-90f3-1ab4e22b9ae0-kube-api-access-942j5\") pod \"auto-csr-approver-29535498-sb2nd\" (UID: \"73909f91-3ba0-44a3-90f3-1ab4e22b9ae0\") " pod="openshift-infra/auto-csr-approver-29535498-sb2nd" Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.335543 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-942j5\" (UniqueName: \"kubernetes.io/projected/73909f91-3ba0-44a3-90f3-1ab4e22b9ae0-kube-api-access-942j5\") pod \"auto-csr-approver-29535498-sb2nd\" (UID: \"73909f91-3ba0-44a3-90f3-1ab4e22b9ae0\") " pod="openshift-infra/auto-csr-approver-29535498-sb2nd" Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.358174 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-942j5\" (UniqueName: \"kubernetes.io/projected/73909f91-3ba0-44a3-90f3-1ab4e22b9ae0-kube-api-access-942j5\") pod \"auto-csr-approver-29535498-sb2nd\" (UID: \"73909f91-3ba0-44a3-90f3-1ab4e22b9ae0\") " 
pod="openshift-infra/auto-csr-approver-29535498-sb2nd" Feb 26 18:18:00 crc kubenswrapper[5033]: I0226 18:18:00.463447 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535498-sb2nd" Feb 26 18:18:01 crc kubenswrapper[5033]: I0226 18:18:01.094057 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535498-sb2nd"] Feb 26 18:18:01 crc kubenswrapper[5033]: I0226 18:18:01.097014 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 18:18:01 crc kubenswrapper[5033]: I0226 18:18:01.648805 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535498-sb2nd" event={"ID":"73909f91-3ba0-44a3-90f3-1ab4e22b9ae0","Type":"ContainerStarted","Data":"fbc39df662245227923788e91b96eb005624e1d43c2ce8a31dec8bc17bf6abd3"} Feb 26 18:18:04 crc kubenswrapper[5033]: I0226 18:18:04.679702 5033 generic.go:334] "Generic (PLEG): container finished" podID="73909f91-3ba0-44a3-90f3-1ab4e22b9ae0" containerID="04dd59876f32c0b896011cef3be2bb901643dddf4197765b3e63985cb839c401" exitCode=0 Feb 26 18:18:04 crc kubenswrapper[5033]: I0226 18:18:04.679755 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535498-sb2nd" event={"ID":"73909f91-3ba0-44a3-90f3-1ab4e22b9ae0","Type":"ContainerDied","Data":"04dd59876f32c0b896011cef3be2bb901643dddf4197765b3e63985cb839c401"} Feb 26 18:18:06 crc kubenswrapper[5033]: I0226 18:18:06.096214 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535498-sb2nd" Feb 26 18:18:06 crc kubenswrapper[5033]: I0226 18:18:06.265515 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-942j5\" (UniqueName: \"kubernetes.io/projected/73909f91-3ba0-44a3-90f3-1ab4e22b9ae0-kube-api-access-942j5\") pod \"73909f91-3ba0-44a3-90f3-1ab4e22b9ae0\" (UID: \"73909f91-3ba0-44a3-90f3-1ab4e22b9ae0\") " Feb 26 18:18:06 crc kubenswrapper[5033]: I0226 18:18:06.271657 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73909f91-3ba0-44a3-90f3-1ab4e22b9ae0-kube-api-access-942j5" (OuterVolumeSpecName: "kube-api-access-942j5") pod "73909f91-3ba0-44a3-90f3-1ab4e22b9ae0" (UID: "73909f91-3ba0-44a3-90f3-1ab4e22b9ae0"). InnerVolumeSpecName "kube-api-access-942j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:18:06 crc kubenswrapper[5033]: I0226 18:18:06.367897 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-942j5\" (UniqueName: \"kubernetes.io/projected/73909f91-3ba0-44a3-90f3-1ab4e22b9ae0-kube-api-access-942j5\") on node \"crc\" DevicePath \"\"" Feb 26 18:18:06 crc kubenswrapper[5033]: I0226 18:18:06.706146 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535498-sb2nd" event={"ID":"73909f91-3ba0-44a3-90f3-1ab4e22b9ae0","Type":"ContainerDied","Data":"fbc39df662245227923788e91b96eb005624e1d43c2ce8a31dec8bc17bf6abd3"} Feb 26 18:18:06 crc kubenswrapper[5033]: I0226 18:18:06.706233 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbc39df662245227923788e91b96eb005624e1d43c2ce8a31dec8bc17bf6abd3" Feb 26 18:18:06 crc kubenswrapper[5033]: I0226 18:18:06.706391 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535498-sb2nd" Feb 26 18:18:07 crc kubenswrapper[5033]: I0226 18:18:07.169700 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535492-dfp9s"] Feb 26 18:18:07 crc kubenswrapper[5033]: I0226 18:18:07.178065 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535492-dfp9s"] Feb 26 18:18:07 crc kubenswrapper[5033]: I0226 18:18:07.641623 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="896e93d2-2565-4c00-8157-31d3ff6767e1" path="/var/lib/kubelet/pods/896e93d2-2565-4c00-8157-31d3ff6767e1/volumes" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.464662 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vwjkq"] Feb 26 18:18:08 crc kubenswrapper[5033]: E0226 18:18:08.465943 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73909f91-3ba0-44a3-90f3-1ab4e22b9ae0" containerName="oc" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.465959 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="73909f91-3ba0-44a3-90f3-1ab4e22b9ae0" containerName="oc" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.466202 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="73909f91-3ba0-44a3-90f3-1ab4e22b9ae0" containerName="oc" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.467937 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.495544 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwjkq"] Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.611999 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-utilities\") pod \"redhat-marketplace-vwjkq\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.612174 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-catalog-content\") pod \"redhat-marketplace-vwjkq\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.612208 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4j72\" (UniqueName: \"kubernetes.io/projected/0ede356e-68a8-4cb5-b517-9dad5c4522cf-kube-api-access-c4j72\") pod \"redhat-marketplace-vwjkq\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.713831 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-catalog-content\") pod \"redhat-marketplace-vwjkq\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.713891 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c4j72\" (UniqueName: \"kubernetes.io/projected/0ede356e-68a8-4cb5-b517-9dad5c4522cf-kube-api-access-c4j72\") pod \"redhat-marketplace-vwjkq\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.714002 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-utilities\") pod \"redhat-marketplace-vwjkq\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.714528 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-utilities\") pod \"redhat-marketplace-vwjkq\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.714818 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-catalog-content\") pod \"redhat-marketplace-vwjkq\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.735349 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4j72\" (UniqueName: \"kubernetes.io/projected/0ede356e-68a8-4cb5-b517-9dad5c4522cf-kube-api-access-c4j72\") pod \"redhat-marketplace-vwjkq\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:08 crc kubenswrapper[5033]: I0226 18:18:08.795576 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:09 crc kubenswrapper[5033]: I0226 18:18:09.342748 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwjkq"] Feb 26 18:18:09 crc kubenswrapper[5033]: I0226 18:18:09.628944 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:18:09 crc kubenswrapper[5033]: E0226 18:18:09.629315 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:18:09 crc kubenswrapper[5033]: I0226 18:18:09.741450 5033 generic.go:334] "Generic (PLEG): container finished" podID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" containerID="4a6951882771ae546762735cba2f09cfcb267415422bd6c4523a0e5de8646492" exitCode=0 Feb 26 18:18:09 crc kubenswrapper[5033]: I0226 18:18:09.741765 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwjkq" event={"ID":"0ede356e-68a8-4cb5-b517-9dad5c4522cf","Type":"ContainerDied","Data":"4a6951882771ae546762735cba2f09cfcb267415422bd6c4523a0e5de8646492"} Feb 26 18:18:09 crc kubenswrapper[5033]: I0226 18:18:09.741795 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwjkq" event={"ID":"0ede356e-68a8-4cb5-b517-9dad5c4522cf","Type":"ContainerStarted","Data":"8a07782d319e9c1c5454a0cb41f9810196053526e3581f30c32b1a7d82ced7d6"} Feb 26 18:18:10 crc kubenswrapper[5033]: I0226 18:18:10.752199 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwjkq" 
event={"ID":"0ede356e-68a8-4cb5-b517-9dad5c4522cf","Type":"ContainerStarted","Data":"7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f"} Feb 26 18:18:11 crc kubenswrapper[5033]: I0226 18:18:11.763847 5033 generic.go:334] "Generic (PLEG): container finished" podID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" containerID="7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f" exitCode=0 Feb 26 18:18:11 crc kubenswrapper[5033]: I0226 18:18:11.763952 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwjkq" event={"ID":"0ede356e-68a8-4cb5-b517-9dad5c4522cf","Type":"ContainerDied","Data":"7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f"} Feb 26 18:18:12 crc kubenswrapper[5033]: I0226 18:18:12.774801 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwjkq" event={"ID":"0ede356e-68a8-4cb5-b517-9dad5c4522cf","Type":"ContainerStarted","Data":"9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852"} Feb 26 18:18:12 crc kubenswrapper[5033]: I0226 18:18:12.803003 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vwjkq" podStartSLOduration=2.324536425 podStartE2EDuration="4.802988537s" podCreationTimestamp="2026-02-26 18:18:08 +0000 UTC" firstStartedPulling="2026-02-26 18:18:09.743916829 +0000 UTC m=+9416.956515489" lastFinishedPulling="2026-02-26 18:18:12.222368951 +0000 UTC m=+9419.434967601" observedRunningTime="2026-02-26 18:18:12.792516114 +0000 UTC m=+9420.005114784" watchObservedRunningTime="2026-02-26 18:18:12.802988537 +0000 UTC m=+9420.015587187" Feb 26 18:18:18 crc kubenswrapper[5033]: I0226 18:18:18.795970 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:18 crc kubenswrapper[5033]: I0226 18:18:18.796495 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:18 crc kubenswrapper[5033]: I0226 18:18:18.850006 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:18 crc kubenswrapper[5033]: I0226 18:18:18.909649 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:19 crc kubenswrapper[5033]: I0226 18:18:19.096811 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwjkq"] Feb 26 18:18:20 crc kubenswrapper[5033]: I0226 18:18:20.629181 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:18:20 crc kubenswrapper[5033]: E0226 18:18:20.629529 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:18:20 crc kubenswrapper[5033]: I0226 18:18:20.858359 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vwjkq" podUID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" containerName="registry-server" containerID="cri-o://9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852" gracePeriod=2 Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.370608 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.405873 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-utilities\") pod \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.405942 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-catalog-content\") pod \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.406055 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4j72\" (UniqueName: \"kubernetes.io/projected/0ede356e-68a8-4cb5-b517-9dad5c4522cf-kube-api-access-c4j72\") pod \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\" (UID: \"0ede356e-68a8-4cb5-b517-9dad5c4522cf\") " Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.407156 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-utilities" (OuterVolumeSpecName: "utilities") pod "0ede356e-68a8-4cb5-b517-9dad5c4522cf" (UID: "0ede356e-68a8-4cb5-b517-9dad5c4522cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.411318 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ede356e-68a8-4cb5-b517-9dad5c4522cf-kube-api-access-c4j72" (OuterVolumeSpecName: "kube-api-access-c4j72") pod "0ede356e-68a8-4cb5-b517-9dad5c4522cf" (UID: "0ede356e-68a8-4cb5-b517-9dad5c4522cf"). InnerVolumeSpecName "kube-api-access-c4j72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.509105 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4j72\" (UniqueName: \"kubernetes.io/projected/0ede356e-68a8-4cb5-b517-9dad5c4522cf-kube-api-access-c4j72\") on node \"crc\" DevicePath \"\"" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.509152 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.869756 5033 generic.go:334] "Generic (PLEG): container finished" podID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" containerID="9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852" exitCode=0 Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.869811 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwjkq" event={"ID":"0ede356e-68a8-4cb5-b517-9dad5c4522cf","Type":"ContainerDied","Data":"9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852"} Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.869840 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwjkq" event={"ID":"0ede356e-68a8-4cb5-b517-9dad5c4522cf","Type":"ContainerDied","Data":"8a07782d319e9c1c5454a0cb41f9810196053526e3581f30c32b1a7d82ced7d6"} Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.869862 5033 scope.go:117] "RemoveContainer" containerID="9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.869920 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwjkq" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.891434 5033 scope.go:117] "RemoveContainer" containerID="7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.916853 5033 scope.go:117] "RemoveContainer" containerID="4a6951882771ae546762735cba2f09cfcb267415422bd6c4523a0e5de8646492" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.979429 5033 scope.go:117] "RemoveContainer" containerID="9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852" Feb 26 18:18:21 crc kubenswrapper[5033]: E0226 18:18:21.980307 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852\": container with ID starting with 9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852 not found: ID does not exist" containerID="9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.980382 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852"} err="failed to get container status \"9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852\": rpc error: code = NotFound desc = could not find container \"9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852\": container with ID starting with 9a50d544ed37300eeb3c7eeb1da5d02021a67565274a562b04226a2c8bb7e852 not found: ID does not exist" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.980402 5033 scope.go:117] "RemoveContainer" containerID="7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f" Feb 26 18:18:21 crc kubenswrapper[5033]: E0226 18:18:21.980780 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f\": container with ID starting with 7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f not found: ID does not exist" containerID="7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.980827 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f"} err="failed to get container status \"7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f\": rpc error: code = NotFound desc = could not find container \"7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f\": container with ID starting with 7d3c19e07acfb70aea78af98707a48ced702f2e032530b3b0ad17641705c0f7f not found: ID does not exist" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.980852 5033 scope.go:117] "RemoveContainer" containerID="4a6951882771ae546762735cba2f09cfcb267415422bd6c4523a0e5de8646492" Feb 26 18:18:21 crc kubenswrapper[5033]: E0226 18:18:21.981172 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6951882771ae546762735cba2f09cfcb267415422bd6c4523a0e5de8646492\": container with ID starting with 4a6951882771ae546762735cba2f09cfcb267415422bd6c4523a0e5de8646492 not found: ID does not exist" containerID="4a6951882771ae546762735cba2f09cfcb267415422bd6c4523a0e5de8646492" Feb 26 18:18:21 crc kubenswrapper[5033]: I0226 18:18:21.981213 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6951882771ae546762735cba2f09cfcb267415422bd6c4523a0e5de8646492"} err="failed to get container status \"4a6951882771ae546762735cba2f09cfcb267415422bd6c4523a0e5de8646492\": rpc error: code = NotFound desc = could not find container 
\"4a6951882771ae546762735cba2f09cfcb267415422bd6c4523a0e5de8646492\": container with ID starting with 4a6951882771ae546762735cba2f09cfcb267415422bd6c4523a0e5de8646492 not found: ID does not exist" Feb 26 18:18:22 crc kubenswrapper[5033]: I0226 18:18:22.248767 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ede356e-68a8-4cb5-b517-9dad5c4522cf" (UID: "0ede356e-68a8-4cb5-b517-9dad5c4522cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:18:22 crc kubenswrapper[5033]: I0226 18:18:22.324880 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ede356e-68a8-4cb5-b517-9dad5c4522cf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 18:18:22 crc kubenswrapper[5033]: I0226 18:18:22.506684 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwjkq"] Feb 26 18:18:22 crc kubenswrapper[5033]: I0226 18:18:22.518823 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwjkq"] Feb 26 18:18:23 crc kubenswrapper[5033]: I0226 18:18:23.641616 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" path="/var/lib/kubelet/pods/0ede356e-68a8-4cb5-b517-9dad5c4522cf/volumes" Feb 26 18:18:31 crc kubenswrapper[5033]: I0226 18:18:31.628557 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:18:31 crc kubenswrapper[5033]: E0226 18:18:31.629792 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:18:42 crc kubenswrapper[5033]: I0226 18:18:42.628796 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:18:42 crc kubenswrapper[5033]: E0226 18:18:42.629445 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:18:46 crc kubenswrapper[5033]: I0226 18:18:46.702873 5033 scope.go:117] "RemoveContainer" containerID="8a5291226b6f9dc16e881b91ecc5f8d88330f982bc3943ca44012a7d0b91dd8b" Feb 26 18:18:53 crc kubenswrapper[5033]: I0226 18:18:53.634792 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:18:53 crc kubenswrapper[5033]: E0226 18:18:53.636634 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:19:07 crc kubenswrapper[5033]: I0226 18:19:07.628807 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:19:07 crc kubenswrapper[5033]: E0226 18:19:07.629450 5033 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:19:21 crc kubenswrapper[5033]: I0226 18:19:21.629179 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:19:23 crc kubenswrapper[5033]: I0226 18:19:23.489920 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"10fa683236b128009f2a223b9e7597e835171fa14a18ab9a78920652b702878a"} Feb 26 18:19:33 crc kubenswrapper[5033]: I0226 18:19:33.772795 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="a20a94a0-9589-4c0e-9305-72e94939e526" containerName="galera" probeResult="failure" output="command timed out" Feb 26 18:19:33 crc kubenswrapper[5033]: I0226 18:19:33.773381 5033 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="a20a94a0-9589-4c0e-9305-72e94939e526" containerName="galera" probeResult="failure" output="command timed out" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.169779 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6rtt9"] Feb 26 18:19:49 crc kubenswrapper[5033]: E0226 18:19:49.170892 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" containerName="extract-utilities" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.170909 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" containerName="extract-utilities" Feb 26 18:19:49 crc kubenswrapper[5033]: E0226 18:19:49.170940 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" containerName="registry-server" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.170948 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" containerName="registry-server" Feb 26 18:19:49 crc kubenswrapper[5033]: E0226 18:19:49.170966 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" containerName="extract-content" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.170974 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" containerName="extract-content" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.171300 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ede356e-68a8-4cb5-b517-9dad5c4522cf" containerName="registry-server" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.173011 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.183122 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6rtt9"] Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.203945 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-utilities\") pod \"community-operators-6rtt9\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.204162 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-catalog-content\") pod \"community-operators-6rtt9\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.204394 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt2d2\" (UniqueName: \"kubernetes.io/projected/5c892ffc-57df-4eec-a7be-b0807233db82-kube-api-access-gt2d2\") pod \"community-operators-6rtt9\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.306605 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-utilities\") pod \"community-operators-6rtt9\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.306725 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-catalog-content\") pod \"community-operators-6rtt9\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.306885 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt2d2\" (UniqueName: \"kubernetes.io/projected/5c892ffc-57df-4eec-a7be-b0807233db82-kube-api-access-gt2d2\") pod \"community-operators-6rtt9\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.307488 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-utilities\") pod \"community-operators-6rtt9\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.307529 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-catalog-content\") pod \"community-operators-6rtt9\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.327853 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt2d2\" (UniqueName: \"kubernetes.io/projected/5c892ffc-57df-4eec-a7be-b0807233db82-kube-api-access-gt2d2\") pod \"community-operators-6rtt9\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:49 crc kubenswrapper[5033]: I0226 18:19:49.503045 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:50 crc kubenswrapper[5033]: I0226 18:19:50.105252 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6rtt9"] Feb 26 18:19:50 crc kubenswrapper[5033]: I0226 18:19:50.786625 5033 generic.go:334] "Generic (PLEG): container finished" podID="5c892ffc-57df-4eec-a7be-b0807233db82" containerID="a9a47f1e6a55952ad8792678a706b95545230b15ae3e9303364cf5ecba80dd01" exitCode=0 Feb 26 18:19:50 crc kubenswrapper[5033]: I0226 18:19:50.786696 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rtt9" event={"ID":"5c892ffc-57df-4eec-a7be-b0807233db82","Type":"ContainerDied","Data":"a9a47f1e6a55952ad8792678a706b95545230b15ae3e9303364cf5ecba80dd01"} Feb 26 18:19:50 crc kubenswrapper[5033]: I0226 18:19:50.786907 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rtt9" event={"ID":"5c892ffc-57df-4eec-a7be-b0807233db82","Type":"ContainerStarted","Data":"66756e1c0a93e84ebd30d34e986d72da6975c12445117d0934b28a32a30dcbae"} Feb 26 18:19:51 crc kubenswrapper[5033]: I0226 18:19:51.801381 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rtt9" event={"ID":"5c892ffc-57df-4eec-a7be-b0807233db82","Type":"ContainerStarted","Data":"74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde"} Feb 26 18:19:53 crc kubenswrapper[5033]: I0226 18:19:53.821187 5033 generic.go:334] "Generic (PLEG): container finished" podID="5c892ffc-57df-4eec-a7be-b0807233db82" containerID="74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde" exitCode=0 Feb 26 18:19:53 crc kubenswrapper[5033]: I0226 18:19:53.821249 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rtt9" 
event={"ID":"5c892ffc-57df-4eec-a7be-b0807233db82","Type":"ContainerDied","Data":"74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde"} Feb 26 18:19:54 crc kubenswrapper[5033]: I0226 18:19:54.865806 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rtt9" event={"ID":"5c892ffc-57df-4eec-a7be-b0807233db82","Type":"ContainerStarted","Data":"1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36"} Feb 26 18:19:54 crc kubenswrapper[5033]: I0226 18:19:54.890303 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6rtt9" podStartSLOduration=2.385304192 podStartE2EDuration="5.890284612s" podCreationTimestamp="2026-02-26 18:19:49 +0000 UTC" firstStartedPulling="2026-02-26 18:19:50.788783582 +0000 UTC m=+9518.001382272" lastFinishedPulling="2026-02-26 18:19:54.293764042 +0000 UTC m=+9521.506362692" observedRunningTime="2026-02-26 18:19:54.882622977 +0000 UTC m=+9522.095221637" watchObservedRunningTime="2026-02-26 18:19:54.890284612 +0000 UTC m=+9522.102883262" Feb 26 18:19:59 crc kubenswrapper[5033]: I0226 18:19:59.503974 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:59 crc kubenswrapper[5033]: I0226 18:19:59.504553 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:59 crc kubenswrapper[5033]: I0226 18:19:59.568371 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:19:59 crc kubenswrapper[5033]: I0226 18:19:59.988536 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:20:00 crc kubenswrapper[5033]: I0226 18:20:00.052624 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-6rtt9"] Feb 26 18:20:00 crc kubenswrapper[5033]: I0226 18:20:00.146578 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535500-55fxh"] Feb 26 18:20:00 crc kubenswrapper[5033]: I0226 18:20:00.148060 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535500-55fxh" Feb 26 18:20:00 crc kubenswrapper[5033]: I0226 18:20:00.150059 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:20:00 crc kubenswrapper[5033]: I0226 18:20:00.150216 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:20:00 crc kubenswrapper[5033]: I0226 18:20:00.150216 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:20:00 crc kubenswrapper[5033]: I0226 18:20:00.160416 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535500-55fxh"] Feb 26 18:20:00 crc kubenswrapper[5033]: I0226 18:20:00.166015 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsh2v\" (UniqueName: \"kubernetes.io/projected/91042adc-dcc8-411c-9e7f-4679e27c6621-kube-api-access-gsh2v\") pod \"auto-csr-approver-29535500-55fxh\" (UID: \"91042adc-dcc8-411c-9e7f-4679e27c6621\") " pod="openshift-infra/auto-csr-approver-29535500-55fxh" Feb 26 18:20:00 crc kubenswrapper[5033]: I0226 18:20:00.267441 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsh2v\" (UniqueName: \"kubernetes.io/projected/91042adc-dcc8-411c-9e7f-4679e27c6621-kube-api-access-gsh2v\") pod \"auto-csr-approver-29535500-55fxh\" (UID: \"91042adc-dcc8-411c-9e7f-4679e27c6621\") " pod="openshift-infra/auto-csr-approver-29535500-55fxh" Feb 26 18:20:00 crc 
kubenswrapper[5033]: I0226 18:20:00.288232 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsh2v\" (UniqueName: \"kubernetes.io/projected/91042adc-dcc8-411c-9e7f-4679e27c6621-kube-api-access-gsh2v\") pod \"auto-csr-approver-29535500-55fxh\" (UID: \"91042adc-dcc8-411c-9e7f-4679e27c6621\") " pod="openshift-infra/auto-csr-approver-29535500-55fxh" Feb 26 18:20:00 crc kubenswrapper[5033]: I0226 18:20:00.466571 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535500-55fxh" Feb 26 18:20:00 crc kubenswrapper[5033]: I0226 18:20:00.969541 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535500-55fxh"] Feb 26 18:20:01 crc kubenswrapper[5033]: I0226 18:20:01.944948 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535500-55fxh" event={"ID":"91042adc-dcc8-411c-9e7f-4679e27c6621","Type":"ContainerStarted","Data":"d546efeab9b72ef6c9d001e02382f0a3a6f34ef298053dfc77c0a3ea65c92850"} Feb 26 18:20:01 crc kubenswrapper[5033]: I0226 18:20:01.945121 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6rtt9" podUID="5c892ffc-57df-4eec-a7be-b0807233db82" containerName="registry-server" containerID="cri-o://1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36" gracePeriod=2 Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.512339 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.536304 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-catalog-content\") pod \"5c892ffc-57df-4eec-a7be-b0807233db82\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.536443 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt2d2\" (UniqueName: \"kubernetes.io/projected/5c892ffc-57df-4eec-a7be-b0807233db82-kube-api-access-gt2d2\") pod \"5c892ffc-57df-4eec-a7be-b0807233db82\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.536479 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-utilities\") pod \"5c892ffc-57df-4eec-a7be-b0807233db82\" (UID: \"5c892ffc-57df-4eec-a7be-b0807233db82\") " Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.537861 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-utilities" (OuterVolumeSpecName: "utilities") pod "5c892ffc-57df-4eec-a7be-b0807233db82" (UID: "5c892ffc-57df-4eec-a7be-b0807233db82"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.543194 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c892ffc-57df-4eec-a7be-b0807233db82-kube-api-access-gt2d2" (OuterVolumeSpecName: "kube-api-access-gt2d2") pod "5c892ffc-57df-4eec-a7be-b0807233db82" (UID: "5c892ffc-57df-4eec-a7be-b0807233db82"). InnerVolumeSpecName "kube-api-access-gt2d2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.599790 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c892ffc-57df-4eec-a7be-b0807233db82" (UID: "5c892ffc-57df-4eec-a7be-b0807233db82"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.638379 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.638631 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt2d2\" (UniqueName: \"kubernetes.io/projected/5c892ffc-57df-4eec-a7be-b0807233db82-kube-api-access-gt2d2\") on node \"crc\" DevicePath \"\"" Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.638716 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c892ffc-57df-4eec-a7be-b0807233db82-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.961624 5033 generic.go:334] "Generic (PLEG): container finished" podID="91042adc-dcc8-411c-9e7f-4679e27c6621" containerID="86289b13dfa2c54c269b09bb794a92cb25ca0f30fc62e4abd853b8786acffd8a" exitCode=0 Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.961713 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535500-55fxh" event={"ID":"91042adc-dcc8-411c-9e7f-4679e27c6621","Type":"ContainerDied","Data":"86289b13dfa2c54c269b09bb794a92cb25ca0f30fc62e4abd853b8786acffd8a"} Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.966758 5033 generic.go:334] "Generic (PLEG): container 
finished" podID="5c892ffc-57df-4eec-a7be-b0807233db82" containerID="1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36" exitCode=0 Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.966814 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rtt9" event={"ID":"5c892ffc-57df-4eec-a7be-b0807233db82","Type":"ContainerDied","Data":"1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36"} Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.966857 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rtt9" event={"ID":"5c892ffc-57df-4eec-a7be-b0807233db82","Type":"ContainerDied","Data":"66756e1c0a93e84ebd30d34e986d72da6975c12445117d0934b28a32a30dcbae"} Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.966875 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6rtt9" Feb 26 18:20:02 crc kubenswrapper[5033]: I0226 18:20:02.966894 5033 scope.go:117] "RemoveContainer" containerID="1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36" Feb 26 18:20:03 crc kubenswrapper[5033]: I0226 18:20:03.002387 5033 scope.go:117] "RemoveContainer" containerID="74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde" Feb 26 18:20:03 crc kubenswrapper[5033]: I0226 18:20:03.017394 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6rtt9"] Feb 26 18:20:03 crc kubenswrapper[5033]: I0226 18:20:03.028526 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6rtt9"] Feb 26 18:20:03 crc kubenswrapper[5033]: I0226 18:20:03.067587 5033 scope.go:117] "RemoveContainer" containerID="a9a47f1e6a55952ad8792678a706b95545230b15ae3e9303364cf5ecba80dd01" Feb 26 18:20:03 crc kubenswrapper[5033]: I0226 18:20:03.126663 5033 scope.go:117] "RemoveContainer" 
containerID="1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36" Feb 26 18:20:03 crc kubenswrapper[5033]: E0226 18:20:03.127279 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36\": container with ID starting with 1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36 not found: ID does not exist" containerID="1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36" Feb 26 18:20:03 crc kubenswrapper[5033]: I0226 18:20:03.127379 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36"} err="failed to get container status \"1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36\": rpc error: code = NotFound desc = could not find container \"1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36\": container with ID starting with 1123cb6d0bf9a7e14671874548e6f0097b6291652a0836f6b86a9f6b404e6e36 not found: ID does not exist" Feb 26 18:20:03 crc kubenswrapper[5033]: I0226 18:20:03.127560 5033 scope.go:117] "RemoveContainer" containerID="74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde" Feb 26 18:20:03 crc kubenswrapper[5033]: E0226 18:20:03.127987 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde\": container with ID starting with 74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde not found: ID does not exist" containerID="74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde" Feb 26 18:20:03 crc kubenswrapper[5033]: I0226 18:20:03.128085 5033 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde"} err="failed to get container status \"74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde\": rpc error: code = NotFound desc = could not find container \"74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde\": container with ID starting with 74a025bf94154e7cb50643e44253b0ac692e468020a4f4b7d998038b3fc65fde not found: ID does not exist" Feb 26 18:20:03 crc kubenswrapper[5033]: I0226 18:20:03.128155 5033 scope.go:117] "RemoveContainer" containerID="a9a47f1e6a55952ad8792678a706b95545230b15ae3e9303364cf5ecba80dd01" Feb 26 18:20:03 crc kubenswrapper[5033]: E0226 18:20:03.128628 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a47f1e6a55952ad8792678a706b95545230b15ae3e9303364cf5ecba80dd01\": container with ID starting with a9a47f1e6a55952ad8792678a706b95545230b15ae3e9303364cf5ecba80dd01 not found: ID does not exist" containerID="a9a47f1e6a55952ad8792678a706b95545230b15ae3e9303364cf5ecba80dd01" Feb 26 18:20:03 crc kubenswrapper[5033]: I0226 18:20:03.128681 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a47f1e6a55952ad8792678a706b95545230b15ae3e9303364cf5ecba80dd01"} err="failed to get container status \"a9a47f1e6a55952ad8792678a706b95545230b15ae3e9303364cf5ecba80dd01\": rpc error: code = NotFound desc = could not find container \"a9a47f1e6a55952ad8792678a706b95545230b15ae3e9303364cf5ecba80dd01\": container with ID starting with a9a47f1e6a55952ad8792678a706b95545230b15ae3e9303364cf5ecba80dd01 not found: ID does not exist" Feb 26 18:20:03 crc kubenswrapper[5033]: I0226 18:20:03.641770 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c892ffc-57df-4eec-a7be-b0807233db82" path="/var/lib/kubelet/pods/5c892ffc-57df-4eec-a7be-b0807233db82/volumes" Feb 26 18:20:04 crc kubenswrapper[5033]: I0226 
18:20:04.363096 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535500-55fxh" Feb 26 18:20:04 crc kubenswrapper[5033]: I0226 18:20:04.478861 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsh2v\" (UniqueName: \"kubernetes.io/projected/91042adc-dcc8-411c-9e7f-4679e27c6621-kube-api-access-gsh2v\") pod \"91042adc-dcc8-411c-9e7f-4679e27c6621\" (UID: \"91042adc-dcc8-411c-9e7f-4679e27c6621\") " Feb 26 18:20:04 crc kubenswrapper[5033]: I0226 18:20:04.486373 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91042adc-dcc8-411c-9e7f-4679e27c6621-kube-api-access-gsh2v" (OuterVolumeSpecName: "kube-api-access-gsh2v") pod "91042adc-dcc8-411c-9e7f-4679e27c6621" (UID: "91042adc-dcc8-411c-9e7f-4679e27c6621"). InnerVolumeSpecName "kube-api-access-gsh2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:20:04 crc kubenswrapper[5033]: I0226 18:20:04.582295 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsh2v\" (UniqueName: \"kubernetes.io/projected/91042adc-dcc8-411c-9e7f-4679e27c6621-kube-api-access-gsh2v\") on node \"crc\" DevicePath \"\"" Feb 26 18:20:04 crc kubenswrapper[5033]: I0226 18:20:04.999012 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535500-55fxh" event={"ID":"91042adc-dcc8-411c-9e7f-4679e27c6621","Type":"ContainerDied","Data":"d546efeab9b72ef6c9d001e02382f0a3a6f34ef298053dfc77c0a3ea65c92850"} Feb 26 18:20:04 crc kubenswrapper[5033]: I0226 18:20:04.999083 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d546efeab9b72ef6c9d001e02382f0a3a6f34ef298053dfc77c0a3ea65c92850" Feb 26 18:20:05 crc kubenswrapper[5033]: I0226 18:20:04.999097 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535500-55fxh" Feb 26 18:20:05 crc kubenswrapper[5033]: I0226 18:20:05.446210 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535494-kxw86"] Feb 26 18:20:05 crc kubenswrapper[5033]: I0226 18:20:05.458548 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535494-kxw86"] Feb 26 18:20:05 crc kubenswrapper[5033]: I0226 18:20:05.640283 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c91e220-3586-4685-9eef-c9c8405a00ad" path="/var/lib/kubelet/pods/0c91e220-3586-4685-9eef-c9c8405a00ad/volumes" Feb 26 18:20:46 crc kubenswrapper[5033]: I0226 18:20:46.850907 5033 scope.go:117] "RemoveContainer" containerID="50b171ed59f25034ea91a5eae0a9b04ec5de289cdce759dd22e2da2a73f48c39" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.191733 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fxnwv"] Feb 26 18:20:56 crc kubenswrapper[5033]: E0226 18:20:56.192882 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c892ffc-57df-4eec-a7be-b0807233db82" containerName="extract-utilities" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.192901 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c892ffc-57df-4eec-a7be-b0807233db82" containerName="extract-utilities" Feb 26 18:20:56 crc kubenswrapper[5033]: E0226 18:20:56.192948 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c892ffc-57df-4eec-a7be-b0807233db82" containerName="extract-content" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.192958 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c892ffc-57df-4eec-a7be-b0807233db82" containerName="extract-content" Feb 26 18:20:56 crc kubenswrapper[5033]: E0226 18:20:56.192974 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c892ffc-57df-4eec-a7be-b0807233db82" 
containerName="registry-server" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.192984 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c892ffc-57df-4eec-a7be-b0807233db82" containerName="registry-server" Feb 26 18:20:56 crc kubenswrapper[5033]: E0226 18:20:56.193003 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91042adc-dcc8-411c-9e7f-4679e27c6621" containerName="oc" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.193014 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="91042adc-dcc8-411c-9e7f-4679e27c6621" containerName="oc" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.193335 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c892ffc-57df-4eec-a7be-b0807233db82" containerName="registry-server" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.193369 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="91042adc-dcc8-411c-9e7f-4679e27c6621" containerName="oc" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.195571 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.200987 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxnwv"] Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.222144 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-utilities\") pod \"certified-operators-fxnwv\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.222203 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gdqh\" (UniqueName: \"kubernetes.io/projected/eca58ccd-3cba-4917-bfe5-779918dee1bc-kube-api-access-5gdqh\") pod \"certified-operators-fxnwv\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.222318 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-catalog-content\") pod \"certified-operators-fxnwv\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.323968 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gdqh\" (UniqueName: \"kubernetes.io/projected/eca58ccd-3cba-4917-bfe5-779918dee1bc-kube-api-access-5gdqh\") pod \"certified-operators-fxnwv\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.324449 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-catalog-content\") pod \"certified-operators-fxnwv\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.324919 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-utilities\") pod \"certified-operators-fxnwv\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.324941 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-catalog-content\") pod \"certified-operators-fxnwv\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.325390 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-utilities\") pod \"certified-operators-fxnwv\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.344088 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gdqh\" (UniqueName: \"kubernetes.io/projected/eca58ccd-3cba-4917-bfe5-779918dee1bc-kube-api-access-5gdqh\") pod \"certified-operators-fxnwv\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:20:56 crc kubenswrapper[5033]: I0226 18:20:56.560368 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:20:57 crc kubenswrapper[5033]: I0226 18:20:57.189907 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxnwv"] Feb 26 18:20:57 crc kubenswrapper[5033]: I0226 18:20:57.573745 5033 generic.go:334] "Generic (PLEG): container finished" podID="eca58ccd-3cba-4917-bfe5-779918dee1bc" containerID="e11ca83a5509294e7d5940e2c9f4de44c691e84a446da3eaaf1214fa9eb38e00" exitCode=0 Feb 26 18:20:57 crc kubenswrapper[5033]: I0226 18:20:57.574025 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxnwv" event={"ID":"eca58ccd-3cba-4917-bfe5-779918dee1bc","Type":"ContainerDied","Data":"e11ca83a5509294e7d5940e2c9f4de44c691e84a446da3eaaf1214fa9eb38e00"} Feb 26 18:20:57 crc kubenswrapper[5033]: I0226 18:20:57.574060 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxnwv" event={"ID":"eca58ccd-3cba-4917-bfe5-779918dee1bc","Type":"ContainerStarted","Data":"f56df22bfd40b4169c7670051c8c36b9275a5ea709245e408ea532ae8f38c2a9"} Feb 26 18:20:59 crc kubenswrapper[5033]: I0226 18:20:59.612439 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxnwv" event={"ID":"eca58ccd-3cba-4917-bfe5-779918dee1bc","Type":"ContainerStarted","Data":"fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6"} Feb 26 18:21:00 crc kubenswrapper[5033]: I0226 18:21:00.625200 5033 generic.go:334] "Generic (PLEG): container finished" podID="eca58ccd-3cba-4917-bfe5-779918dee1bc" containerID="fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6" exitCode=0 Feb 26 18:21:00 crc kubenswrapper[5033]: I0226 18:21:00.625248 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxnwv" 
event={"ID":"eca58ccd-3cba-4917-bfe5-779918dee1bc","Type":"ContainerDied","Data":"fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6"} Feb 26 18:21:02 crc kubenswrapper[5033]: I0226 18:21:02.655332 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxnwv" event={"ID":"eca58ccd-3cba-4917-bfe5-779918dee1bc","Type":"ContainerStarted","Data":"fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357"} Feb 26 18:21:02 crc kubenswrapper[5033]: I0226 18:21:02.684122 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fxnwv" podStartSLOduration=2.7853409879999997 podStartE2EDuration="6.684093116s" podCreationTimestamp="2026-02-26 18:20:56 +0000 UTC" firstStartedPulling="2026-02-26 18:20:57.576814019 +0000 UTC m=+9584.789412669" lastFinishedPulling="2026-02-26 18:21:01.475566137 +0000 UTC m=+9588.688164797" observedRunningTime="2026-02-26 18:21:02.675486706 +0000 UTC m=+9589.888085356" watchObservedRunningTime="2026-02-26 18:21:02.684093116 +0000 UTC m=+9589.896691766" Feb 26 18:21:06 crc kubenswrapper[5033]: I0226 18:21:06.561691 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:21:06 crc kubenswrapper[5033]: I0226 18:21:06.562296 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:21:06 crc kubenswrapper[5033]: I0226 18:21:06.611841 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:21:06 crc kubenswrapper[5033]: I0226 18:21:06.741861 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:21:06 crc kubenswrapper[5033]: I0226 18:21:06.850283 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-fxnwv"] Feb 26 18:21:08 crc kubenswrapper[5033]: I0226 18:21:08.706779 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fxnwv" podUID="eca58ccd-3cba-4917-bfe5-779918dee1bc" containerName="registry-server" containerID="cri-o://fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357" gracePeriod=2 Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.199116 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.303332 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-utilities\") pod \"eca58ccd-3cba-4917-bfe5-779918dee1bc\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.303582 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-catalog-content\") pod \"eca58ccd-3cba-4917-bfe5-779918dee1bc\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.303654 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gdqh\" (UniqueName: \"kubernetes.io/projected/eca58ccd-3cba-4917-bfe5-779918dee1bc-kube-api-access-5gdqh\") pod \"eca58ccd-3cba-4917-bfe5-779918dee1bc\" (UID: \"eca58ccd-3cba-4917-bfe5-779918dee1bc\") " Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.304050 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-utilities" (OuterVolumeSpecName: "utilities") pod "eca58ccd-3cba-4917-bfe5-779918dee1bc" (UID: 
"eca58ccd-3cba-4917-bfe5-779918dee1bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.304583 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.309507 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca58ccd-3cba-4917-bfe5-779918dee1bc-kube-api-access-5gdqh" (OuterVolumeSpecName: "kube-api-access-5gdqh") pod "eca58ccd-3cba-4917-bfe5-779918dee1bc" (UID: "eca58ccd-3cba-4917-bfe5-779918dee1bc"). InnerVolumeSpecName "kube-api-access-5gdqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.369345 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eca58ccd-3cba-4917-bfe5-779918dee1bc" (UID: "eca58ccd-3cba-4917-bfe5-779918dee1bc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.406965 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca58ccd-3cba-4917-bfe5-779918dee1bc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.407003 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gdqh\" (UniqueName: \"kubernetes.io/projected/eca58ccd-3cba-4917-bfe5-779918dee1bc-kube-api-access-5gdqh\") on node \"crc\" DevicePath \"\"" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.717174 5033 generic.go:334] "Generic (PLEG): container finished" podID="eca58ccd-3cba-4917-bfe5-779918dee1bc" containerID="fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357" exitCode=0 Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.717211 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxnwv" event={"ID":"eca58ccd-3cba-4917-bfe5-779918dee1bc","Type":"ContainerDied","Data":"fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357"} Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.717229 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fxnwv" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.717290 5033 scope.go:117] "RemoveContainer" containerID="fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.717276 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxnwv" event={"ID":"eca58ccd-3cba-4917-bfe5-779918dee1bc","Type":"ContainerDied","Data":"f56df22bfd40b4169c7670051c8c36b9275a5ea709245e408ea532ae8f38c2a9"} Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.741034 5033 scope.go:117] "RemoveContainer" containerID="fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.753338 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fxnwv"] Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.761299 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fxnwv"] Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.765392 5033 scope.go:117] "RemoveContainer" containerID="e11ca83a5509294e7d5940e2c9f4de44c691e84a446da3eaaf1214fa9eb38e00" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.828706 5033 scope.go:117] "RemoveContainer" containerID="fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357" Feb 26 18:21:09 crc kubenswrapper[5033]: E0226 18:21:09.831479 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357\": container with ID starting with fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357 not found: ID does not exist" containerID="fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.831530 5033 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357"} err="failed to get container status \"fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357\": rpc error: code = NotFound desc = could not find container \"fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357\": container with ID starting with fa84941eb389a9e7772c6b1d66c60a0c02982851b68acf2ad76646d352b27357 not found: ID does not exist" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.831558 5033 scope.go:117] "RemoveContainer" containerID="fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6" Feb 26 18:21:09 crc kubenswrapper[5033]: E0226 18:21:09.835401 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6\": container with ID starting with fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6 not found: ID does not exist" containerID="fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.835449 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6"} err="failed to get container status \"fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6\": rpc error: code = NotFound desc = could not find container \"fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6\": container with ID starting with fe73025067626c94540230a0a63737586863120f2b61b5e1146646fe54bca4b6 not found: ID does not exist" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.835479 5033 scope.go:117] "RemoveContainer" containerID="e11ca83a5509294e7d5940e2c9f4de44c691e84a446da3eaaf1214fa9eb38e00" Feb 26 18:21:09 crc kubenswrapper[5033]: E0226 
18:21:09.835918 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e11ca83a5509294e7d5940e2c9f4de44c691e84a446da3eaaf1214fa9eb38e00\": container with ID starting with e11ca83a5509294e7d5940e2c9f4de44c691e84a446da3eaaf1214fa9eb38e00 not found: ID does not exist" containerID="e11ca83a5509294e7d5940e2c9f4de44c691e84a446da3eaaf1214fa9eb38e00" Feb 26 18:21:09 crc kubenswrapper[5033]: I0226 18:21:09.835946 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e11ca83a5509294e7d5940e2c9f4de44c691e84a446da3eaaf1214fa9eb38e00"} err="failed to get container status \"e11ca83a5509294e7d5940e2c9f4de44c691e84a446da3eaaf1214fa9eb38e00\": rpc error: code = NotFound desc = could not find container \"e11ca83a5509294e7d5940e2c9f4de44c691e84a446da3eaaf1214fa9eb38e00\": container with ID starting with e11ca83a5509294e7d5940e2c9f4de44c691e84a446da3eaaf1214fa9eb38e00 not found: ID does not exist" Feb 26 18:21:11 crc kubenswrapper[5033]: I0226 18:21:11.644228 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca58ccd-3cba-4917-bfe5-779918dee1bc" path="/var/lib/kubelet/pods/eca58ccd-3cba-4917-bfe5-779918dee1bc/volumes" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.270928 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gjwq6"] Feb 26 18:21:28 crc kubenswrapper[5033]: E0226 18:21:28.273796 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca58ccd-3cba-4917-bfe5-779918dee1bc" containerName="extract-content" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.273831 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca58ccd-3cba-4917-bfe5-779918dee1bc" containerName="extract-content" Feb 26 18:21:28 crc kubenswrapper[5033]: E0226 18:21:28.273856 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca58ccd-3cba-4917-bfe5-779918dee1bc" 
containerName="registry-server" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.273865 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca58ccd-3cba-4917-bfe5-779918dee1bc" containerName="registry-server" Feb 26 18:21:28 crc kubenswrapper[5033]: E0226 18:21:28.273883 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca58ccd-3cba-4917-bfe5-779918dee1bc" containerName="extract-utilities" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.273892 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca58ccd-3cba-4917-bfe5-779918dee1bc" containerName="extract-utilities" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.274226 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca58ccd-3cba-4917-bfe5-779918dee1bc" containerName="registry-server" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.276413 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.286138 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjwq6"] Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.408042 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dn5t\" (UniqueName: \"kubernetes.io/projected/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-kube-api-access-8dn5t\") pod \"redhat-operators-gjwq6\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.408096 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-catalog-content\") pod \"redhat-operators-gjwq6\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " 
pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.408244 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-utilities\") pod \"redhat-operators-gjwq6\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.513611 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-utilities\") pod \"redhat-operators-gjwq6\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.513757 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dn5t\" (UniqueName: \"kubernetes.io/projected/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-kube-api-access-8dn5t\") pod \"redhat-operators-gjwq6\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.513811 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-catalog-content\") pod \"redhat-operators-gjwq6\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.514076 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-utilities\") pod \"redhat-operators-gjwq6\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 
18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.514222 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-catalog-content\") pod \"redhat-operators-gjwq6\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.533868 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dn5t\" (UniqueName: \"kubernetes.io/projected/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-kube-api-access-8dn5t\") pod \"redhat-operators-gjwq6\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:28 crc kubenswrapper[5033]: I0226 18:21:28.600606 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:29 crc kubenswrapper[5033]: I0226 18:21:29.125614 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjwq6"] Feb 26 18:21:29 crc kubenswrapper[5033]: I0226 18:21:29.919449 5033 generic.go:334] "Generic (PLEG): container finished" podID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerID="20dd1b5b678fa4df87ffb0339a2d60390ad94132d79468fa53f4d6c0a43df475" exitCode=0 Feb 26 18:21:29 crc kubenswrapper[5033]: I0226 18:21:29.919542 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjwq6" event={"ID":"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f","Type":"ContainerDied","Data":"20dd1b5b678fa4df87ffb0339a2d60390ad94132d79468fa53f4d6c0a43df475"} Feb 26 18:21:29 crc kubenswrapper[5033]: I0226 18:21:29.919747 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjwq6" 
event={"ID":"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f","Type":"ContainerStarted","Data":"93e597d32de7e65fc23df1e0ad6353d949fe1b11bd7c2b462ccaa9af06ca6c57"} Feb 26 18:21:30 crc kubenswrapper[5033]: I0226 18:21:30.931285 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjwq6" event={"ID":"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f","Type":"ContainerStarted","Data":"dc1bc87e90d77db6dddacce5974915b1323a704fde15638f09c515aabbabb82b"} Feb 26 18:21:34 crc kubenswrapper[5033]: I0226 18:21:34.975875 5033 generic.go:334] "Generic (PLEG): container finished" podID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerID="dc1bc87e90d77db6dddacce5974915b1323a704fde15638f09c515aabbabb82b" exitCode=0 Feb 26 18:21:34 crc kubenswrapper[5033]: I0226 18:21:34.975937 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjwq6" event={"ID":"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f","Type":"ContainerDied","Data":"dc1bc87e90d77db6dddacce5974915b1323a704fde15638f09c515aabbabb82b"} Feb 26 18:21:35 crc kubenswrapper[5033]: I0226 18:21:35.990669 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjwq6" event={"ID":"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f","Type":"ContainerStarted","Data":"3cde013521eacfddb946945087bbe193238d966a35bc1d672b57951b033a2a16"} Feb 26 18:21:36 crc kubenswrapper[5033]: I0226 18:21:36.015234 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gjwq6" podStartSLOduration=2.528198211 podStartE2EDuration="8.015209556s" podCreationTimestamp="2026-02-26 18:21:28 +0000 UTC" firstStartedPulling="2026-02-26 18:21:29.924289241 +0000 UTC m=+9617.136887891" lastFinishedPulling="2026-02-26 18:21:35.411300586 +0000 UTC m=+9622.623899236" observedRunningTime="2026-02-26 18:21:36.013109375 +0000 UTC m=+9623.225708025" watchObservedRunningTime="2026-02-26 18:21:36.015209556 +0000 UTC m=+9623.227808206" 
Feb 26 18:21:38 crc kubenswrapper[5033]: I0226 18:21:38.601284 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:38 crc kubenswrapper[5033]: I0226 18:21:38.601764 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:39 crc kubenswrapper[5033]: I0226 18:21:39.828175 5033 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjwq6" podUID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerName="registry-server" probeResult="failure" output=< Feb 26 18:21:39 crc kubenswrapper[5033]: timeout: failed to connect service ":50051" within 1s Feb 26 18:21:39 crc kubenswrapper[5033]: > Feb 26 18:21:48 crc kubenswrapper[5033]: I0226 18:21:48.656221 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:48 crc kubenswrapper[5033]: I0226 18:21:48.718972 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:48 crc kubenswrapper[5033]: I0226 18:21:48.900884 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjwq6"] Feb 26 18:21:50 crc kubenswrapper[5033]: I0226 18:21:50.176246 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gjwq6" podUID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerName="registry-server" containerID="cri-o://3cde013521eacfddb946945087bbe193238d966a35bc1d672b57951b033a2a16" gracePeriod=2 Feb 26 18:21:50 crc kubenswrapper[5033]: I0226 18:21:50.189848 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:21:50 crc kubenswrapper[5033]: I0226 18:21:50.189961 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.191073 5033 generic.go:334] "Generic (PLEG): container finished" podID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerID="3cde013521eacfddb946945087bbe193238d966a35bc1d672b57951b033a2a16" exitCode=0 Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.191189 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjwq6" event={"ID":"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f","Type":"ContainerDied","Data":"3cde013521eacfddb946945087bbe193238d966a35bc1d672b57951b033a2a16"} Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.309791 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.455872 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-catalog-content\") pod \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.456008 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-utilities\") pod \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.456042 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dn5t\" (UniqueName: \"kubernetes.io/projected/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-kube-api-access-8dn5t\") pod \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\" (UID: \"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f\") " Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.457035 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-utilities" (OuterVolumeSpecName: "utilities") pod "cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" (UID: "cbeed1b6-9bf3-4b48-aa90-145ca4588d9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.461109 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-kube-api-access-8dn5t" (OuterVolumeSpecName: "kube-api-access-8dn5t") pod "cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" (UID: "cbeed1b6-9bf3-4b48-aa90-145ca4588d9f"). InnerVolumeSpecName "kube-api-access-8dn5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.559740 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.559778 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dn5t\" (UniqueName: \"kubernetes.io/projected/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-kube-api-access-8dn5t\") on node \"crc\" DevicePath \"\"" Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.607682 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" (UID: "cbeed1b6-9bf3-4b48-aa90-145ca4588d9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:21:51 crc kubenswrapper[5033]: I0226 18:21:51.662158 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 18:21:52 crc kubenswrapper[5033]: I0226 18:21:52.205973 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjwq6" event={"ID":"cbeed1b6-9bf3-4b48-aa90-145ca4588d9f","Type":"ContainerDied","Data":"93e597d32de7e65fc23df1e0ad6353d949fe1b11bd7c2b462ccaa9af06ca6c57"} Feb 26 18:21:52 crc kubenswrapper[5033]: I0226 18:21:52.206379 5033 scope.go:117] "RemoveContainer" containerID="3cde013521eacfddb946945087bbe193238d966a35bc1d672b57951b033a2a16" Feb 26 18:21:52 crc kubenswrapper[5033]: I0226 18:21:52.206422 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjwq6" Feb 26 18:21:52 crc kubenswrapper[5033]: I0226 18:21:52.243674 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjwq6"] Feb 26 18:21:52 crc kubenswrapper[5033]: I0226 18:21:52.258821 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gjwq6"] Feb 26 18:21:52 crc kubenswrapper[5033]: I0226 18:21:52.263695 5033 scope.go:117] "RemoveContainer" containerID="dc1bc87e90d77db6dddacce5974915b1323a704fde15638f09c515aabbabb82b" Feb 26 18:21:52 crc kubenswrapper[5033]: I0226 18:21:52.298528 5033 scope.go:117] "RemoveContainer" containerID="20dd1b5b678fa4df87ffb0339a2d60390ad94132d79468fa53f4d6c0a43df475" Feb 26 18:21:53 crc kubenswrapper[5033]: I0226 18:21:53.645848 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" path="/var/lib/kubelet/pods/cbeed1b6-9bf3-4b48-aa90-145ca4588d9f/volumes" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.156955 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535502-btmvd"] Feb 26 18:22:00 crc kubenswrapper[5033]: E0226 18:22:00.158017 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerName="registry-server" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.158037 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerName="registry-server" Feb 26 18:22:00 crc kubenswrapper[5033]: E0226 18:22:00.158065 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerName="extract-utilities" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.158075 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerName="extract-utilities" Feb 26 
18:22:00 crc kubenswrapper[5033]: E0226 18:22:00.158113 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerName="extract-content" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.158122 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerName="extract-content" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.158376 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbeed1b6-9bf3-4b48-aa90-145ca4588d9f" containerName="registry-server" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.159380 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535502-btmvd" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.166376 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.166396 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.166944 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.171330 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535502-btmvd"] Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.270757 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qncwb\" (UniqueName: \"kubernetes.io/projected/d3e2c15a-87d3-4dbe-b758-e1dbe86f656e-kube-api-access-qncwb\") pod \"auto-csr-approver-29535502-btmvd\" (UID: \"d3e2c15a-87d3-4dbe-b758-e1dbe86f656e\") " pod="openshift-infra/auto-csr-approver-29535502-btmvd" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.372588 5033 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qncwb\" (UniqueName: \"kubernetes.io/projected/d3e2c15a-87d3-4dbe-b758-e1dbe86f656e-kube-api-access-qncwb\") pod \"auto-csr-approver-29535502-btmvd\" (UID: \"d3e2c15a-87d3-4dbe-b758-e1dbe86f656e\") " pod="openshift-infra/auto-csr-approver-29535502-btmvd" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.390015 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qncwb\" (UniqueName: \"kubernetes.io/projected/d3e2c15a-87d3-4dbe-b758-e1dbe86f656e-kube-api-access-qncwb\") pod \"auto-csr-approver-29535502-btmvd\" (UID: \"d3e2c15a-87d3-4dbe-b758-e1dbe86f656e\") " pod="openshift-infra/auto-csr-approver-29535502-btmvd" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.492003 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535502-btmvd" Feb 26 18:22:00 crc kubenswrapper[5033]: I0226 18:22:00.996925 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535502-btmvd"] Feb 26 18:22:01 crc kubenswrapper[5033]: I0226 18:22:01.305549 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535502-btmvd" event={"ID":"d3e2c15a-87d3-4dbe-b758-e1dbe86f656e","Type":"ContainerStarted","Data":"ae759fb58b987f2038e5f276e40b189e010f455d33b30dd62647f6263dc610b3"} Feb 26 18:22:02 crc kubenswrapper[5033]: I0226 18:22:02.321809 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535502-btmvd" event={"ID":"d3e2c15a-87d3-4dbe-b758-e1dbe86f656e","Type":"ContainerStarted","Data":"adfdbc1554750f10ee824f085ea21c06819ea2fc88f9b05d896bacaeefc16130"} Feb 26 18:22:02 crc kubenswrapper[5033]: I0226 18:22:02.337344 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535502-btmvd" podStartSLOduration=1.395717902 
podStartE2EDuration="2.337328682s" podCreationTimestamp="2026-02-26 18:22:00 +0000 UTC" firstStartedPulling="2026-02-26 18:22:01.003138291 +0000 UTC m=+9648.215736941" lastFinishedPulling="2026-02-26 18:22:01.944749071 +0000 UTC m=+9649.157347721" observedRunningTime="2026-02-26 18:22:02.334823761 +0000 UTC m=+9649.547422451" watchObservedRunningTime="2026-02-26 18:22:02.337328682 +0000 UTC m=+9649.549927332" Feb 26 18:22:03 crc kubenswrapper[5033]: I0226 18:22:03.341926 5033 generic.go:334] "Generic (PLEG): container finished" podID="d3e2c15a-87d3-4dbe-b758-e1dbe86f656e" containerID="adfdbc1554750f10ee824f085ea21c06819ea2fc88f9b05d896bacaeefc16130" exitCode=0 Feb 26 18:22:03 crc kubenswrapper[5033]: I0226 18:22:03.341990 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535502-btmvd" event={"ID":"d3e2c15a-87d3-4dbe-b758-e1dbe86f656e","Type":"ContainerDied","Data":"adfdbc1554750f10ee824f085ea21c06819ea2fc88f9b05d896bacaeefc16130"} Feb 26 18:22:04 crc kubenswrapper[5033]: I0226 18:22:04.770436 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535502-btmvd" Feb 26 18:22:04 crc kubenswrapper[5033]: I0226 18:22:04.897566 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qncwb\" (UniqueName: \"kubernetes.io/projected/d3e2c15a-87d3-4dbe-b758-e1dbe86f656e-kube-api-access-qncwb\") pod \"d3e2c15a-87d3-4dbe-b758-e1dbe86f656e\" (UID: \"d3e2c15a-87d3-4dbe-b758-e1dbe86f656e\") " Feb 26 18:22:04 crc kubenswrapper[5033]: I0226 18:22:04.911041 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e2c15a-87d3-4dbe-b758-e1dbe86f656e-kube-api-access-qncwb" (OuterVolumeSpecName: "kube-api-access-qncwb") pod "d3e2c15a-87d3-4dbe-b758-e1dbe86f656e" (UID: "d3e2c15a-87d3-4dbe-b758-e1dbe86f656e"). InnerVolumeSpecName "kube-api-access-qncwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:22:05 crc kubenswrapper[5033]: I0226 18:22:04.999908 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qncwb\" (UniqueName: \"kubernetes.io/projected/d3e2c15a-87d3-4dbe-b758-e1dbe86f656e-kube-api-access-qncwb\") on node \"crc\" DevicePath \"\"" Feb 26 18:22:05 crc kubenswrapper[5033]: I0226 18:22:05.371793 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535502-btmvd" event={"ID":"d3e2c15a-87d3-4dbe-b758-e1dbe86f656e","Type":"ContainerDied","Data":"ae759fb58b987f2038e5f276e40b189e010f455d33b30dd62647f6263dc610b3"} Feb 26 18:22:05 crc kubenswrapper[5033]: I0226 18:22:05.371840 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae759fb58b987f2038e5f276e40b189e010f455d33b30dd62647f6263dc610b3" Feb 26 18:22:05 crc kubenswrapper[5033]: I0226 18:22:05.372143 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535502-btmvd" Feb 26 18:22:05 crc kubenswrapper[5033]: I0226 18:22:05.416758 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535496-5h9jl"] Feb 26 18:22:05 crc kubenswrapper[5033]: I0226 18:22:05.427052 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535496-5h9jl"] Feb 26 18:22:05 crc kubenswrapper[5033]: I0226 18:22:05.639964 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68494582-a7e8-4967-8e7b-1c03dd5627e2" path="/var/lib/kubelet/pods/68494582-a7e8-4967-8e7b-1c03dd5627e2/volumes" Feb 26 18:22:20 crc kubenswrapper[5033]: I0226 18:22:20.189611 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 26 18:22:20 crc kubenswrapper[5033]: I0226 18:22:20.190040 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:22:47 crc kubenswrapper[5033]: I0226 18:22:47.029629 5033 scope.go:117] "RemoveContainer" containerID="6bbc5e6602b3e2056537d0a9e57f16af4b1ca4f0debd8012ff615ff587083b73" Feb 26 18:22:50 crc kubenswrapper[5033]: I0226 18:22:50.189179 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:22:50 crc kubenswrapper[5033]: I0226 18:22:50.189845 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:22:50 crc kubenswrapper[5033]: I0226 18:22:50.189906 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 18:22:50 crc kubenswrapper[5033]: I0226 18:22:50.190823 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10fa683236b128009f2a223b9e7597e835171fa14a18ab9a78920652b702878a"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 
26 18:22:50 crc kubenswrapper[5033]: I0226 18:22:50.190886 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://10fa683236b128009f2a223b9e7597e835171fa14a18ab9a78920652b702878a" gracePeriod=600 Feb 26 18:22:50 crc kubenswrapper[5033]: I0226 18:22:50.825561 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="10fa683236b128009f2a223b9e7597e835171fa14a18ab9a78920652b702878a" exitCode=0 Feb 26 18:22:50 crc kubenswrapper[5033]: I0226 18:22:50.825793 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"10fa683236b128009f2a223b9e7597e835171fa14a18ab9a78920652b702878a"} Feb 26 18:22:50 crc kubenswrapper[5033]: I0226 18:22:50.825949 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f"} Feb 26 18:22:50 crc kubenswrapper[5033]: I0226 18:22:50.825973 5033 scope.go:117] "RemoveContainer" containerID="51002367d9aab91fe117f3ce7f2d6ff8e0474477c80467b248a931e52526ad23" Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.151754 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535504-9t6gr"] Feb 26 18:24:00 crc kubenswrapper[5033]: E0226 18:24:00.152977 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e2c15a-87d3-4dbe-b758-e1dbe86f656e" containerName="oc" Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.152996 5033 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d3e2c15a-87d3-4dbe-b758-e1dbe86f656e" containerName="oc" Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.153213 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e2c15a-87d3-4dbe-b758-e1dbe86f656e" containerName="oc" Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.154125 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535504-9t6gr" Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.157022 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.157224 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.161521 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.164714 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535504-9t6gr"] Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.225165 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bqrp\" (UniqueName: \"kubernetes.io/projected/0ec0ceed-1997-4fe5-966a-d42618095b51-kube-api-access-2bqrp\") pod \"auto-csr-approver-29535504-9t6gr\" (UID: \"0ec0ceed-1997-4fe5-966a-d42618095b51\") " pod="openshift-infra/auto-csr-approver-29535504-9t6gr" Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.326914 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bqrp\" (UniqueName: \"kubernetes.io/projected/0ec0ceed-1997-4fe5-966a-d42618095b51-kube-api-access-2bqrp\") pod \"auto-csr-approver-29535504-9t6gr\" (UID: \"0ec0ceed-1997-4fe5-966a-d42618095b51\") " 
pod="openshift-infra/auto-csr-approver-29535504-9t6gr" Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.346362 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bqrp\" (UniqueName: \"kubernetes.io/projected/0ec0ceed-1997-4fe5-966a-d42618095b51-kube-api-access-2bqrp\") pod \"auto-csr-approver-29535504-9t6gr\" (UID: \"0ec0ceed-1997-4fe5-966a-d42618095b51\") " pod="openshift-infra/auto-csr-approver-29535504-9t6gr" Feb 26 18:24:00 crc kubenswrapper[5033]: I0226 18:24:00.475131 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535504-9t6gr" Feb 26 18:24:01 crc kubenswrapper[5033]: I0226 18:24:01.010638 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535504-9t6gr"] Feb 26 18:24:01 crc kubenswrapper[5033]: I0226 18:24:01.017160 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 18:24:01 crc kubenswrapper[5033]: I0226 18:24:01.592512 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535504-9t6gr" event={"ID":"0ec0ceed-1997-4fe5-966a-d42618095b51","Type":"ContainerStarted","Data":"f6372e54551e9d42d3aa25cccaee51839d3b4aaad1304b76b313273940936439"} Feb 26 18:24:02 crc kubenswrapper[5033]: I0226 18:24:02.608065 5033 generic.go:334] "Generic (PLEG): container finished" podID="0ec0ceed-1997-4fe5-966a-d42618095b51" containerID="d2b35cba59bd5b843b78f28109a7311b9c11896921b945149169576b74de946c" exitCode=0 Feb 26 18:24:02 crc kubenswrapper[5033]: I0226 18:24:02.608111 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535504-9t6gr" event={"ID":"0ec0ceed-1997-4fe5-966a-d42618095b51","Type":"ContainerDied","Data":"d2b35cba59bd5b843b78f28109a7311b9c11896921b945149169576b74de946c"} Feb 26 18:24:03 crc kubenswrapper[5033]: I0226 18:24:03.978446 5033 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535504-9t6gr" Feb 26 18:24:04 crc kubenswrapper[5033]: I0226 18:24:04.131511 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bqrp\" (UniqueName: \"kubernetes.io/projected/0ec0ceed-1997-4fe5-966a-d42618095b51-kube-api-access-2bqrp\") pod \"0ec0ceed-1997-4fe5-966a-d42618095b51\" (UID: \"0ec0ceed-1997-4fe5-966a-d42618095b51\") " Feb 26 18:24:04 crc kubenswrapper[5033]: I0226 18:24:04.144621 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ec0ceed-1997-4fe5-966a-d42618095b51-kube-api-access-2bqrp" (OuterVolumeSpecName: "kube-api-access-2bqrp") pod "0ec0ceed-1997-4fe5-966a-d42618095b51" (UID: "0ec0ceed-1997-4fe5-966a-d42618095b51"). InnerVolumeSpecName "kube-api-access-2bqrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:24:04 crc kubenswrapper[5033]: I0226 18:24:04.234411 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bqrp\" (UniqueName: \"kubernetes.io/projected/0ec0ceed-1997-4fe5-966a-d42618095b51-kube-api-access-2bqrp\") on node \"crc\" DevicePath \"\"" Feb 26 18:24:04 crc kubenswrapper[5033]: I0226 18:24:04.631564 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535504-9t6gr" event={"ID":"0ec0ceed-1997-4fe5-966a-d42618095b51","Type":"ContainerDied","Data":"f6372e54551e9d42d3aa25cccaee51839d3b4aaad1304b76b313273940936439"} Feb 26 18:24:04 crc kubenswrapper[5033]: I0226 18:24:04.631621 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6372e54551e9d42d3aa25cccaee51839d3b4aaad1304b76b313273940936439" Feb 26 18:24:04 crc kubenswrapper[5033]: I0226 18:24:04.631585 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535504-9t6gr" Feb 26 18:24:05 crc kubenswrapper[5033]: I0226 18:24:05.058341 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535498-sb2nd"] Feb 26 18:24:05 crc kubenswrapper[5033]: I0226 18:24:05.070605 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535498-sb2nd"] Feb 26 18:24:05 crc kubenswrapper[5033]: I0226 18:24:05.641336 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73909f91-3ba0-44a3-90f3-1ab4e22b9ae0" path="/var/lib/kubelet/pods/73909f91-3ba0-44a3-90f3-1ab4e22b9ae0/volumes" Feb 26 18:24:47 crc kubenswrapper[5033]: I0226 18:24:47.166393 5033 scope.go:117] "RemoveContainer" containerID="04dd59876f32c0b896011cef3be2bb901643dddf4197765b3e63985cb839c401" Feb 26 18:24:50 crc kubenswrapper[5033]: I0226 18:24:50.188996 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:24:50 crc kubenswrapper[5033]: I0226 18:24:50.189544 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:25:20 crc kubenswrapper[5033]: I0226 18:25:20.189198 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:25:20 crc kubenswrapper[5033]: 
I0226 18:25:20.191333 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:25:50 crc kubenswrapper[5033]: I0226 18:25:50.189187 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:25:50 crc kubenswrapper[5033]: I0226 18:25:50.190053 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:25:50 crc kubenswrapper[5033]: I0226 18:25:50.190131 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" Feb 26 18:25:50 crc kubenswrapper[5033]: I0226 18:25:50.191697 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 18:25:50 crc kubenswrapper[5033]: I0226 18:25:50.191837 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" 
containerName="machine-config-daemon" containerID="cri-o://c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" gracePeriod=600 Feb 26 18:25:50 crc kubenswrapper[5033]: E0226 18:25:50.318521 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:25:50 crc kubenswrapper[5033]: I0226 18:25:50.949611 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" exitCode=0 Feb 26 18:25:50 crc kubenswrapper[5033]: I0226 18:25:50.949663 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f"} Feb 26 18:25:50 crc kubenswrapper[5033]: I0226 18:25:50.949696 5033 scope.go:117] "RemoveContainer" containerID="10fa683236b128009f2a223b9e7597e835171fa14a18ab9a78920652b702878a" Feb 26 18:25:50 crc kubenswrapper[5033]: I0226 18:25:50.950450 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:25:50 crc kubenswrapper[5033]: E0226 18:25:50.950753 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.150752 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535506-xtsfl"] Feb 26 18:26:00 crc kubenswrapper[5033]: E0226 18:26:00.151939 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec0ceed-1997-4fe5-966a-d42618095b51" containerName="oc" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.151956 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec0ceed-1997-4fe5-966a-d42618095b51" containerName="oc" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.152309 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec0ceed-1997-4fe5-966a-d42618095b51" containerName="oc" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.153316 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535506-xtsfl" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.155763 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.155831 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.157336 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.172578 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535506-xtsfl"] Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.241982 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmfvc\" (UniqueName: 
\"kubernetes.io/projected/5fc8a4fa-1728-4825-87e5-424c0fa32e51-kube-api-access-pmfvc\") pod \"auto-csr-approver-29535506-xtsfl\" (UID: \"5fc8a4fa-1728-4825-87e5-424c0fa32e51\") " pod="openshift-infra/auto-csr-approver-29535506-xtsfl" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.344557 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmfvc\" (UniqueName: \"kubernetes.io/projected/5fc8a4fa-1728-4825-87e5-424c0fa32e51-kube-api-access-pmfvc\") pod \"auto-csr-approver-29535506-xtsfl\" (UID: \"5fc8a4fa-1728-4825-87e5-424c0fa32e51\") " pod="openshift-infra/auto-csr-approver-29535506-xtsfl" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.369296 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmfvc\" (UniqueName: \"kubernetes.io/projected/5fc8a4fa-1728-4825-87e5-424c0fa32e51-kube-api-access-pmfvc\") pod \"auto-csr-approver-29535506-xtsfl\" (UID: \"5fc8a4fa-1728-4825-87e5-424c0fa32e51\") " pod="openshift-infra/auto-csr-approver-29535506-xtsfl" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.490764 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535506-xtsfl" Feb 26 18:26:00 crc kubenswrapper[5033]: I0226 18:26:00.950877 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535506-xtsfl"] Feb 26 18:26:01 crc kubenswrapper[5033]: I0226 18:26:01.056333 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535506-xtsfl" event={"ID":"5fc8a4fa-1728-4825-87e5-424c0fa32e51","Type":"ContainerStarted","Data":"26726531cd2a550aaf402baf613be4bdc2e3af1d03541de0fcd48302ea46dd89"} Feb 26 18:26:03 crc kubenswrapper[5033]: I0226 18:26:03.076746 5033 generic.go:334] "Generic (PLEG): container finished" podID="5fc8a4fa-1728-4825-87e5-424c0fa32e51" containerID="d777bf7a28fa0ad80b57eb1fc7d76e9c65f39872fd707258ad545e8aa9d330d2" exitCode=0 Feb 26 18:26:03 crc kubenswrapper[5033]: I0226 18:26:03.076818 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535506-xtsfl" event={"ID":"5fc8a4fa-1728-4825-87e5-424c0fa32e51","Type":"ContainerDied","Data":"d777bf7a28fa0ad80b57eb1fc7d76e9c65f39872fd707258ad545e8aa9d330d2"} Feb 26 18:26:03 crc kubenswrapper[5033]: I0226 18:26:03.643472 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:26:03 crc kubenswrapper[5033]: E0226 18:26:03.643778 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:26:04 crc kubenswrapper[5033]: I0226 18:26:04.471696 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535506-xtsfl" Feb 26 18:26:04 crc kubenswrapper[5033]: I0226 18:26:04.556426 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmfvc\" (UniqueName: \"kubernetes.io/projected/5fc8a4fa-1728-4825-87e5-424c0fa32e51-kube-api-access-pmfvc\") pod \"5fc8a4fa-1728-4825-87e5-424c0fa32e51\" (UID: \"5fc8a4fa-1728-4825-87e5-424c0fa32e51\") " Feb 26 18:26:04 crc kubenswrapper[5033]: I0226 18:26:04.561508 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc8a4fa-1728-4825-87e5-424c0fa32e51-kube-api-access-pmfvc" (OuterVolumeSpecName: "kube-api-access-pmfvc") pod "5fc8a4fa-1728-4825-87e5-424c0fa32e51" (UID: "5fc8a4fa-1728-4825-87e5-424c0fa32e51"). InnerVolumeSpecName "kube-api-access-pmfvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:26:04 crc kubenswrapper[5033]: I0226 18:26:04.658546 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmfvc\" (UniqueName: \"kubernetes.io/projected/5fc8a4fa-1728-4825-87e5-424c0fa32e51-kube-api-access-pmfvc\") on node \"crc\" DevicePath \"\"" Feb 26 18:26:05 crc kubenswrapper[5033]: I0226 18:26:05.098250 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535506-xtsfl" event={"ID":"5fc8a4fa-1728-4825-87e5-424c0fa32e51","Type":"ContainerDied","Data":"26726531cd2a550aaf402baf613be4bdc2e3af1d03541de0fcd48302ea46dd89"} Feb 26 18:26:05 crc kubenswrapper[5033]: I0226 18:26:05.098614 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26726531cd2a550aaf402baf613be4bdc2e3af1d03541de0fcd48302ea46dd89" Feb 26 18:26:05 crc kubenswrapper[5033]: I0226 18:26:05.098393 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535506-xtsfl" Feb 26 18:26:05 crc kubenswrapper[5033]: I0226 18:26:05.545409 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535500-55fxh"] Feb 26 18:26:05 crc kubenswrapper[5033]: I0226 18:26:05.555185 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535500-55fxh"] Feb 26 18:26:05 crc kubenswrapper[5033]: I0226 18:26:05.652938 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91042adc-dcc8-411c-9e7f-4679e27c6621" path="/var/lib/kubelet/pods/91042adc-dcc8-411c-9e7f-4679e27c6621/volumes" Feb 26 18:26:15 crc kubenswrapper[5033]: I0226 18:26:15.629041 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:26:15 crc kubenswrapper[5033]: E0226 18:26:15.629876 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:26:28 crc kubenswrapper[5033]: I0226 18:26:28.628358 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:26:28 crc kubenswrapper[5033]: E0226 18:26:28.629084 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:26:37 crc kubenswrapper[5033]: I0226 18:26:37.906804 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_f9343e5c-17b8-4f9f-bd66-071d27c0b43a/init-config-reloader/0.log" Feb 26 18:26:38 crc kubenswrapper[5033]: I0226 18:26:38.280737 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_f9343e5c-17b8-4f9f-bd66-071d27c0b43a/init-config-reloader/0.log" Feb 26 18:26:38 crc kubenswrapper[5033]: I0226 18:26:38.312223 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_f9343e5c-17b8-4f9f-bd66-071d27c0b43a/config-reloader/0.log" Feb 26 18:26:38 crc kubenswrapper[5033]: I0226 18:26:38.326499 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_f9343e5c-17b8-4f9f-bd66-071d27c0b43a/alertmanager/0.log" Feb 26 18:26:38 crc kubenswrapper[5033]: I0226 18:26:38.473600 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b3e6c3b2-d7d4-4297-99c7-761ec33de582/aodh-api/0.log" Feb 26 18:26:38 crc kubenswrapper[5033]: I0226 18:26:38.549175 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b3e6c3b2-d7d4-4297-99c7-761ec33de582/aodh-evaluator/0.log" Feb 26 18:26:38 crc kubenswrapper[5033]: I0226 18:26:38.651304 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b3e6c3b2-d7d4-4297-99c7-761ec33de582/aodh-listener/0.log" Feb 26 18:26:38 crc kubenswrapper[5033]: I0226 18:26:38.688562 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b3e6c3b2-d7d4-4297-99c7-761ec33de582/aodh-notifier/0.log" Feb 26 18:26:38 crc kubenswrapper[5033]: I0226 18:26:38.740090 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84b69f6954-ldnp5_34475404-fb32-4053-978e-02a89a738365/barbican-api/0.log" Feb 26 
18:26:38 crc kubenswrapper[5033]: I0226 18:26:38.833535 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84b69f6954-ldnp5_34475404-fb32-4053-978e-02a89a738365/barbican-api-log/0.log" Feb 26 18:26:38 crc kubenswrapper[5033]: I0226 18:26:38.942121 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6798749c56-qfcf4_da8187b8-d72f-4471-9ebc-6846bf6a03f8/barbican-keystone-listener/0.log" Feb 26 18:26:39 crc kubenswrapper[5033]: I0226 18:26:39.000007 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6798749c56-qfcf4_da8187b8-d72f-4471-9ebc-6846bf6a03f8/barbican-keystone-listener-log/0.log" Feb 26 18:26:39 crc kubenswrapper[5033]: I0226 18:26:39.224831 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-84bd445555-qg8pl_8a4dbc71-6e95-4ec4-90f1-7867b5492ef3/barbican-worker/0.log" Feb 26 18:26:39 crc kubenswrapper[5033]: I0226 18:26:39.291698 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-84bd445555-qg8pl_8a4dbc71-6e95-4ec4-90f1-7867b5492ef3/barbican-worker-log/0.log" Feb 26 18:26:39 crc kubenswrapper[5033]: I0226 18:26:39.463201 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-hqjgv_cdd70c36-4e9c-4608-816c-7fd46b747577/bootstrap-openstack-openstack-cell1/0.log" Feb 26 18:26:39 crc kubenswrapper[5033]: I0226 18:26:39.534309 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b63df35f-0e95-461c-9549-82bbe8f7e7b6/ceilometer-central-agent/0.log" Feb 26 18:26:39 crc kubenswrapper[5033]: I0226 18:26:39.597361 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b63df35f-0e95-461c-9549-82bbe8f7e7b6/ceilometer-notification-agent/0.log" Feb 26 18:26:39 crc kubenswrapper[5033]: I0226 18:26:39.634153 5033 scope.go:117] "RemoveContainer" 
containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:26:39 crc kubenswrapper[5033]: E0226 18:26:39.634866 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:26:39 crc kubenswrapper[5033]: I0226 18:26:39.661473 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b63df35f-0e95-461c-9549-82bbe8f7e7b6/proxy-httpd/0.log" Feb 26 18:26:39 crc kubenswrapper[5033]: I0226 18:26:39.727804 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b63df35f-0e95-461c-9549-82bbe8f7e7b6/sg-core/0.log" Feb 26 18:26:39 crc kubenswrapper[5033]: I0226 18:26:39.803093 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-fclgb_73f6a569-5a65-4fbf-a77f-2a6ce838ac41/ceph-client-openstack-openstack-cell1/0.log" Feb 26 18:26:40 crc kubenswrapper[5033]: I0226 18:26:40.014849 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ed9da2b7-a018-43d7-a0a1-c6166adc3c3f/cinder-api/0.log" Feb 26 18:26:40 crc kubenswrapper[5033]: I0226 18:26:40.089356 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ed9da2b7-a018-43d7-a0a1-c6166adc3c3f/cinder-api-log/0.log" Feb 26 18:26:40 crc kubenswrapper[5033]: I0226 18:26:40.272412 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_df14a279-eaa8-4702-976b-ab92229cd218/probe/0.log" Feb 26 18:26:40 crc kubenswrapper[5033]: I0226 18:26:40.296787 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_df14a279-eaa8-4702-976b-ab92229cd218/cinder-backup/0.log" Feb 26 18:26:40 crc kubenswrapper[5033]: I0226 18:26:40.330199 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dd7f43ba-d95d-48ef-9788-b25bb345f972/cinder-scheduler/0.log" Feb 26 18:26:40 crc kubenswrapper[5033]: I0226 18:26:40.557819 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dd7f43ba-d95d-48ef-9788-b25bb345f972/probe/0.log" Feb 26 18:26:40 crc kubenswrapper[5033]: I0226 18:26:40.585145 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_b9c460f2-2f7e-496f-b148-de398d33f18e/cinder-volume/0.log" Feb 26 18:26:40 crc kubenswrapper[5033]: I0226 18:26:40.638775 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_b9c460f2-2f7e-496f-b148-de398d33f18e/probe/0.log" Feb 26 18:26:40 crc kubenswrapper[5033]: I0226 18:26:40.823145 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-xc82d_6d711842-6c9a-468e-842b-b44c019d0e13/configure-network-openstack-openstack-cell1/0.log" Feb 26 18:26:40 crc kubenswrapper[5033]: I0226 18:26:40.957236 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-rzn9b_2c298a16-67d9-45a7-bcef-322883b4030d/configure-os-openstack-openstack-cell1/0.log" Feb 26 18:26:41 crc kubenswrapper[5033]: I0226 18:26:41.434942 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8586b96f6c-5jbm2_4f725c7f-3187-4bf8-a8cd-ffc406ea45ac/init/0.log" Feb 26 18:26:41 crc kubenswrapper[5033]: I0226 18:26:41.679124 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8586b96f6c-5jbm2_4f725c7f-3187-4bf8-a8cd-ffc406ea45ac/init/0.log" Feb 26 18:26:41 crc kubenswrapper[5033]: I0226 18:26:41.721154 5033 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8586b96f6c-5jbm2_4f725c7f-3187-4bf8-a8cd-ffc406ea45ac/dnsmasq-dns/0.log" Feb 26 18:26:41 crc kubenswrapper[5033]: I0226 18:26:41.756483 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-xndgp_64d8cd2d-5dbe-45e7-b844-339aa50577b6/download-cache-openstack-openstack-cell1/0.log" Feb 26 18:26:41 crc kubenswrapper[5033]: I0226 18:26:41.931748 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cdaea64e-c02c-4fdc-9b53-08cccb2085a9/glance-httpd/0.log" Feb 26 18:26:41 crc kubenswrapper[5033]: I0226 18:26:41.990804 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cdaea64e-c02c-4fdc-9b53-08cccb2085a9/glance-log/0.log" Feb 26 18:26:42 crc kubenswrapper[5033]: I0226 18:26:42.134306 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_48fd6e88-5852-4cf0-b4f7-41a8bc0d214b/glance-httpd/0.log" Feb 26 18:26:42 crc kubenswrapper[5033]: I0226 18:26:42.190317 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_48fd6e88-5852-4cf0-b4f7-41a8bc0d214b/glance-log/0.log" Feb 26 18:26:42 crc kubenswrapper[5033]: I0226 18:26:42.329670 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-86f57c6d85-7fbtj_e3f0b850-f4c8-4fec-a439-dca70fa0c18e/heat-api/0.log" Feb 26 18:26:42 crc kubenswrapper[5033]: I0226 18:26:42.444876 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-66c4795cc9-fn99t_a07c60cb-13ba-4570-a36b-ffa95c55e0f0/heat-cfnapi/0.log" Feb 26 18:26:42 crc kubenswrapper[5033]: I0226 18:26:42.566394 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-586dbbcf89-xjwbx_b36ee056-5db4-4b73-91e8-1bbf4b1f36cd/heat-engine/0.log" Feb 26 18:26:42 crc 
kubenswrapper[5033]: I0226 18:26:42.677559 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7879589cbf-mbc9r_05be8330-3b07-4f80-978e-11db638c80ac/horizon/0.log" Feb 26 18:26:42 crc kubenswrapper[5033]: I0226 18:26:42.762975 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7879589cbf-mbc9r_05be8330-3b07-4f80-978e-11db638c80ac/horizon-log/0.log" Feb 26 18:26:43 crc kubenswrapper[5033]: I0226 18:26:43.484841 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-t9vd8_9e984f92-4207-4d54-9e99-8d18b18db198/install-certs-openstack-openstack-cell1/0.log" Feb 26 18:26:43 crc kubenswrapper[5033]: I0226 18:26:43.517481 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-bsrfn_fa523cf5-af6f-4c44-8d8e-0016d52e3db7/install-os-openstack-openstack-cell1/0.log" Feb 26 18:26:43 crc kubenswrapper[5033]: I0226 18:26:43.711242 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7b6d888c87-76db4_1398b95c-4410-4eb1-a84c-d9df58c4aad3/keystone-api/0.log" Feb 26 18:26:43 crc kubenswrapper[5033]: I0226 18:26:43.733449 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29535481-l2mhq_08deaf39-80a4-408d-a07e-63c2428e7b2c/keystone-cron/0.log" Feb 26 18:26:43 crc kubenswrapper[5033]: I0226 18:26:43.886882 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_9a49e383-e3d9-4a97-8f52-df90e3f4bbc9/kube-state-metrics/0.log" Feb 26 18:26:44 crc kubenswrapper[5033]: I0226 18:26:44.025468 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-7ncz9_8aa037b5-14a7-41e1-81cf-ffb06e380167/libvirt-openstack-openstack-cell1/0.log" Feb 26 18:26:44 crc kubenswrapper[5033]: I0226 18:26:44.161062 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-api-0_090a19bc-766f-43fb-9220-dfefb248ebf4/manila-api-log/0.log" Feb 26 18:26:44 crc kubenswrapper[5033]: I0226 18:26:44.190849 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_090a19bc-766f-43fb-9220-dfefb248ebf4/manila-api/0.log" Feb 26 18:26:44 crc kubenswrapper[5033]: I0226 18:26:44.336468 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd/probe/0.log" Feb 26 18:26:44 crc kubenswrapper[5033]: I0226 18:26:44.338209 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_8f3bffc6-9c7d-4606-92e3-bd83a55b2bfd/manila-scheduler/0.log" Feb 26 18:26:44 crc kubenswrapper[5033]: I0226 18:26:44.496317 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_96f8bb5c-3820-4d9f-856d-f777d2e44129/manila-share/0.log" Feb 26 18:26:44 crc kubenswrapper[5033]: I0226 18:26:44.706914 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_96f8bb5c-3820-4d9f-856d-f777d2e44129/probe/0.log" Feb 26 18:26:45 crc kubenswrapper[5033]: I0226 18:26:45.173798 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69cc55c897-gnbn9_d965aba7-c944-4339-a3af-b65a14d65f85/neutron-httpd/0.log" Feb 26 18:26:45 crc kubenswrapper[5033]: I0226 18:26:45.218624 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69cc55c897-gnbn9_d965aba7-c944-4339-a3af-b65a14d65f85/neutron-api/0.log" Feb 26 18:26:45 crc kubenswrapper[5033]: I0226 18:26:45.340001 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-b6psf_8f9e96ec-e2f3-4a0f-a9ce-cb27d56f2397/neutron-dhcp-openstack-openstack-cell1/0.log" Feb 26 18:26:45 crc kubenswrapper[5033]: I0226 18:26:45.554247 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-hw98g_2f817919-d860-441e-abc9-c3a428da253b/neutron-metadata-openstack-openstack-cell1/0.log" Feb 26 18:26:45 crc kubenswrapper[5033]: I0226 18:26:45.588771 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-6txrj_0aece9f3-88d3-458e-8587-fbb1090e0f29/neutron-sriov-openstack-openstack-cell1/0.log" Feb 26 18:26:45 crc kubenswrapper[5033]: I0226 18:26:45.947338 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_daa57611-c06c-499a-ac11-e1c080227fc9/nova-api-api/0.log" Feb 26 18:26:45 crc kubenswrapper[5033]: I0226 18:26:45.979243 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_daa57611-c06c-499a-ac11-e1c080227fc9/nova-api-log/0.log" Feb 26 18:26:46 crc kubenswrapper[5033]: I0226 18:26:46.118217 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b97279be-904e-4e29-86d1-42feab6dad6a/nova-cell0-conductor-conductor/0.log" Feb 26 18:26:46 crc kubenswrapper[5033]: I0226 18:26:46.304563 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b3ef55c3-0ccb-45e1-9c68-08fca3ff4f0e/nova-cell1-conductor-conductor/0.log" Feb 26 18:26:46 crc kubenswrapper[5033]: I0226 18:26:46.545928 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_56249f2d-f5a1-42a1-b994-1569b5c7b5cb/nova-cell1-novncproxy-novncproxy/0.log" Feb 26 18:26:46 crc kubenswrapper[5033]: I0226 18:26:46.560874 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellkq8p4_d7dc2fb3-80be-4fc0-afe0-25f298176e0e/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Feb 26 18:26:46 crc kubenswrapper[5033]: I0226 18:26:46.790924 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-vvrps_abd39c5a-ea6d-4d76-8370-d3179f6ae5c7/nova-cell1-openstack-openstack-cell1/0.log" Feb 26 18:26:46 crc kubenswrapper[5033]: I0226 18:26:46.901579 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_27c6a87a-6888-4658-883f-bc4b1e2b469a/nova-metadata-log/0.log" Feb 26 18:26:46 crc kubenswrapper[5033]: I0226 18:26:46.941613 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_27c6a87a-6888-4658-883f-bc4b1e2b469a/nova-metadata-metadata/0.log" Feb 26 18:26:47 crc kubenswrapper[5033]: I0226 18:26:47.102178 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b2b76dc3-ff08-4cd1-8cde-aaa65f14c0ab/nova-scheduler-scheduler/0.log" Feb 26 18:26:47 crc kubenswrapper[5033]: I0226 18:26:47.206526 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-b4445f8b9-64k79_2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4/init/0.log" Feb 26 18:26:47 crc kubenswrapper[5033]: I0226 18:26:47.283234 5033 scope.go:117] "RemoveContainer" containerID="86289b13dfa2c54c269b09bb794a92cb25ca0f30fc62e4abd853b8786acffd8a" Feb 26 18:26:47 crc kubenswrapper[5033]: I0226 18:26:47.447662 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-b4445f8b9-64k79_2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4/init/0.log" Feb 26 18:26:47 crc kubenswrapper[5033]: I0226 18:26:47.507066 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-b4445f8b9-64k79_2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4/octavia-api-provider-agent/0.log" Feb 26 18:26:47 crc kubenswrapper[5033]: I0226 18:26:47.673081 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-zzzv9_d7ac0f34-8690-4df6-8960-60cd6a1abf7a/init/0.log" Feb 26 18:26:47 crc kubenswrapper[5033]: I0226 18:26:47.805245 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-api-b4445f8b9-64k79_2c61dbd1-14dc-435d-ba2b-4d5a46d18aa4/octavia-api/0.log" Feb 26 18:26:47 crc kubenswrapper[5033]: I0226 18:26:47.964498 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-zzzv9_d7ac0f34-8690-4df6-8960-60cd6a1abf7a/init/0.log" Feb 26 18:26:48 crc kubenswrapper[5033]: I0226 18:26:48.033618 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-zzzv9_d7ac0f34-8690-4df6-8960-60cd6a1abf7a/octavia-healthmanager/0.log" Feb 26 18:26:48 crc kubenswrapper[5033]: I0226 18:26:48.057190 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-77sps_45026070-8e05-46ff-8dcd-39009ddf82cd/init/0.log" Feb 26 18:26:48 crc kubenswrapper[5033]: I0226 18:26:48.327427 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-77sps_45026070-8e05-46ff-8dcd-39009ddf82cd/init/0.log" Feb 26 18:26:48 crc kubenswrapper[5033]: I0226 18:26:48.392124 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-678t2_86a0b82d-eb74-4b64-984e-2eb565b14a00/init/0.log" Feb 26 18:26:48 crc kubenswrapper[5033]: I0226 18:26:48.477248 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-77sps_45026070-8e05-46ff-8dcd-39009ddf82cd/octavia-housekeeping/0.log" Feb 26 18:26:48 crc kubenswrapper[5033]: I0226 18:26:48.850971 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-678t2_86a0b82d-eb74-4b64-984e-2eb565b14a00/init/0.log" Feb 26 18:26:48 crc kubenswrapper[5033]: I0226 18:26:48.858934 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-678t2_86a0b82d-eb74-4b64-984e-2eb565b14a00/octavia-amphora-httpd/0.log" Feb 26 18:26:48 crc kubenswrapper[5033]: I0226 18:26:48.972242 5033 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_octavia-rsyslog-x6rfk_15d82b70-9bd3-4ae4-a20c-248c1b50264b/init/0.log" Feb 26 18:26:49 crc kubenswrapper[5033]: I0226 18:26:49.312395 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-x6rfk_15d82b70-9bd3-4ae4-a20c-248c1b50264b/init/0.log" Feb 26 18:26:49 crc kubenswrapper[5033]: I0226 18:26:49.338645 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-x6rfk_15d82b70-9bd3-4ae4-a20c-248c1b50264b/octavia-rsyslog/0.log" Feb 26 18:26:49 crc kubenswrapper[5033]: I0226 18:26:49.413040 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-dpv6r_8064f2ba-5595-44a5-9287-c1fd5d02da0f/init/0.log" Feb 26 18:26:49 crc kubenswrapper[5033]: I0226 18:26:49.772270 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-dpv6r_8064f2ba-5595-44a5-9287-c1fd5d02da0f/init/0.log" Feb 26 18:26:49 crc kubenswrapper[5033]: I0226 18:26:49.873873 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a20a94a0-9589-4c0e-9305-72e94939e526/mysql-bootstrap/0.log" Feb 26 18:26:50 crc kubenswrapper[5033]: I0226 18:26:50.048104 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-dpv6r_8064f2ba-5595-44a5-9287-c1fd5d02da0f/octavia-worker/0.log" Feb 26 18:26:50 crc kubenswrapper[5033]: I0226 18:26:50.140437 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a20a94a0-9589-4c0e-9305-72e94939e526/galera/0.log" Feb 26 18:26:50 crc kubenswrapper[5033]: I0226 18:26:50.267855 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a20a94a0-9589-4c0e-9305-72e94939e526/mysql-bootstrap/0.log" Feb 26 18:26:50 crc kubenswrapper[5033]: I0226 18:26:50.305091 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_60fff19f-5516-4ccc-b6b1-a4c50d99696e/mysql-bootstrap/0.log" Feb 26 18:26:50 crc kubenswrapper[5033]: I0226 18:26:50.612830 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_60fff19f-5516-4ccc-b6b1-a4c50d99696e/galera/0.log" Feb 26 18:26:50 crc kubenswrapper[5033]: I0226 18:26:50.647802 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_60fff19f-5516-4ccc-b6b1-a4c50d99696e/mysql-bootstrap/0.log" Feb 26 18:26:50 crc kubenswrapper[5033]: I0226 18:26:50.712515 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a0a1aafd-6624-43ec-b133-425ec8e6309e/openstackclient/0.log" Feb 26 18:26:50 crc kubenswrapper[5033]: I0226 18:26:50.986127 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-d66r6_5367270d-f5c1-40a7-b55d-dd016e41b3b5/openstack-network-exporter/0.log" Feb 26 18:26:51 crc kubenswrapper[5033]: I0226 18:26:51.145357 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-kgzlq_fe83e329-75c3-41af-a979-dd9aa57873b1/ovn-controller/0.log" Feb 26 18:26:51 crc kubenswrapper[5033]: I0226 18:26:51.634473 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:26:51 crc kubenswrapper[5033]: E0226 18:26:51.634865 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:26:51 crc kubenswrapper[5033]: I0226 18:26:51.791781 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-tvbth_562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0/ovsdb-server-init/0.log" Feb 26 18:26:51 crc kubenswrapper[5033]: I0226 18:26:51.955695 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tvbth_562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0/ovs-vswitchd/0.log" Feb 26 18:26:51 crc kubenswrapper[5033]: I0226 18:26:51.973740 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tvbth_562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0/ovsdb-server-init/0.log" Feb 26 18:26:52 crc kubenswrapper[5033]: I0226 18:26:52.032289 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tvbth_562cf9d0-12f5-4b77-afd0-f1a5a7c6eda0/ovsdb-server/0.log" Feb 26 18:26:52 crc kubenswrapper[5033]: I0226 18:26:52.203326 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc/ovn-northd/0.log" Feb 26 18:26:52 crc kubenswrapper[5033]: I0226 18:26:52.243288 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fe9f0d4b-42d1-4093-b2bd-14ad1d2ac7fc/openstack-network-exporter/0.log" Feb 26 18:26:52 crc kubenswrapper[5033]: I0226 18:26:52.440200 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-js4r4_e2fc5aa3-dad1-4f82-92ad-daf02318ab73/ovn-openstack-openstack-cell1/0.log" Feb 26 18:26:52 crc kubenswrapper[5033]: I0226 18:26:52.504152 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_095f5cb5-16b7-44e2-924c-77c2730118f0/openstack-network-exporter/0.log" Feb 26 18:26:52 crc kubenswrapper[5033]: I0226 18:26:52.632953 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_095f5cb5-16b7-44e2-924c-77c2730118f0/ovsdbserver-nb/0.log" Feb 26 18:26:52 crc kubenswrapper[5033]: I0226 18:26:52.996894 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-1_b99b5db1-b403-4ae0-bd38-d8ce5b410ddc/openstack-network-exporter/0.log" Feb 26 18:26:53 crc kubenswrapper[5033]: I0226 18:26:53.069004 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_b99b5db1-b403-4ae0-bd38-d8ce5b410ddc/ovsdbserver-nb/0.log" Feb 26 18:26:53 crc kubenswrapper[5033]: I0226 18:26:53.720630 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_ec70e74e-82af-4e84-88c4-9436180cb8d2/ovsdbserver-nb/0.log" Feb 26 18:26:53 crc kubenswrapper[5033]: I0226 18:26:53.754774 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_ec70e74e-82af-4e84-88c4-9436180cb8d2/openstack-network-exporter/0.log" Feb 26 18:26:53 crc kubenswrapper[5033]: I0226 18:26:53.760779 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_01ac4961-e808-495c-a2d3-7b3cde3c725b/openstack-network-exporter/0.log" Feb 26 18:26:53 crc kubenswrapper[5033]: I0226 18:26:53.945592 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_01ac4961-e808-495c-a2d3-7b3cde3c725b/ovsdbserver-sb/0.log" Feb 26 18:26:53 crc kubenswrapper[5033]: I0226 18:26:53.960825 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_c76bfef9-84e4-470b-a497-9855ac4eec8b/openstack-network-exporter/0.log" Feb 26 18:26:53 crc kubenswrapper[5033]: I0226 18:26:53.992465 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_c76bfef9-84e4-470b-a497-9855ac4eec8b/ovsdbserver-sb/0.log" Feb 26 18:26:54 crc kubenswrapper[5033]: I0226 18:26:54.199359 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_e42ca2a0-6c30-408e-a49a-9f52d241b74a/ovsdbserver-sb/0.log" Feb 26 18:26:54 crc kubenswrapper[5033]: I0226 18:26:54.199741 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-2_e42ca2a0-6c30-408e-a49a-9f52d241b74a/openstack-network-exporter/0.log" Feb 26 18:26:54 crc kubenswrapper[5033]: I0226 18:26:54.355598 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7bcb013b-b51b-4de2-8fca-d83453bd4feb/memcached/0.log" Feb 26 18:26:54 crc kubenswrapper[5033]: I0226 18:26:54.487033 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-788bb6fb9b-t7f5z_962b99a2-813c-4ab4-95ed-1aae526764b1/placement-api/0.log" Feb 26 18:26:54 crc kubenswrapper[5033]: I0226 18:26:54.515929 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-788bb6fb9b-t7f5z_962b99a2-813c-4ab4-95ed-1aae526764b1/placement-log/0.log" Feb 26 18:26:54 crc kubenswrapper[5033]: I0226 18:26:54.592894 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cldgnb_cf05cfa2-5c69-440f-8d78-40000ee0b13a/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Feb 26 18:26:54 crc kubenswrapper[5033]: I0226 18:26:54.687190 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_07c079fd-cffe-457d-9416-4cb765fd19de/init-config-reloader/0.log" Feb 26 18:26:54 crc kubenswrapper[5033]: I0226 18:26:54.891320 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_07c079fd-cffe-457d-9416-4cb765fd19de/init-config-reloader/0.log" Feb 26 18:26:54 crc kubenswrapper[5033]: I0226 18:26:54.917655 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_07c079fd-cffe-457d-9416-4cb765fd19de/config-reloader/0.log" Feb 26 18:26:54 crc kubenswrapper[5033]: I0226 18:26:54.918035 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_07c079fd-cffe-457d-9416-4cb765fd19de/thanos-sidecar/0.log" Feb 26 18:26:54 
crc kubenswrapper[5033]: I0226 18:26:54.918753 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_07c079fd-cffe-457d-9416-4cb765fd19de/prometheus/0.log" Feb 26 18:26:55 crc kubenswrapper[5033]: I0226 18:26:55.125114 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e2616fe4-bf52-4cf1-a08e-86e821cf6935/setup-container/0.log" Feb 26 18:26:55 crc kubenswrapper[5033]: I0226 18:26:55.295288 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e2616fe4-bf52-4cf1-a08e-86e821cf6935/setup-container/0.log" Feb 26 18:26:55 crc kubenswrapper[5033]: I0226 18:26:55.310281 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e2616fe4-bf52-4cf1-a08e-86e821cf6935/rabbitmq/0.log" Feb 26 18:26:55 crc kubenswrapper[5033]: I0226 18:26:55.349377 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_69f6b213-b08e-463c-9ddd-5e8b46861c4b/setup-container/0.log" Feb 26 18:26:55 crc kubenswrapper[5033]: I0226 18:26:55.574205 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_69f6b213-b08e-463c-9ddd-5e8b46861c4b/setup-container/0.log" Feb 26 18:26:55 crc kubenswrapper[5033]: I0226 18:26:55.675831 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-58gxt_56c12e84-7770-493b-a30e-3fcaace66b8c/reboot-os-openstack-openstack-cell1/0.log" Feb 26 18:26:55 crc kubenswrapper[5033]: I0226 18:26:55.837187 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-wf2zl_40e5b6c1-de05-4f09-a9cb-6bda8c86a122/run-os-openstack-openstack-cell1/0.log" Feb 26 18:26:55 crc kubenswrapper[5033]: I0226 18:26:55.921319 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-openstack-prsc9_50c67862-887c-4627-b44b-e6e4c7b50401/ssh-known-hosts-openstack/0.log" Feb 26 18:26:56 crc kubenswrapper[5033]: I0226 18:26:56.168138 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-rswv4_cc11177a-f224-4e9d-9b44-92dd6741bb1c/telemetry-openstack-openstack-cell1/0.log" Feb 26 18:26:56 crc kubenswrapper[5033]: I0226 18:26:56.254732 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_69f6b213-b08e-463c-9ddd-5e8b46861c4b/rabbitmq/0.log" Feb 26 18:26:56 crc kubenswrapper[5033]: I0226 18:26:56.318601 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-kfcz8_0418ee62-c3ed-4b01-9801-705d116841ae/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Feb 26 18:26:56 crc kubenswrapper[5033]: I0226 18:26:56.638665 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-mm9vg_32f43285-acd0-474c-920f-04f59f060fb7/validate-network-openstack-openstack-cell1/0.log" Feb 26 18:27:05 crc kubenswrapper[5033]: I0226 18:27:05.629195 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:27:05 crc kubenswrapper[5033]: E0226 18:27:05.630615 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:27:19 crc kubenswrapper[5033]: I0226 18:27:19.628386 5033 scope.go:117] "RemoveContainer" 
containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:27:19 crc kubenswrapper[5033]: E0226 18:27:19.629178 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:27:20 crc kubenswrapper[5033]: I0226 18:27:20.246226 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-55cc45767f-2szs7_965ea850-eadc-4a58-ba58-65acacf603e8/manager/0.log" Feb 26 18:27:20 crc kubenswrapper[5033]: I0226 18:27:20.650272 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j_e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba/util/0.log" Feb 26 18:27:20 crc kubenswrapper[5033]: I0226 18:27:20.773625 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j_e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba/util/0.log" Feb 26 18:27:21 crc kubenswrapper[5033]: I0226 18:27:21.309889 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-c4b7d6946-bnptj_8611f841-ae12-4c99-95cf-06d75c839484/manager/0.log" Feb 26 18:27:21 crc kubenswrapper[5033]: I0226 18:27:21.344897 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j_e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba/pull/0.log" Feb 26 18:27:21 crc kubenswrapper[5033]: I0226 18:27:21.395658 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j_e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba/pull/0.log" Feb 26 18:27:21 crc kubenswrapper[5033]: I0226 18:27:21.650322 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j_e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba/pull/0.log" Feb 26 18:27:21 crc kubenswrapper[5033]: I0226 18:27:21.655291 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j_e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba/extract/0.log" Feb 26 18:27:21 crc kubenswrapper[5033]: I0226 18:27:21.699783 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6r5z5j_e3bab0f1-9a79-4f6c-9f7c-6c1874f591ba/util/0.log" Feb 26 18:27:22 crc kubenswrapper[5033]: I0226 18:27:22.152244 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7f748f8b74-nrh9v_61f234b9-a904-466e-b680-f8d6924c8240/manager/0.log" Feb 26 18:27:22 crc kubenswrapper[5033]: I0226 18:27:22.195614 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-9595d6797-p7fv4_968ae160-ba03-4539-849f-6aca2d464181/manager/0.log" Feb 26 18:27:22 crc kubenswrapper[5033]: I0226 18:27:22.462035 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54fb488b88-5rdb7_aeec1478-5706-42da-9d49-ee5bd5b24cbc/manager/0.log" Feb 26 18:27:22 crc kubenswrapper[5033]: I0226 18:27:22.752024 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6494cdbf8f-m75x9_49b7ffc2-b28c-451d-9fff-d7af6078a893/manager/0.log" Feb 26 18:27:23 crc kubenswrapper[5033]: I0226 
18:27:23.358721 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-66d6b5f488-rhf9p_acfa5be3-b477-460f-8835-7f86492eed44/manager/0.log" Feb 26 18:27:23 crc kubenswrapper[5033]: I0226 18:27:23.701497 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c78d668d5-dzbd8_264eff6e-56a5-4f67-9b00-17c3cd1ab65d/manager/0.log" Feb 26 18:27:23 crc kubenswrapper[5033]: I0226 18:27:23.740465 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-76fd76856-flt5k_4cf1cb70-e865-474b-a84d-245c92a4ccf3/manager/0.log" Feb 26 18:27:23 crc kubenswrapper[5033]: I0226 18:27:23.872368 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-768c8b45bb-x2r76_bae783d2-7df3-44de-a5e9-ccee0c3ae193/manager/0.log" Feb 26 18:27:24 crc kubenswrapper[5033]: I0226 18:27:24.319011 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6dc9b6ff89-z22f2_d8530a52-aa3c-4911-8339-af07cfeb2e52/manager/0.log" Feb 26 18:27:24 crc kubenswrapper[5033]: I0226 18:27:24.363243 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54967dbbdf-rlpf8_ad93658a-8b9d-4518-a840-03622dd46767/manager/0.log" Feb 26 18:27:24 crc kubenswrapper[5033]: I0226 18:27:24.761793 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d56fd956f-mdclf_1d9ae0bd-78ce-4d9f-8a00-0da4d84fdb94/manager/0.log" Feb 26 18:27:24 crc kubenswrapper[5033]: I0226 18:27:24.839450 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-s9bq7_9ea581b5-9d90-4f86-8d9d-3c7ac0a1ae59/manager/0.log" Feb 26 18:27:25 crc 
kubenswrapper[5033]: I0226 18:27:25.066859 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-77b8b67585-q6sd9_66a23e95-112a-43d6-8e64-d4073f1b6489/manager/0.log" Feb 26 18:27:25 crc kubenswrapper[5033]: I0226 18:27:25.163014 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-66fc5dfc5b-8k2ns_f15028fb-d8cb-4af3-954a-5b361e84f246/operator/0.log" Feb 26 18:27:25 crc kubenswrapper[5033]: I0226 18:27:25.330488 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xws47_3c0d3fc4-f017-41be-aac9-f570ece2a54a/registry-server/0.log" Feb 26 18:27:25 crc kubenswrapper[5033]: I0226 18:27:25.674351 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57bd55f9b7-24q9k_c61b96c6-4a34-407e-9c0b-a76783ffb3ed/manager/0.log" Feb 26 18:27:25 crc kubenswrapper[5033]: I0226 18:27:25.707610 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-85c99d655-chlrt_f0fe9b18-a0d3-47ec-ab43-73412831eac1/manager/0.log" Feb 26 18:27:25 crc kubenswrapper[5033]: I0226 18:27:25.764086 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-n7drp_67c3e0a3-0b3c-446e-bd46-07b501b58fa8/operator/0.log" Feb 26 18:27:25 crc kubenswrapper[5033]: I0226 18:27:25.962712 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-79558bbfbf-28bzq_157b2409-f9a3-4e35-b087-4b54c945bb14/manager/0.log" Feb 26 18:27:26 crc kubenswrapper[5033]: I0226 18:27:26.215719 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-s7gzx_63abbf2b-ad9d-458c-9a6e-b31988780065/manager/0.log" Feb 26 18:27:26 crc 
kubenswrapper[5033]: I0226 18:27:26.392999 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-56dc67d744-mwxv6_9c031846-df0c-47c8-9fad-a566b6db896d/manager/0.log" Feb 26 18:27:26 crc kubenswrapper[5033]: I0226 18:27:26.460138 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76bcb69745-2xc62_67342993-d4b9-4e5a-9777-8dc7a6ee4c0d/manager/0.log" Feb 26 18:27:27 crc kubenswrapper[5033]: I0226 18:27:27.804431 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5c89c59655-hhkt9_68273151-991f-4e1c-bf48-93b1d8b3973f/manager/0.log" Feb 26 18:27:31 crc kubenswrapper[5033]: I0226 18:27:31.630065 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:27:31 crc kubenswrapper[5033]: E0226 18:27:31.631307 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:27:46 crc kubenswrapper[5033]: I0226 18:27:46.627852 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:27:46 crc kubenswrapper[5033]: E0226 18:27:46.628815 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:27:49 crc kubenswrapper[5033]: I0226 18:27:49.392317 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xtmb5_4b423453-fa78-454a-83c7-934c0fce8992/control-plane-machine-set-operator/0.log" Feb 26 18:27:49 crc kubenswrapper[5033]: I0226 18:27:49.753903 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cnhpr_9807d6e2-9450-4da2-a3f5-aea1a3b70cb8/kube-rbac-proxy/0.log" Feb 26 18:27:49 crc kubenswrapper[5033]: I0226 18:27:49.755576 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cnhpr_9807d6e2-9450-4da2-a3f5-aea1a3b70cb8/machine-api-operator/0.log" Feb 26 18:27:57 crc kubenswrapper[5033]: I0226 18:27:57.628245 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:27:57 crc kubenswrapper[5033]: E0226 18:27:57.629064 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.144996 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535508-bv7qc"] Feb 26 18:28:00 crc kubenswrapper[5033]: E0226 18:28:00.146022 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc8a4fa-1728-4825-87e5-424c0fa32e51" containerName="oc" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.146037 5033 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc8a4fa-1728-4825-87e5-424c0fa32e51" containerName="oc" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.146331 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc8a4fa-1728-4825-87e5-424c0fa32e51" containerName="oc" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.147052 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535508-bv7qc" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.151064 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.151829 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.155685 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.156761 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535508-bv7qc"] Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.212150 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxmw6\" (UniqueName: \"kubernetes.io/projected/f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa-kube-api-access-jxmw6\") pod \"auto-csr-approver-29535508-bv7qc\" (UID: \"f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa\") " pod="openshift-infra/auto-csr-approver-29535508-bv7qc" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.313236 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxmw6\" (UniqueName: \"kubernetes.io/projected/f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa-kube-api-access-jxmw6\") pod \"auto-csr-approver-29535508-bv7qc\" (UID: \"f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa\") " 
pod="openshift-infra/auto-csr-approver-29535508-bv7qc" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.344855 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxmw6\" (UniqueName: \"kubernetes.io/projected/f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa-kube-api-access-jxmw6\") pod \"auto-csr-approver-29535508-bv7qc\" (UID: \"f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa\") " pod="openshift-infra/auto-csr-approver-29535508-bv7qc" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.469768 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535508-bv7qc" Feb 26 18:28:00 crc kubenswrapper[5033]: I0226 18:28:00.970932 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535508-bv7qc"] Feb 26 18:28:01 crc kubenswrapper[5033]: I0226 18:28:01.505760 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535508-bv7qc" event={"ID":"f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa","Type":"ContainerStarted","Data":"be7b1eed535a66fa5584f9a2d4eb93a868ad87cd71fb6978fe50de7010bbfe39"} Feb 26 18:28:03 crc kubenswrapper[5033]: E0226 18:28:03.258487 5033 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e4aa7f_4a0e_47b6_8a96_9f38cbffc8fa.slice/crio-91f7110e941e74cd7a0fc04545239df308942d242c5571ecfdac36bed3295ec7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e4aa7f_4a0e_47b6_8a96_9f38cbffc8fa.slice/crio-conmon-91f7110e941e74cd7a0fc04545239df308942d242c5571ecfdac36bed3295ec7.scope\": RecentStats: unable to find data in memory cache]" Feb 26 18:28:03 crc kubenswrapper[5033]: I0226 18:28:03.539777 5033 generic.go:334] "Generic (PLEG): container finished" podID="f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa" 
containerID="91f7110e941e74cd7a0fc04545239df308942d242c5571ecfdac36bed3295ec7" exitCode=0 Feb 26 18:28:03 crc kubenswrapper[5033]: I0226 18:28:03.539831 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535508-bv7qc" event={"ID":"f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa","Type":"ContainerDied","Data":"91f7110e941e74cd7a0fc04545239df308942d242c5571ecfdac36bed3295ec7"} Feb 26 18:28:03 crc kubenswrapper[5033]: I0226 18:28:03.999273 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-glxpk_3a36f0b4-122a-453f-aa82-c5b8ca6862fa/cert-manager-controller/0.log" Feb 26 18:28:04 crc kubenswrapper[5033]: I0226 18:28:04.197663 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-n5ljg_8f3ed32f-586d-46b9-a6e3-5c61abde9a03/cert-manager-cainjector/0.log" Feb 26 18:28:04 crc kubenswrapper[5033]: I0226 18:28:04.233940 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-5vkhb_24dc0187-e3d9-41af-950e-49c544beeed6/cert-manager-webhook/0.log" Feb 26 18:28:04 crc kubenswrapper[5033]: I0226 18:28:04.961999 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535508-bv7qc" Feb 26 18:28:05 crc kubenswrapper[5033]: I0226 18:28:05.109283 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxmw6\" (UniqueName: \"kubernetes.io/projected/f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa-kube-api-access-jxmw6\") pod \"f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa\" (UID: \"f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa\") " Feb 26 18:28:05 crc kubenswrapper[5033]: I0226 18:28:05.116413 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa-kube-api-access-jxmw6" (OuterVolumeSpecName: "kube-api-access-jxmw6") pod "f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa" (UID: "f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa"). InnerVolumeSpecName "kube-api-access-jxmw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:28:05 crc kubenswrapper[5033]: I0226 18:28:05.211930 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxmw6\" (UniqueName: \"kubernetes.io/projected/f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa-kube-api-access-jxmw6\") on node \"crc\" DevicePath \"\"" Feb 26 18:28:05 crc kubenswrapper[5033]: I0226 18:28:05.559328 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535508-bv7qc" event={"ID":"f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa","Type":"ContainerDied","Data":"be7b1eed535a66fa5584f9a2d4eb93a868ad87cd71fb6978fe50de7010bbfe39"} Feb 26 18:28:05 crc kubenswrapper[5033]: I0226 18:28:05.559652 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be7b1eed535a66fa5584f9a2d4eb93a868ad87cd71fb6978fe50de7010bbfe39" Feb 26 18:28:05 crc kubenswrapper[5033]: I0226 18:28:05.559375 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535508-bv7qc" Feb 26 18:28:06 crc kubenswrapper[5033]: I0226 18:28:06.053423 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535502-btmvd"] Feb 26 18:28:06 crc kubenswrapper[5033]: I0226 18:28:06.062846 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535502-btmvd"] Feb 26 18:28:07 crc kubenswrapper[5033]: I0226 18:28:07.639329 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e2c15a-87d3-4dbe-b758-e1dbe86f656e" path="/var/lib/kubelet/pods/d3e2c15a-87d3-4dbe-b758-e1dbe86f656e/volumes" Feb 26 18:28:09 crc kubenswrapper[5033]: I0226 18:28:09.627813 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:28:09 crc kubenswrapper[5033]: E0226 18:28:09.628523 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:28:18 crc kubenswrapper[5033]: I0226 18:28:18.523178 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-fhq9x_5ed0fdc7-c54e-4cc9-bb77-249396ebea11/nmstate-console-plugin/0.log" Feb 26 18:28:18 crc kubenswrapper[5033]: I0226 18:28:18.759127 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sxftv_d8a7b715-5bb0-4163-ab06-3aeff0badbd6/nmstate-handler/0.log" Feb 26 18:28:18 crc kubenswrapper[5033]: I0226 18:28:18.866722 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-pntj4_46ef3bc4-83e8-4ec0-a5ba-af3f2a567726/nmstate-metrics/0.log" Feb 26 18:28:18 crc kubenswrapper[5033]: I0226 18:28:18.911423 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-pntj4_46ef3bc4-83e8-4ec0-a5ba-af3f2a567726/kube-rbac-proxy/0.log" Feb 26 18:28:19 crc kubenswrapper[5033]: I0226 18:28:19.040920 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-px2gk_c31544e8-ba06-46d8-880f-cf5fdd7eb899/nmstate-operator/0.log" Feb 26 18:28:19 crc kubenswrapper[5033]: I0226 18:28:19.183536 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-2hrjl_5dadc038-c65b-4800-9ca4-e3a879aeae2d/nmstate-webhook/0.log" Feb 26 18:28:21 crc kubenswrapper[5033]: I0226 18:28:21.628119 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:28:21 crc kubenswrapper[5033]: E0226 18:28:21.628900 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:28:33 crc kubenswrapper[5033]: I0226 18:28:33.639292 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:28:33 crc kubenswrapper[5033]: E0226 18:28:33.641389 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:28:35 crc kubenswrapper[5033]: I0226 18:28:35.587397 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-7nhbd_675d6580-5f3e-46b9-a1e9-83127d5f1948/prometheus-operator/0.log" Feb 26 18:28:35 crc kubenswrapper[5033]: I0226 18:28:35.802958 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls_a23d8887-760f-43dc-83f4-7d0eddea6db1/prometheus-operator-admission-webhook/0.log" Feb 26 18:28:35 crc kubenswrapper[5033]: I0226 18:28:35.901805 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg_f5a524af-f845-4af4-9a29-1b1bd1d7a57e/prometheus-operator-admission-webhook/0.log" Feb 26 18:28:36 crc kubenswrapper[5033]: I0226 18:28:36.049171 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-cjxmz_ed6d813c-516f-4d4d-b20b-6534359e0220/operator/0.log" Feb 26 18:28:36 crc kubenswrapper[5033]: I0226 18:28:36.149724 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-c9bgz_ff5f88ba-1664-44b4-b1da-84fe25772c98/perses-operator/0.log" Feb 26 18:28:47 crc kubenswrapper[5033]: I0226 18:28:47.405000 5033 scope.go:117] "RemoveContainer" containerID="adfdbc1554750f10ee824f085ea21c06819ea2fc88f9b05d896bacaeefc16130" Feb 26 18:28:47 crc kubenswrapper[5033]: I0226 18:28:47.628521 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:28:47 crc kubenswrapper[5033]: E0226 18:28:47.628912 5033 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:28:51 crc kubenswrapper[5033]: I0226 18:28:51.553028 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-l876q_0ace028d-9434-478e-b6f9-2302d3506e3d/kube-rbac-proxy/0.log" Feb 26 18:28:51 crc kubenswrapper[5033]: I0226 18:28:51.828743 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-frr-files/0.log" Feb 26 18:28:51 crc kubenswrapper[5033]: I0226 18:28:51.968559 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-l876q_0ace028d-9434-478e-b6f9-2302d3506e3d/controller/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.095236 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-reloader/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.118965 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-metrics/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.121742 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-frr-files/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.175277 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-reloader/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.367470 5033 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-frr-files/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.370018 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-metrics/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.386239 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-metrics/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.406180 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-reloader/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.583753 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-reloader/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.602055 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/controller/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.608952 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-metrics/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.643107 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/cp-frr-files/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.797408 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/kube-rbac-proxy/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.811889 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/frr-metrics/0.log" Feb 26 18:28:52 crc kubenswrapper[5033]: I0226 18:28:52.859109 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/kube-rbac-proxy-frr/0.log" Feb 26 18:28:53 crc kubenswrapper[5033]: I0226 18:28:53.042205 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/reloader/0.log" Feb 26 18:28:53 crc kubenswrapper[5033]: I0226 18:28:53.798043 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-ps55f_0d49f5f7-f9f1-48cb-a9e2-7e5ecdaf8055/frr-k8s-webhook-server/0.log" Feb 26 18:28:53 crc kubenswrapper[5033]: I0226 18:28:53.856234 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7c4f4fbd6-mcbpn_41f5e1df-b55f-4861-a814-a111201ae275/manager/0.log" Feb 26 18:28:54 crc kubenswrapper[5033]: I0226 18:28:54.106193 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-55cd77d8fc-wn4jv_d1a05743-e247-444a-bbda-5f411c391cde/webhook-server/0.log" Feb 26 18:28:54 crc kubenswrapper[5033]: I0226 18:28:54.318448 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5tc8n_8dbfe85d-1a34-476d-8f00-c02b6b1aeea7/kube-rbac-proxy/0.log" Feb 26 18:28:55 crc kubenswrapper[5033]: I0226 18:28:55.221556 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5tc8n_8dbfe85d-1a34-476d-8f00-c02b6b1aeea7/speaker/0.log" Feb 26 18:28:56 crc kubenswrapper[5033]: I0226 18:28:56.676504 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bz2qd_94d772d9-7831-40d7-9298-8fe371818c29/frr/0.log" Feb 26 18:28:58 crc kubenswrapper[5033]: I0226 18:28:58.628080 5033 scope.go:117] 
"RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:28:58 crc kubenswrapper[5033]: E0226 18:28:58.628732 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.240236 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7bzhv"] Feb 26 18:29:03 crc kubenswrapper[5033]: E0226 18:29:03.241192 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa" containerName="oc" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.241208 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa" containerName="oc" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.241708 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa" containerName="oc" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.243600 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.257715 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bzhv"] Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.335514 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59nqj\" (UniqueName: \"kubernetes.io/projected/785aa2eb-71a2-451b-b228-d3f018ec15ec-kube-api-access-59nqj\") pod \"redhat-marketplace-7bzhv\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.335576 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-catalog-content\") pod \"redhat-marketplace-7bzhv\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.335732 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-utilities\") pod \"redhat-marketplace-7bzhv\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.437519 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59nqj\" (UniqueName: \"kubernetes.io/projected/785aa2eb-71a2-451b-b228-d3f018ec15ec-kube-api-access-59nqj\") pod \"redhat-marketplace-7bzhv\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.437573 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-catalog-content\") pod \"redhat-marketplace-7bzhv\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.437675 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-utilities\") pod \"redhat-marketplace-7bzhv\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.438185 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-utilities\") pod \"redhat-marketplace-7bzhv\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.438192 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-catalog-content\") pod \"redhat-marketplace-7bzhv\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.461485 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59nqj\" (UniqueName: \"kubernetes.io/projected/785aa2eb-71a2-451b-b228-d3f018ec15ec-kube-api-access-59nqj\") pod \"redhat-marketplace-7bzhv\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:03 crc kubenswrapper[5033]: I0226 18:29:03.594146 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:04 crc kubenswrapper[5033]: I0226 18:29:04.067968 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bzhv"] Feb 26 18:29:04 crc kubenswrapper[5033]: I0226 18:29:04.129218 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bzhv" event={"ID":"785aa2eb-71a2-451b-b228-d3f018ec15ec","Type":"ContainerStarted","Data":"9c47cfc38fe04f3116f6be9fc3e51a14d2956f15a028b52d693eeaf168208d35"} Feb 26 18:29:05 crc kubenswrapper[5033]: I0226 18:29:05.139736 5033 generic.go:334] "Generic (PLEG): container finished" podID="785aa2eb-71a2-451b-b228-d3f018ec15ec" containerID="2a1b0f521422b19f5010bd15e5a988ddcaab3dcfb56d176bb1dab1022288de35" exitCode=0 Feb 26 18:29:05 crc kubenswrapper[5033]: I0226 18:29:05.139831 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bzhv" event={"ID":"785aa2eb-71a2-451b-b228-d3f018ec15ec","Type":"ContainerDied","Data":"2a1b0f521422b19f5010bd15e5a988ddcaab3dcfb56d176bb1dab1022288de35"} Feb 26 18:29:05 crc kubenswrapper[5033]: I0226 18:29:05.142113 5033 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 18:29:07 crc kubenswrapper[5033]: I0226 18:29:07.158591 5033 generic.go:334] "Generic (PLEG): container finished" podID="785aa2eb-71a2-451b-b228-d3f018ec15ec" containerID="771b0c4bd5ca5cbdfab7651854f0699c52da5545f3368526468ac6d91b76fd3e" exitCode=0 Feb 26 18:29:07 crc kubenswrapper[5033]: I0226 18:29:07.158693 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bzhv" event={"ID":"785aa2eb-71a2-451b-b228-d3f018ec15ec","Type":"ContainerDied","Data":"771b0c4bd5ca5cbdfab7651854f0699c52da5545f3368526468ac6d91b76fd3e"} Feb 26 18:29:07 crc kubenswrapper[5033]: I0226 18:29:07.673369 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk_fed827dc-359a-4184-bd34-e6c223ab63e2/util/0.log" Feb 26 18:29:07 crc kubenswrapper[5033]: I0226 18:29:07.867051 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk_fed827dc-359a-4184-bd34-e6c223ab63e2/pull/0.log" Feb 26 18:29:07 crc kubenswrapper[5033]: I0226 18:29:07.884628 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk_fed827dc-359a-4184-bd34-e6c223ab63e2/pull/0.log" Feb 26 18:29:08 crc kubenswrapper[5033]: I0226 18:29:08.057394 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk_fed827dc-359a-4184-bd34-e6c223ab63e2/extract/0.log" Feb 26 18:29:08 crc kubenswrapper[5033]: I0226 18:29:08.171990 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bzhv" event={"ID":"785aa2eb-71a2-451b-b228-d3f018ec15ec","Type":"ContainerStarted","Data":"e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a"} Feb 26 18:29:08 crc kubenswrapper[5033]: I0226 18:29:08.202936 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7bzhv" podStartSLOduration=2.479136779 podStartE2EDuration="5.202908333s" podCreationTimestamp="2026-02-26 18:29:03 +0000 UTC" firstStartedPulling="2026-02-26 18:29:05.14190292 +0000 UTC m=+10072.354501570" lastFinishedPulling="2026-02-26 18:29:07.865674474 +0000 UTC m=+10075.078273124" observedRunningTime="2026-02-26 18:29:08.19038887 +0000 UTC m=+10075.402987530" watchObservedRunningTime="2026-02-26 18:29:08.202908333 +0000 UTC m=+10075.415506983" Feb 26 18:29:08 crc kubenswrapper[5033]: I0226 18:29:08.372275 5033 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk_fed827dc-359a-4184-bd34-e6c223ab63e2/util/0.log" Feb 26 18:29:08 crc kubenswrapper[5033]: I0226 18:29:08.380366 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk_fed827dc-359a-4184-bd34-e6c223ab63e2/util/0.log" Feb 26 18:29:08 crc kubenswrapper[5033]: I0226 18:29:08.466902 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822r4tk_fed827dc-359a-4184-bd34-e6c223ab63e2/pull/0.log" Feb 26 18:29:08 crc kubenswrapper[5033]: I0226 18:29:08.529429 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp_19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3/util/0.log" Feb 26 18:29:08 crc kubenswrapper[5033]: I0226 18:29:08.748237 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp_19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3/util/0.log" Feb 26 18:29:08 crc kubenswrapper[5033]: I0226 18:29:08.780859 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp_19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3/pull/0.log" Feb 26 18:29:08 crc kubenswrapper[5033]: I0226 18:29:08.832957 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp_19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3/pull/0.log" Feb 26 18:29:09 crc kubenswrapper[5033]: I0226 18:29:09.120484 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp_19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3/pull/0.log" 
Feb 26 18:29:09 crc kubenswrapper[5033]: I0226 18:29:09.142242 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp_19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3/util/0.log" Feb 26 18:29:09 crc kubenswrapper[5033]: I0226 18:29:09.216089 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59wmqp_19f363c4-0c5d-4e7a-9532-41e9ae4c3fa3/extract/0.log" Feb 26 18:29:09 crc kubenswrapper[5033]: I0226 18:29:09.361483 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6_18742a04-1627-4e97-b9ae-81b4f5457d05/util/0.log" Feb 26 18:29:09 crc kubenswrapper[5033]: I0226 18:29:09.584641 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6_18742a04-1627-4e97-b9ae-81b4f5457d05/pull/0.log" Feb 26 18:29:09 crc kubenswrapper[5033]: I0226 18:29:09.616146 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6_18742a04-1627-4e97-b9ae-81b4f5457d05/util/0.log" Feb 26 18:29:09 crc kubenswrapper[5033]: I0226 18:29:09.662983 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6_18742a04-1627-4e97-b9ae-81b4f5457d05/pull/0.log" Feb 26 18:29:09 crc kubenswrapper[5033]: I0226 18:29:09.922067 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6_18742a04-1627-4e97-b9ae-81b4f5457d05/extract/0.log" Feb 26 18:29:09 crc kubenswrapper[5033]: I0226 18:29:09.937922 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6_18742a04-1627-4e97-b9ae-81b4f5457d05/util/0.log" Feb 26 18:29:09 crc kubenswrapper[5033]: I0226 18:29:09.972128 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082z8v6_18742a04-1627-4e97-b9ae-81b4f5457d05/pull/0.log" Feb 26 18:29:10 crc kubenswrapper[5033]: I0226 18:29:10.157099 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rthm4_99b57b7d-f577-4511-99db-fb69eab2bfdc/extract-utilities/0.log" Feb 26 18:29:10 crc kubenswrapper[5033]: I0226 18:29:10.371746 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rthm4_99b57b7d-f577-4511-99db-fb69eab2bfdc/extract-utilities/0.log" Feb 26 18:29:10 crc kubenswrapper[5033]: I0226 18:29:10.388389 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rthm4_99b57b7d-f577-4511-99db-fb69eab2bfdc/extract-content/0.log" Feb 26 18:29:10 crc kubenswrapper[5033]: I0226 18:29:10.411822 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rthm4_99b57b7d-f577-4511-99db-fb69eab2bfdc/extract-content/0.log" Feb 26 18:29:10 crc kubenswrapper[5033]: I0226 18:29:10.542036 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rthm4_99b57b7d-f577-4511-99db-fb69eab2bfdc/extract-utilities/0.log" Feb 26 18:29:10 crc kubenswrapper[5033]: I0226 18:29:10.625469 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rthm4_99b57b7d-f577-4511-99db-fb69eab2bfdc/extract-content/0.log" Feb 26 18:29:10 crc kubenswrapper[5033]: I0226 18:29:10.798222 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-mwf6x_1799f267-5d16-49d2-81d6-1cf0836f0776/extract-utilities/0.log" Feb 26 18:29:11 crc kubenswrapper[5033]: I0226 18:29:11.141713 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mwf6x_1799f267-5d16-49d2-81d6-1cf0836f0776/extract-content/0.log" Feb 26 18:29:11 crc kubenswrapper[5033]: I0226 18:29:11.160826 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mwf6x_1799f267-5d16-49d2-81d6-1cf0836f0776/extract-utilities/0.log" Feb 26 18:29:11 crc kubenswrapper[5033]: I0226 18:29:11.228684 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mwf6x_1799f267-5d16-49d2-81d6-1cf0836f0776/extract-content/0.log" Feb 26 18:29:11 crc kubenswrapper[5033]: I0226 18:29:11.497558 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mwf6x_1799f267-5d16-49d2-81d6-1cf0836f0776/extract-utilities/0.log" Feb 26 18:29:11 crc kubenswrapper[5033]: I0226 18:29:11.773905 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mwf6x_1799f267-5d16-49d2-81d6-1cf0836f0776/extract-content/0.log" Feb 26 18:29:12 crc kubenswrapper[5033]: I0226 18:29:12.012610 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv_08c209a8-a926-4b83-a443-6959c51f2d6c/util/0.log" Feb 26 18:29:12 crc kubenswrapper[5033]: I0226 18:29:12.180896 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv_08c209a8-a926-4b83-a443-6959c51f2d6c/util/0.log" Feb 26 18:29:12 crc kubenswrapper[5033]: I0226 18:29:12.225051 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv_08c209a8-a926-4b83-a443-6959c51f2d6c/pull/0.log" Feb 26 18:29:12 crc kubenswrapper[5033]: I0226 18:29:12.438430 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rthm4_99b57b7d-f577-4511-99db-fb69eab2bfdc/registry-server/0.log" Feb 26 18:29:12 crc kubenswrapper[5033]: I0226 18:29:12.450521 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv_08c209a8-a926-4b83-a443-6959c51f2d6c/pull/0.log" Feb 26 18:29:12 crc kubenswrapper[5033]: I0226 18:29:12.572801 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv_08c209a8-a926-4b83-a443-6959c51f2d6c/util/0.log" Feb 26 18:29:12 crc kubenswrapper[5033]: I0226 18:29:12.647506 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv_08c209a8-a926-4b83-a443-6959c51f2d6c/extract/0.log" Feb 26 18:29:12 crc kubenswrapper[5033]: I0226 18:29:12.700835 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42ztxv_08c209a8-a926-4b83-a443-6959c51f2d6c/pull/0.log" Feb 26 18:29:12 crc kubenswrapper[5033]: I0226 18:29:12.815827 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-dsvmf_3c5e786b-b322-4756-9ad5-678d16b4b9f6/marketplace-operator/0.log" Feb 26 18:29:12 crc kubenswrapper[5033]: I0226 18:29:12.884427 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mwf6x_1799f267-5d16-49d2-81d6-1cf0836f0776/registry-server/0.log" Feb 26 18:29:12 crc kubenswrapper[5033]: I0226 18:29:12.913501 5033 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7bzhv_785aa2eb-71a2-451b-b228-d3f018ec15ec/extract-utilities/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.091753 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7bzhv_785aa2eb-71a2-451b-b228-d3f018ec15ec/extract-utilities/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.106555 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7bzhv_785aa2eb-71a2-451b-b228-d3f018ec15ec/extract-content/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.152375 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7bzhv_785aa2eb-71a2-451b-b228-d3f018ec15ec/extract-content/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.278783 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7bzhv_785aa2eb-71a2-451b-b228-d3f018ec15ec/extract-utilities/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.287563 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7bzhv_785aa2eb-71a2-451b-b228-d3f018ec15ec/extract-content/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.289441 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7bzhv_785aa2eb-71a2-451b-b228-d3f018ec15ec/registry-server/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.380032 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9b5td_215f71fa-45e7-4a6b-9401-60e194203f62/extract-utilities/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.549324 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-9b5td_215f71fa-45e7-4a6b-9401-60e194203f62/extract-utilities/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.573163 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9b5td_215f71fa-45e7-4a6b-9401-60e194203f62/extract-content/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.580590 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9b5td_215f71fa-45e7-4a6b-9401-60e194203f62/extract-content/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.611372 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.611444 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.642055 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:29:13 crc kubenswrapper[5033]: E0226 18:29:13.642367 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.670116 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.746367 5033 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-9b5td_215f71fa-45e7-4a6b-9401-60e194203f62/extract-content/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.775791 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9b5td_215f71fa-45e7-4a6b-9401-60e194203f62/extract-utilities/0.log" Feb 26 18:29:13 crc kubenswrapper[5033]: I0226 18:29:13.835424 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pscms_4ac064e0-a007-41dd-9783-f7be886e19a0/extract-utilities/0.log" Feb 26 18:29:14 crc kubenswrapper[5033]: I0226 18:29:14.075327 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pscms_4ac064e0-a007-41dd-9783-f7be886e19a0/extract-utilities/0.log" Feb 26 18:29:14 crc kubenswrapper[5033]: I0226 18:29:14.085477 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pscms_4ac064e0-a007-41dd-9783-f7be886e19a0/extract-content/0.log" Feb 26 18:29:14 crc kubenswrapper[5033]: I0226 18:29:14.108221 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pscms_4ac064e0-a007-41dd-9783-f7be886e19a0/extract-content/0.log" Feb 26 18:29:14 crc kubenswrapper[5033]: I0226 18:29:14.204279 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9b5td_215f71fa-45e7-4a6b-9401-60e194203f62/registry-server/0.log" Feb 26 18:29:14 crc kubenswrapper[5033]: I0226 18:29:14.271799 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pscms_4ac064e0-a007-41dd-9783-f7be886e19a0/extract-utilities/0.log" Feb 26 18:29:14 crc kubenswrapper[5033]: I0226 18:29:14.278893 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:14 crc kubenswrapper[5033]: I0226 
18:29:14.313464 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pscms_4ac064e0-a007-41dd-9783-f7be886e19a0/extract-content/0.log" Feb 26 18:29:14 crc kubenswrapper[5033]: I0226 18:29:14.342306 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bzhv"] Feb 26 18:29:14 crc kubenswrapper[5033]: I0226 18:29:14.682589 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-pscms_4ac064e0-a007-41dd-9783-f7be886e19a0/registry-server/0.log" Feb 26 18:29:16 crc kubenswrapper[5033]: I0226 18:29:16.245235 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7bzhv" podUID="785aa2eb-71a2-451b-b228-d3f018ec15ec" containerName="registry-server" containerID="cri-o://e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a" gracePeriod=2 Feb 26 18:29:16 crc kubenswrapper[5033]: I0226 18:29:16.757526 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:16 crc kubenswrapper[5033]: I0226 18:29:16.828700 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-catalog-content\") pod \"785aa2eb-71a2-451b-b228-d3f018ec15ec\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " Feb 26 18:29:16 crc kubenswrapper[5033]: I0226 18:29:16.829116 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-utilities\") pod \"785aa2eb-71a2-451b-b228-d3f018ec15ec\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " Feb 26 18:29:16 crc kubenswrapper[5033]: I0226 18:29:16.829175 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59nqj\" (UniqueName: \"kubernetes.io/projected/785aa2eb-71a2-451b-b228-d3f018ec15ec-kube-api-access-59nqj\") pod \"785aa2eb-71a2-451b-b228-d3f018ec15ec\" (UID: \"785aa2eb-71a2-451b-b228-d3f018ec15ec\") " Feb 26 18:29:16 crc kubenswrapper[5033]: I0226 18:29:16.829831 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-utilities" (OuterVolumeSpecName: "utilities") pod "785aa2eb-71a2-451b-b228-d3f018ec15ec" (UID: "785aa2eb-71a2-451b-b228-d3f018ec15ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:29:16 crc kubenswrapper[5033]: I0226 18:29:16.840712 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/785aa2eb-71a2-451b-b228-d3f018ec15ec-kube-api-access-59nqj" (OuterVolumeSpecName: "kube-api-access-59nqj") pod "785aa2eb-71a2-451b-b228-d3f018ec15ec" (UID: "785aa2eb-71a2-451b-b228-d3f018ec15ec"). InnerVolumeSpecName "kube-api-access-59nqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:29:16 crc kubenswrapper[5033]: I0226 18:29:16.851376 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "785aa2eb-71a2-451b-b228-d3f018ec15ec" (UID: "785aa2eb-71a2-451b-b228-d3f018ec15ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:29:16 crc kubenswrapper[5033]: I0226 18:29:16.933079 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 18:29:16 crc kubenswrapper[5033]: I0226 18:29:16.933125 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785aa2eb-71a2-451b-b228-d3f018ec15ec-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 18:29:16 crc kubenswrapper[5033]: I0226 18:29:16.933139 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59nqj\" (UniqueName: \"kubernetes.io/projected/785aa2eb-71a2-451b-b228-d3f018ec15ec-kube-api-access-59nqj\") on node \"crc\" DevicePath \"\"" Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.257449 5033 generic.go:334] "Generic (PLEG): container finished" podID="785aa2eb-71a2-451b-b228-d3f018ec15ec" containerID="e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a" exitCode=0 Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.257547 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7bzhv" Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.257607 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bzhv" event={"ID":"785aa2eb-71a2-451b-b228-d3f018ec15ec","Type":"ContainerDied","Data":"e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a"} Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.257719 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7bzhv" event={"ID":"785aa2eb-71a2-451b-b228-d3f018ec15ec","Type":"ContainerDied","Data":"9c47cfc38fe04f3116f6be9fc3e51a14d2956f15a028b52d693eeaf168208d35"} Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.257756 5033 scope.go:117] "RemoveContainer" containerID="e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a" Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.288531 5033 scope.go:117] "RemoveContainer" containerID="771b0c4bd5ca5cbdfab7651854f0699c52da5545f3368526468ac6d91b76fd3e" Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.296012 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bzhv"] Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.306692 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7bzhv"] Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.322230 5033 scope.go:117] "RemoveContainer" containerID="2a1b0f521422b19f5010bd15e5a988ddcaab3dcfb56d176bb1dab1022288de35" Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.367664 5033 scope.go:117] "RemoveContainer" containerID="e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a" Feb 26 18:29:17 crc kubenswrapper[5033]: E0226 18:29:17.368317 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a\": container with ID starting with e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a not found: ID does not exist" containerID="e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a" Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.368440 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a"} err="failed to get container status \"e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a\": rpc error: code = NotFound desc = could not find container \"e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a\": container with ID starting with e389f64f7aaca19bd268289e97bd69b14941e6f6d2f4be54b4deecdf86ea452a not found: ID does not exist" Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.368549 5033 scope.go:117] "RemoveContainer" containerID="771b0c4bd5ca5cbdfab7651854f0699c52da5545f3368526468ac6d91b76fd3e" Feb 26 18:29:17 crc kubenswrapper[5033]: E0226 18:29:17.369112 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"771b0c4bd5ca5cbdfab7651854f0699c52da5545f3368526468ac6d91b76fd3e\": container with ID starting with 771b0c4bd5ca5cbdfab7651854f0699c52da5545f3368526468ac6d91b76fd3e not found: ID does not exist" containerID="771b0c4bd5ca5cbdfab7651854f0699c52da5545f3368526468ac6d91b76fd3e" Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.369180 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"771b0c4bd5ca5cbdfab7651854f0699c52da5545f3368526468ac6d91b76fd3e"} err="failed to get container status \"771b0c4bd5ca5cbdfab7651854f0699c52da5545f3368526468ac6d91b76fd3e\": rpc error: code = NotFound desc = could not find container \"771b0c4bd5ca5cbdfab7651854f0699c52da5545f3368526468ac6d91b76fd3e\": container with ID 
starting with 771b0c4bd5ca5cbdfab7651854f0699c52da5545f3368526468ac6d91b76fd3e not found: ID does not exist" Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.369212 5033 scope.go:117] "RemoveContainer" containerID="2a1b0f521422b19f5010bd15e5a988ddcaab3dcfb56d176bb1dab1022288de35" Feb 26 18:29:17 crc kubenswrapper[5033]: E0226 18:29:17.369679 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a1b0f521422b19f5010bd15e5a988ddcaab3dcfb56d176bb1dab1022288de35\": container with ID starting with 2a1b0f521422b19f5010bd15e5a988ddcaab3dcfb56d176bb1dab1022288de35 not found: ID does not exist" containerID="2a1b0f521422b19f5010bd15e5a988ddcaab3dcfb56d176bb1dab1022288de35" Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.369701 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a1b0f521422b19f5010bd15e5a988ddcaab3dcfb56d176bb1dab1022288de35"} err="failed to get container status \"2a1b0f521422b19f5010bd15e5a988ddcaab3dcfb56d176bb1dab1022288de35\": rpc error: code = NotFound desc = could not find container \"2a1b0f521422b19f5010bd15e5a988ddcaab3dcfb56d176bb1dab1022288de35\": container with ID starting with 2a1b0f521422b19f5010bd15e5a988ddcaab3dcfb56d176bb1dab1022288de35 not found: ID does not exist" Feb 26 18:29:17 crc kubenswrapper[5033]: I0226 18:29:17.643656 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="785aa2eb-71a2-451b-b228-d3f018ec15ec" path="/var/lib/kubelet/pods/785aa2eb-71a2-451b-b228-d3f018ec15ec/volumes" Feb 26 18:29:25 crc kubenswrapper[5033]: I0226 18:29:25.628048 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:29:25 crc kubenswrapper[5033]: E0226 18:29:25.630355 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:29:26 crc kubenswrapper[5033]: I0226 18:29:26.961147 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-7nhbd_675d6580-5f3e-46b9-a1e9-83127d5f1948/prometheus-operator/0.log" Feb 26 18:29:27 crc kubenswrapper[5033]: I0226 18:29:27.010723 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86b77b5f5d-sw9jg_f5a524af-f845-4af4-9a29-1b1bd1d7a57e/prometheus-operator-admission-webhook/0.log" Feb 26 18:29:27 crc kubenswrapper[5033]: I0226 18:29:27.070635 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-86b77b5f5d-mntls_a23d8887-760f-43dc-83f4-7d0eddea6db1/prometheus-operator-admission-webhook/0.log" Feb 26 18:29:27 crc kubenswrapper[5033]: I0226 18:29:27.150519 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-cjxmz_ed6d813c-516f-4d4d-b20b-6534359e0220/operator/0.log" Feb 26 18:29:27 crc kubenswrapper[5033]: I0226 18:29:27.212816 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-c9bgz_ff5f88ba-1664-44b4-b1da-84fe25772c98/perses-operator/0.log" Feb 26 18:29:38 crc kubenswrapper[5033]: I0226 18:29:38.629099 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:29:38 crc kubenswrapper[5033]: E0226 18:29:38.630565 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:29:52 crc kubenswrapper[5033]: I0226 18:29:52.629244 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:29:52 crc kubenswrapper[5033]: E0226 18:29:52.630622 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.144993 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535510-lvb57"] Feb 26 18:30:00 crc kubenswrapper[5033]: E0226 18:30:00.146439 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785aa2eb-71a2-451b-b228-d3f018ec15ec" containerName="registry-server" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.146464 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="785aa2eb-71a2-451b-b228-d3f018ec15ec" containerName="registry-server" Feb 26 18:30:00 crc kubenswrapper[5033]: E0226 18:30:00.146507 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785aa2eb-71a2-451b-b228-d3f018ec15ec" containerName="extract-content" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.146521 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="785aa2eb-71a2-451b-b228-d3f018ec15ec" containerName="extract-content" Feb 26 18:30:00 crc kubenswrapper[5033]: E0226 18:30:00.146538 5033 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="785aa2eb-71a2-451b-b228-d3f018ec15ec" containerName="extract-utilities" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.146548 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="785aa2eb-71a2-451b-b228-d3f018ec15ec" containerName="extract-utilities" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.146903 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="785aa2eb-71a2-451b-b228-d3f018ec15ec" containerName="registry-server" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.148401 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535510-lvb57" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.151757 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.152100 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.152278 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.160457 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz"] Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.162153 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.164376 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.164701 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.172593 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535510-lvb57"] Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.185501 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz"] Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.286447 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc2b52f8-41ec-4999-8290-39df708dd8f9-secret-volume\") pod \"collect-profiles-29535510-24zmz\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.286696 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7mxh\" (UniqueName: \"kubernetes.io/projected/bc2b52f8-41ec-4999-8290-39df708dd8f9-kube-api-access-g7mxh\") pod \"collect-profiles-29535510-24zmz\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.286733 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k87gl\" (UniqueName: 
\"kubernetes.io/projected/2fbb55b2-b048-4f09-9976-d8fe4a5efc72-kube-api-access-k87gl\") pod \"auto-csr-approver-29535510-lvb57\" (UID: \"2fbb55b2-b048-4f09-9976-d8fe4a5efc72\") " pod="openshift-infra/auto-csr-approver-29535510-lvb57" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.286808 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc2b52f8-41ec-4999-8290-39df708dd8f9-config-volume\") pod \"collect-profiles-29535510-24zmz\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.389086 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7mxh\" (UniqueName: \"kubernetes.io/projected/bc2b52f8-41ec-4999-8290-39df708dd8f9-kube-api-access-g7mxh\") pod \"collect-profiles-29535510-24zmz\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.389149 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k87gl\" (UniqueName: \"kubernetes.io/projected/2fbb55b2-b048-4f09-9976-d8fe4a5efc72-kube-api-access-k87gl\") pod \"auto-csr-approver-29535510-lvb57\" (UID: \"2fbb55b2-b048-4f09-9976-d8fe4a5efc72\") " pod="openshift-infra/auto-csr-approver-29535510-lvb57" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.389210 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc2b52f8-41ec-4999-8290-39df708dd8f9-config-volume\") pod \"collect-profiles-29535510-24zmz\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 
18:30:00.389285 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc2b52f8-41ec-4999-8290-39df708dd8f9-secret-volume\") pod \"collect-profiles-29535510-24zmz\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.391080 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc2b52f8-41ec-4999-8290-39df708dd8f9-config-volume\") pod \"collect-profiles-29535510-24zmz\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.395876 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc2b52f8-41ec-4999-8290-39df708dd8f9-secret-volume\") pod \"collect-profiles-29535510-24zmz\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.407082 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k87gl\" (UniqueName: \"kubernetes.io/projected/2fbb55b2-b048-4f09-9976-d8fe4a5efc72-kube-api-access-k87gl\") pod \"auto-csr-approver-29535510-lvb57\" (UID: \"2fbb55b2-b048-4f09-9976-d8fe4a5efc72\") " pod="openshift-infra/auto-csr-approver-29535510-lvb57" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.407833 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7mxh\" (UniqueName: \"kubernetes.io/projected/bc2b52f8-41ec-4999-8290-39df708dd8f9-kube-api-access-g7mxh\") pod \"collect-profiles-29535510-24zmz\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.467867 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535510-lvb57" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.488019 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:00 crc kubenswrapper[5033]: I0226 18:30:00.946708 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535510-lvb57"] Feb 26 18:30:01 crc kubenswrapper[5033]: I0226 18:30:01.050662 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz"] Feb 26 18:30:01 crc kubenswrapper[5033]: I0226 18:30:01.708076 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" event={"ID":"bc2b52f8-41ec-4999-8290-39df708dd8f9","Type":"ContainerStarted","Data":"b9804b6d10bbbb842315883d691dbb01d1f4b6688f40b26626a249fcda5735ae"} Feb 26 18:30:01 crc kubenswrapper[5033]: I0226 18:30:01.708119 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" event={"ID":"bc2b52f8-41ec-4999-8290-39df708dd8f9","Type":"ContainerStarted","Data":"f024d4f155c2e9873dc8043494aafedfedad82bb7774bb2f769f89de81fabb5a"} Feb 26 18:30:01 crc kubenswrapper[5033]: I0226 18:30:01.709684 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535510-lvb57" event={"ID":"2fbb55b2-b048-4f09-9976-d8fe4a5efc72","Type":"ContainerStarted","Data":"3d2b8bba3c4c8b54b22225a91b9447697642224faa1221cbc3ad49b6a23c661b"} Feb 26 18:30:01 crc kubenswrapper[5033]: I0226 18:30:01.733021 5033 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" podStartSLOduration=1.733002412 podStartE2EDuration="1.733002412s" podCreationTimestamp="2026-02-26 18:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 18:30:01.724900516 +0000 UTC m=+10128.937499166" watchObservedRunningTime="2026-02-26 18:30:01.733002412 +0000 UTC m=+10128.945601052" Feb 26 18:30:02 crc kubenswrapper[5033]: I0226 18:30:02.724661 5033 generic.go:334] "Generic (PLEG): container finished" podID="bc2b52f8-41ec-4999-8290-39df708dd8f9" containerID="b9804b6d10bbbb842315883d691dbb01d1f4b6688f40b26626a249fcda5735ae" exitCode=0 Feb 26 18:30:02 crc kubenswrapper[5033]: I0226 18:30:02.724762 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" event={"ID":"bc2b52f8-41ec-4999-8290-39df708dd8f9","Type":"ContainerDied","Data":"b9804b6d10bbbb842315883d691dbb01d1f4b6688f40b26626a249fcda5735ae"} Feb 26 18:30:03 crc kubenswrapper[5033]: I0226 18:30:03.641930 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:30:03 crc kubenswrapper[5033]: E0226 18:30:03.642550 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.149704 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.280634 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc2b52f8-41ec-4999-8290-39df708dd8f9-config-volume\") pod \"bc2b52f8-41ec-4999-8290-39df708dd8f9\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.281018 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7mxh\" (UniqueName: \"kubernetes.io/projected/bc2b52f8-41ec-4999-8290-39df708dd8f9-kube-api-access-g7mxh\") pod \"bc2b52f8-41ec-4999-8290-39df708dd8f9\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.281232 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc2b52f8-41ec-4999-8290-39df708dd8f9-secret-volume\") pod \"bc2b52f8-41ec-4999-8290-39df708dd8f9\" (UID: \"bc2b52f8-41ec-4999-8290-39df708dd8f9\") " Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.283890 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc2b52f8-41ec-4999-8290-39df708dd8f9-config-volume" (OuterVolumeSpecName: "config-volume") pod "bc2b52f8-41ec-4999-8290-39df708dd8f9" (UID: "bc2b52f8-41ec-4999-8290-39df708dd8f9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.288735 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc2b52f8-41ec-4999-8290-39df708dd8f9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bc2b52f8-41ec-4999-8290-39df708dd8f9" (UID: "bc2b52f8-41ec-4999-8290-39df708dd8f9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.290463 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2b52f8-41ec-4999-8290-39df708dd8f9-kube-api-access-g7mxh" (OuterVolumeSpecName: "kube-api-access-g7mxh") pod "bc2b52f8-41ec-4999-8290-39df708dd8f9" (UID: "bc2b52f8-41ec-4999-8290-39df708dd8f9"). InnerVolumeSpecName "kube-api-access-g7mxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.383804 5033 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc2b52f8-41ec-4999-8290-39df708dd8f9-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.383836 5033 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc2b52f8-41ec-4999-8290-39df708dd8f9-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.383848 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7mxh\" (UniqueName: \"kubernetes.io/projected/bc2b52f8-41ec-4999-8290-39df708dd8f9-kube-api-access-g7mxh\") on node \"crc\" DevicePath \"\"" Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.747522 5033 generic.go:334] "Generic (PLEG): container finished" podID="2fbb55b2-b048-4f09-9976-d8fe4a5efc72" containerID="b5aefec407f82998a61f8ac93357756cd51b8769e4b049da13e5280545dcb096" exitCode=0 Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.747835 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535510-lvb57" event={"ID":"2fbb55b2-b048-4f09-9976-d8fe4a5efc72","Type":"ContainerDied","Data":"b5aefec407f82998a61f8ac93357756cd51b8769e4b049da13e5280545dcb096"} Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.752127 5033 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" event={"ID":"bc2b52f8-41ec-4999-8290-39df708dd8f9","Type":"ContainerDied","Data":"f024d4f155c2e9873dc8043494aafedfedad82bb7774bb2f769f89de81fabb5a"} Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.752193 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f024d4f155c2e9873dc8043494aafedfedad82bb7774bb2f769f89de81fabb5a" Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.752291 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535510-24zmz" Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.824894 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"] Feb 26 18:30:04 crc kubenswrapper[5033]: I0226 18:30:04.834166 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535465-kxmgz"] Feb 26 18:30:05 crc kubenswrapper[5033]: I0226 18:30:05.668471 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc187ed7-3e79-423b-82e8-74d5c0bbfaf7" path="/var/lib/kubelet/pods/fc187ed7-3e79-423b-82e8-74d5c0bbfaf7/volumes" Feb 26 18:30:06 crc kubenswrapper[5033]: I0226 18:30:06.150203 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535510-lvb57" Feb 26 18:30:06 crc kubenswrapper[5033]: I0226 18:30:06.228501 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k87gl\" (UniqueName: \"kubernetes.io/projected/2fbb55b2-b048-4f09-9976-d8fe4a5efc72-kube-api-access-k87gl\") pod \"2fbb55b2-b048-4f09-9976-d8fe4a5efc72\" (UID: \"2fbb55b2-b048-4f09-9976-d8fe4a5efc72\") " Feb 26 18:30:06 crc kubenswrapper[5033]: I0226 18:30:06.235548 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbb55b2-b048-4f09-9976-d8fe4a5efc72-kube-api-access-k87gl" (OuterVolumeSpecName: "kube-api-access-k87gl") pod "2fbb55b2-b048-4f09-9976-d8fe4a5efc72" (UID: "2fbb55b2-b048-4f09-9976-d8fe4a5efc72"). InnerVolumeSpecName "kube-api-access-k87gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:30:06 crc kubenswrapper[5033]: I0226 18:30:06.331232 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k87gl\" (UniqueName: \"kubernetes.io/projected/2fbb55b2-b048-4f09-9976-d8fe4a5efc72-kube-api-access-k87gl\") on node \"crc\" DevicePath \"\"" Feb 26 18:30:06 crc kubenswrapper[5033]: I0226 18:30:06.777433 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535510-lvb57" event={"ID":"2fbb55b2-b048-4f09-9976-d8fe4a5efc72","Type":"ContainerDied","Data":"3d2b8bba3c4c8b54b22225a91b9447697642224faa1221cbc3ad49b6a23c661b"} Feb 26 18:30:06 crc kubenswrapper[5033]: I0226 18:30:06.778035 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d2b8bba3c4c8b54b22225a91b9447697642224faa1221cbc3ad49b6a23c661b" Feb 26 18:30:06 crc kubenswrapper[5033]: I0226 18:30:06.778152 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535510-lvb57" Feb 26 18:30:07 crc kubenswrapper[5033]: I0226 18:30:07.213059 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535504-9t6gr"] Feb 26 18:30:07 crc kubenswrapper[5033]: I0226 18:30:07.225935 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535504-9t6gr"] Feb 26 18:30:07 crc kubenswrapper[5033]: I0226 18:30:07.646799 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec0ceed-1997-4fe5-966a-d42618095b51" path="/var/lib/kubelet/pods/0ec0ceed-1997-4fe5-966a-d42618095b51/volumes" Feb 26 18:30:14 crc kubenswrapper[5033]: I0226 18:30:14.628344 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:30:14 crc kubenswrapper[5033]: E0226 18:30:14.629112 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:30:29 crc kubenswrapper[5033]: I0226 18:30:29.633714 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:30:29 crc kubenswrapper[5033]: E0226 18:30:29.634635 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" 
podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.434824 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9bh27"] Feb 26 18:30:32 crc kubenswrapper[5033]: E0226 18:30:32.436190 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbb55b2-b048-4f09-9976-d8fe4a5efc72" containerName="oc" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.436206 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbb55b2-b048-4f09-9976-d8fe4a5efc72" containerName="oc" Feb 26 18:30:32 crc kubenswrapper[5033]: E0226 18:30:32.436271 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2b52f8-41ec-4999-8290-39df708dd8f9" containerName="collect-profiles" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.436279 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2b52f8-41ec-4999-8290-39df708dd8f9" containerName="collect-profiles" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.437241 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2b52f8-41ec-4999-8290-39df708dd8f9" containerName="collect-profiles" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.437300 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fbb55b2-b048-4f09-9976-d8fe4a5efc72" containerName="oc" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.439499 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.454975 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9bh27"] Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.519825 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-utilities\") pod \"community-operators-9bh27\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.519922 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-catalog-content\") pod \"community-operators-9bh27\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.520131 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk584\" (UniqueName: \"kubernetes.io/projected/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-kube-api-access-nk584\") pod \"community-operators-9bh27\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.623021 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-utilities\") pod \"community-operators-9bh27\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.623785 5033 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-utilities\") pod \"community-operators-9bh27\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.624002 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-catalog-content\") pod \"community-operators-9bh27\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.624141 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk584\" (UniqueName: \"kubernetes.io/projected/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-kube-api-access-nk584\") pod \"community-operators-9bh27\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.624412 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-catalog-content\") pod \"community-operators-9bh27\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.655751 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk584\" (UniqueName: \"kubernetes.io/projected/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-kube-api-access-nk584\") pod \"community-operators-9bh27\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:32 crc kubenswrapper[5033]: I0226 18:30:32.772753 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:33 crc kubenswrapper[5033]: I0226 18:30:33.333359 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9bh27"] Feb 26 18:30:34 crc kubenswrapper[5033]: I0226 18:30:34.109490 5033 generic.go:334] "Generic (PLEG): container finished" podID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" containerID="02dd8393de1be2f162f4af8707ffcf6706c72c845eae812b48b3a09a65a70eba" exitCode=0 Feb 26 18:30:34 crc kubenswrapper[5033]: I0226 18:30:34.109543 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bh27" event={"ID":"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1","Type":"ContainerDied","Data":"02dd8393de1be2f162f4af8707ffcf6706c72c845eae812b48b3a09a65a70eba"} Feb 26 18:30:34 crc kubenswrapper[5033]: I0226 18:30:34.109842 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bh27" event={"ID":"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1","Type":"ContainerStarted","Data":"51a6a0f1da32903f3ca24a1abfc8731d9667ea4c15e77c735b6b04b029e8cbbd"} Feb 26 18:30:36 crc kubenswrapper[5033]: I0226 18:30:36.135175 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bh27" event={"ID":"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1","Type":"ContainerStarted","Data":"4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c"} Feb 26 18:30:37 crc kubenswrapper[5033]: I0226 18:30:37.148633 5033 generic.go:334] "Generic (PLEG): container finished" podID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" containerID="4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c" exitCode=0 Feb 26 18:30:37 crc kubenswrapper[5033]: I0226 18:30:37.149774 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bh27" 
event={"ID":"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1","Type":"ContainerDied","Data":"4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c"} Feb 26 18:30:38 crc kubenswrapper[5033]: I0226 18:30:38.160212 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bh27" event={"ID":"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1","Type":"ContainerStarted","Data":"ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a"} Feb 26 18:30:38 crc kubenswrapper[5033]: I0226 18:30:38.184550 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9bh27" podStartSLOduration=2.730687825 podStartE2EDuration="6.184528256s" podCreationTimestamp="2026-02-26 18:30:32 +0000 UTC" firstStartedPulling="2026-02-26 18:30:34.111397683 +0000 UTC m=+10161.323996333" lastFinishedPulling="2026-02-26 18:30:37.565238114 +0000 UTC m=+10164.777836764" observedRunningTime="2026-02-26 18:30:38.178387738 +0000 UTC m=+10165.390986388" watchObservedRunningTime="2026-02-26 18:30:38.184528256 +0000 UTC m=+10165.397126906" Feb 26 18:30:42 crc kubenswrapper[5033]: I0226 18:30:42.629127 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:30:42 crc kubenswrapper[5033]: E0226 18:30:42.630423 5033 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxss5_openshift-machine-config-operator(3b30da39-cf26-480f-82fc-dbb869e9acbd)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" Feb 26 18:30:42 crc kubenswrapper[5033]: I0226 18:30:42.773630 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:42 crc 
kubenswrapper[5033]: I0226 18:30:42.774876 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:42 crc kubenswrapper[5033]: I0226 18:30:42.864356 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:43 crc kubenswrapper[5033]: I0226 18:30:43.257929 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:43 crc kubenswrapper[5033]: I0226 18:30:43.318567 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9bh27"] Feb 26 18:30:45 crc kubenswrapper[5033]: I0226 18:30:45.243848 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9bh27" podUID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" containerName="registry-server" containerID="cri-o://ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a" gracePeriod=2 Feb 26 18:30:45 crc kubenswrapper[5033]: I0226 18:30:45.770185 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:45 crc kubenswrapper[5033]: I0226 18:30:45.930438 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk584\" (UniqueName: \"kubernetes.io/projected/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-kube-api-access-nk584\") pod \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " Feb 26 18:30:45 crc kubenswrapper[5033]: I0226 18:30:45.930551 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-utilities\") pod \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " Feb 26 18:30:45 crc kubenswrapper[5033]: I0226 18:30:45.930581 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-catalog-content\") pod \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\" (UID: \"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1\") " Feb 26 18:30:45 crc kubenswrapper[5033]: I0226 18:30:45.933362 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-utilities" (OuterVolumeSpecName: "utilities") pod "72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" (UID: "72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:30:45 crc kubenswrapper[5033]: I0226 18:30:45.937396 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-kube-api-access-nk584" (OuterVolumeSpecName: "kube-api-access-nk584") pod "72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" (UID: "72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1"). InnerVolumeSpecName "kube-api-access-nk584". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.007209 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" (UID: "72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.034245 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk584\" (UniqueName: \"kubernetes.io/projected/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-kube-api-access-nk584\") on node \"crc\" DevicePath \"\"" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.034363 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.034379 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.261738 5033 generic.go:334] "Generic (PLEG): container finished" podID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" containerID="ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a" exitCode=0 Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.261789 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9bh27" event={"ID":"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1","Type":"ContainerDied","Data":"ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a"} Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.261819 5033 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-9bh27" event={"ID":"72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1","Type":"ContainerDied","Data":"51a6a0f1da32903f3ca24a1abfc8731d9667ea4c15e77c735b6b04b029e8cbbd"} Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.261840 5033 scope.go:117] "RemoveContainer" containerID="ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.262000 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9bh27" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.287989 5033 scope.go:117] "RemoveContainer" containerID="4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.318827 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9bh27"] Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.331169 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9bh27"] Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.338634 5033 scope.go:117] "RemoveContainer" containerID="02dd8393de1be2f162f4af8707ffcf6706c72c845eae812b48b3a09a65a70eba" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.365610 5033 scope.go:117] "RemoveContainer" containerID="ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a" Feb 26 18:30:46 crc kubenswrapper[5033]: E0226 18:30:46.366739 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a\": container with ID starting with ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a not found: ID does not exist" containerID="ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 
18:30:46.366777 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a"} err="failed to get container status \"ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a\": rpc error: code = NotFound desc = could not find container \"ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a\": container with ID starting with ce70c88d7c77448aafb188eb71b2f07ac9a3f638c9c50c1d4b4e803be6f3049a not found: ID does not exist" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.366804 5033 scope.go:117] "RemoveContainer" containerID="4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c" Feb 26 18:30:46 crc kubenswrapper[5033]: E0226 18:30:46.367531 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c\": container with ID starting with 4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c not found: ID does not exist" containerID="4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.367576 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c"} err="failed to get container status \"4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c\": rpc error: code = NotFound desc = could not find container \"4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c\": container with ID starting with 4a17bb67762a60d470d79b3fb0c69ac086f36f75e402638e4540414ab208943c not found: ID does not exist" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.367606 5033 scope.go:117] "RemoveContainer" containerID="02dd8393de1be2f162f4af8707ffcf6706c72c845eae812b48b3a09a65a70eba" Feb 26 18:30:46 crc 
kubenswrapper[5033]: E0226 18:30:46.367917 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02dd8393de1be2f162f4af8707ffcf6706c72c845eae812b48b3a09a65a70eba\": container with ID starting with 02dd8393de1be2f162f4af8707ffcf6706c72c845eae812b48b3a09a65a70eba not found: ID does not exist" containerID="02dd8393de1be2f162f4af8707ffcf6706c72c845eae812b48b3a09a65a70eba" Feb 26 18:30:46 crc kubenswrapper[5033]: I0226 18:30:46.367951 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02dd8393de1be2f162f4af8707ffcf6706c72c845eae812b48b3a09a65a70eba"} err="failed to get container status \"02dd8393de1be2f162f4af8707ffcf6706c72c845eae812b48b3a09a65a70eba\": rpc error: code = NotFound desc = could not find container \"02dd8393de1be2f162f4af8707ffcf6706c72c845eae812b48b3a09a65a70eba\": container with ID starting with 02dd8393de1be2f162f4af8707ffcf6706c72c845eae812b48b3a09a65a70eba not found: ID does not exist" Feb 26 18:30:47 crc kubenswrapper[5033]: I0226 18:30:47.538200 5033 scope.go:117] "RemoveContainer" containerID="d2b35cba59bd5b843b78f28109a7311b9c11896921b945149169576b74de946c" Feb 26 18:30:47 crc kubenswrapper[5033]: I0226 18:30:47.648342 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" path="/var/lib/kubelet/pods/72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1/volumes" Feb 26 18:30:48 crc kubenswrapper[5033]: I0226 18:30:48.215191 5033 scope.go:117] "RemoveContainer" containerID="df9d1653d33e93662ec55fe9c6cbdebf148ca2fd8a3c96da4a75f84683f45b3c" Feb 26 18:30:55 crc kubenswrapper[5033]: I0226 18:30:55.628728 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f" Feb 26 18:30:56 crc kubenswrapper[5033]: I0226 18:30:56.385496 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"0f12fead1855a694b2a5bc3fc00756962d95e4674bee7f9adb994587bd54369c"} Feb 26 18:31:43 crc kubenswrapper[5033]: I0226 18:31:43.890967 5033 generic.go:334] "Generic (PLEG): container finished" podID="715a121f-d869-4599-8718-2c62220d5e2b" containerID="a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9" exitCode=0 Feb 26 18:31:43 crc kubenswrapper[5033]: I0226 18:31:43.891139 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5586k/must-gather-hm6vr" event={"ID":"715a121f-d869-4599-8718-2c62220d5e2b","Type":"ContainerDied","Data":"a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9"} Feb 26 18:31:43 crc kubenswrapper[5033]: I0226 18:31:43.892275 5033 scope.go:117] "RemoveContainer" containerID="a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9" Feb 26 18:31:44 crc kubenswrapper[5033]: I0226 18:31:44.000513 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5586k_must-gather-hm6vr_715a121f-d869-4599-8718-2c62220d5e2b/gather/0.log" Feb 26 18:31:52 crc kubenswrapper[5033]: I0226 18:31:52.344388 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5586k/must-gather-hm6vr"] Feb 26 18:31:52 crc kubenswrapper[5033]: I0226 18:31:52.345228 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5586k/must-gather-hm6vr" podUID="715a121f-d869-4599-8718-2c62220d5e2b" containerName="copy" containerID="cri-o://ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f" gracePeriod=2 Feb 26 18:31:52 crc kubenswrapper[5033]: I0226 18:31:52.357063 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5586k/must-gather-hm6vr"] Feb 26 18:31:52 crc kubenswrapper[5033]: I0226 18:31:52.905121 5033 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-must-gather-5586k_must-gather-hm6vr_715a121f-d869-4599-8718-2c62220d5e2b/copy/0.log" Feb 26 18:31:52 crc kubenswrapper[5033]: I0226 18:31:52.908686 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5586k/must-gather-hm6vr" Feb 26 18:31:52 crc kubenswrapper[5033]: I0226 18:31:52.986663 5033 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5586k_must-gather-hm6vr_715a121f-d869-4599-8718-2c62220d5e2b/copy/0.log" Feb 26 18:31:52 crc kubenswrapper[5033]: I0226 18:31:52.987489 5033 generic.go:334] "Generic (PLEG): container finished" podID="715a121f-d869-4599-8718-2c62220d5e2b" containerID="ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f" exitCode=143 Feb 26 18:31:52 crc kubenswrapper[5033]: I0226 18:31:52.987556 5033 scope.go:117] "RemoveContainer" containerID="ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f" Feb 26 18:31:52 crc kubenswrapper[5033]: I0226 18:31:52.987573 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5586k/must-gather-hm6vr" Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.016637 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/715a121f-d869-4599-8718-2c62220d5e2b-must-gather-output\") pod \"715a121f-d869-4599-8718-2c62220d5e2b\" (UID: \"715a121f-d869-4599-8718-2c62220d5e2b\") " Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.016901 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s25j\" (UniqueName: \"kubernetes.io/projected/715a121f-d869-4599-8718-2c62220d5e2b-kube-api-access-6s25j\") pod \"715a121f-d869-4599-8718-2c62220d5e2b\" (UID: \"715a121f-d869-4599-8718-2c62220d5e2b\") " Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.019109 5033 scope.go:117] "RemoveContainer" containerID="a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9" Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.025224 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715a121f-d869-4599-8718-2c62220d5e2b-kube-api-access-6s25j" (OuterVolumeSpecName: "kube-api-access-6s25j") pod "715a121f-d869-4599-8718-2c62220d5e2b" (UID: "715a121f-d869-4599-8718-2c62220d5e2b"). InnerVolumeSpecName "kube-api-access-6s25j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.121691 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s25j\" (UniqueName: \"kubernetes.io/projected/715a121f-d869-4599-8718-2c62220d5e2b-kube-api-access-6s25j\") on node \"crc\" DevicePath \"\"" Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.133317 5033 scope.go:117] "RemoveContainer" containerID="ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f" Feb 26 18:31:53 crc kubenswrapper[5033]: E0226 18:31:53.134919 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f\": container with ID starting with ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f not found: ID does not exist" containerID="ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f" Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.134959 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f"} err="failed to get container status \"ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f\": rpc error: code = NotFound desc = could not find container \"ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f\": container with ID starting with ec0dca446793939b43346871ca6ca9b9be8df0731d3058d23ec670f39deb705f not found: ID does not exist" Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.134986 5033 scope.go:117] "RemoveContainer" containerID="a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9" Feb 26 18:31:53 crc kubenswrapper[5033]: E0226 18:31:53.135498 5033 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9\": container with ID starting with a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9 not found: ID does not exist" containerID="a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9" Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.135530 5033 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9"} err="failed to get container status \"a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9\": rpc error: code = NotFound desc = could not find container \"a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9\": container with ID starting with a3f65550678e8c83775340ff9140702f4e3c686daad9db2aae5d0cbdd36791d9 not found: ID does not exist" Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.265296 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715a121f-d869-4599-8718-2c62220d5e2b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "715a121f-d869-4599-8718-2c62220d5e2b" (UID: "715a121f-d869-4599-8718-2c62220d5e2b"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.326862 5033 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/715a121f-d869-4599-8718-2c62220d5e2b-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 26 18:31:53 crc kubenswrapper[5033]: I0226 18:31:53.643876 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715a121f-d869-4599-8718-2c62220d5e2b" path="/var/lib/kubelet/pods/715a121f-d869-4599-8718-2c62220d5e2b/volumes" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.141997 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535512-s2br7"] Feb 26 18:32:00 crc kubenswrapper[5033]: E0226 18:32:00.142989 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715a121f-d869-4599-8718-2c62220d5e2b" containerName="gather" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.143001 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="715a121f-d869-4599-8718-2c62220d5e2b" containerName="gather" Feb 26 18:32:00 crc kubenswrapper[5033]: E0226 18:32:00.143018 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715a121f-d869-4599-8718-2c62220d5e2b" containerName="copy" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.143024 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="715a121f-d869-4599-8718-2c62220d5e2b" containerName="copy" Feb 26 18:32:00 crc kubenswrapper[5033]: E0226 18:32:00.143053 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" containerName="registry-server" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.143060 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" containerName="registry-server" Feb 26 18:32:00 crc kubenswrapper[5033]: E0226 18:32:00.143079 5033 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" containerName="extract-content" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.143085 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" containerName="extract-content" Feb 26 18:32:00 crc kubenswrapper[5033]: E0226 18:32:00.143100 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" containerName="extract-utilities" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.143106 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" containerName="extract-utilities" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.143325 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a6a21e-1cd8-4d3c-8862-ad4b5642d5a1" containerName="registry-server" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.143364 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="715a121f-d869-4599-8718-2c62220d5e2b" containerName="copy" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.143373 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="715a121f-d869-4599-8718-2c62220d5e2b" containerName="gather" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.144232 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535512-s2br7" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.146854 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.146855 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.148519 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.158418 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535512-s2br7"] Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.269390 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f66jz\" (UniqueName: \"kubernetes.io/projected/5250fdb8-eb10-45a5-ac3a-e39463c226f4-kube-api-access-f66jz\") pod \"auto-csr-approver-29535512-s2br7\" (UID: \"5250fdb8-eb10-45a5-ac3a-e39463c226f4\") " pod="openshift-infra/auto-csr-approver-29535512-s2br7" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.371526 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f66jz\" (UniqueName: \"kubernetes.io/projected/5250fdb8-eb10-45a5-ac3a-e39463c226f4-kube-api-access-f66jz\") pod \"auto-csr-approver-29535512-s2br7\" (UID: \"5250fdb8-eb10-45a5-ac3a-e39463c226f4\") " pod="openshift-infra/auto-csr-approver-29535512-s2br7" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.392317 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f66jz\" (UniqueName: \"kubernetes.io/projected/5250fdb8-eb10-45a5-ac3a-e39463c226f4-kube-api-access-f66jz\") pod \"auto-csr-approver-29535512-s2br7\" (UID: \"5250fdb8-eb10-45a5-ac3a-e39463c226f4\") " 
pod="openshift-infra/auto-csr-approver-29535512-s2br7" Feb 26 18:32:00 crc kubenswrapper[5033]: I0226 18:32:00.467277 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535512-s2br7" Feb 26 18:32:01 crc kubenswrapper[5033]: I0226 18:32:01.028403 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535512-s2br7"] Feb 26 18:32:01 crc kubenswrapper[5033]: I0226 18:32:01.066901 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535512-s2br7" event={"ID":"5250fdb8-eb10-45a5-ac3a-e39463c226f4","Type":"ContainerStarted","Data":"eb1558c62a852113de5ceb20243342b00a568b3c2b99c75a02a34d308c4ea961"} Feb 26 18:32:03 crc kubenswrapper[5033]: I0226 18:32:03.089493 5033 generic.go:334] "Generic (PLEG): container finished" podID="5250fdb8-eb10-45a5-ac3a-e39463c226f4" containerID="99c10dfc5c358560c4ff3367234cd47f0eb0ba04ac3235a688f308548009f939" exitCode=0 Feb 26 18:32:03 crc kubenswrapper[5033]: I0226 18:32:03.089589 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535512-s2br7" event={"ID":"5250fdb8-eb10-45a5-ac3a-e39463c226f4","Type":"ContainerDied","Data":"99c10dfc5c358560c4ff3367234cd47f0eb0ba04ac3235a688f308548009f939"} Feb 26 18:32:05 crc kubenswrapper[5033]: I0226 18:32:04.486917 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535512-s2br7" Feb 26 18:32:05 crc kubenswrapper[5033]: I0226 18:32:04.564722 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f66jz\" (UniqueName: \"kubernetes.io/projected/5250fdb8-eb10-45a5-ac3a-e39463c226f4-kube-api-access-f66jz\") pod \"5250fdb8-eb10-45a5-ac3a-e39463c226f4\" (UID: \"5250fdb8-eb10-45a5-ac3a-e39463c226f4\") " Feb 26 18:32:05 crc kubenswrapper[5033]: I0226 18:32:04.572953 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5250fdb8-eb10-45a5-ac3a-e39463c226f4-kube-api-access-f66jz" (OuterVolumeSpecName: "kube-api-access-f66jz") pod "5250fdb8-eb10-45a5-ac3a-e39463c226f4" (UID: "5250fdb8-eb10-45a5-ac3a-e39463c226f4"). InnerVolumeSpecName "kube-api-access-f66jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:32:05 crc kubenswrapper[5033]: I0226 18:32:04.667919 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f66jz\" (UniqueName: \"kubernetes.io/projected/5250fdb8-eb10-45a5-ac3a-e39463c226f4-kube-api-access-f66jz\") on node \"crc\" DevicePath \"\"" Feb 26 18:32:05 crc kubenswrapper[5033]: I0226 18:32:05.109171 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535512-s2br7" event={"ID":"5250fdb8-eb10-45a5-ac3a-e39463c226f4","Type":"ContainerDied","Data":"eb1558c62a852113de5ceb20243342b00a568b3c2b99c75a02a34d308c4ea961"} Feb 26 18:32:05 crc kubenswrapper[5033]: I0226 18:32:05.109207 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb1558c62a852113de5ceb20243342b00a568b3c2b99c75a02a34d308c4ea961" Feb 26 18:32:05 crc kubenswrapper[5033]: I0226 18:32:05.109235 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535512-s2br7" Feb 26 18:32:05 crc kubenswrapper[5033]: I0226 18:32:05.566700 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535506-xtsfl"] Feb 26 18:32:05 crc kubenswrapper[5033]: I0226 18:32:05.575192 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535506-xtsfl"] Feb 26 18:32:05 crc kubenswrapper[5033]: I0226 18:32:05.639808 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fc8a4fa-1728-4825-87e5-424c0fa32e51" path="/var/lib/kubelet/pods/5fc8a4fa-1728-4825-87e5-424c0fa32e51/volumes" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.683991 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rw5wh"] Feb 26 18:32:09 crc kubenswrapper[5033]: E0226 18:32:09.685793 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5250fdb8-eb10-45a5-ac3a-e39463c226f4" containerName="oc" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.685826 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="5250fdb8-eb10-45a5-ac3a-e39463c226f4" containerName="oc" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.687027 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="5250fdb8-eb10-45a5-ac3a-e39463c226f4" containerName="oc" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.702223 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.702231 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rw5wh"] Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.878871 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-catalog-content\") pod \"redhat-operators-rw5wh\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.879005 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9bhw\" (UniqueName: \"kubernetes.io/projected/89b9025d-3f12-496b-9071-aad3f48da53a-kube-api-access-s9bhw\") pod \"redhat-operators-rw5wh\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.879088 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-utilities\") pod \"redhat-operators-rw5wh\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.980750 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9bhw\" (UniqueName: \"kubernetes.io/projected/89b9025d-3f12-496b-9071-aad3f48da53a-kube-api-access-s9bhw\") pod \"redhat-operators-rw5wh\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.980894 5033 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-utilities\") pod \"redhat-operators-rw5wh\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.980957 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-catalog-content\") pod \"redhat-operators-rw5wh\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.981483 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-catalog-content\") pod \"redhat-operators-rw5wh\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:09 crc kubenswrapper[5033]: I0226 18:32:09.981759 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-utilities\") pod \"redhat-operators-rw5wh\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:10 crc kubenswrapper[5033]: I0226 18:32:10.013207 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9bhw\" (UniqueName: \"kubernetes.io/projected/89b9025d-3f12-496b-9071-aad3f48da53a-kube-api-access-s9bhw\") pod \"redhat-operators-rw5wh\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:10 crc kubenswrapper[5033]: I0226 18:32:10.037514 5033 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:10 crc kubenswrapper[5033]: I0226 18:32:10.624942 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rw5wh"] Feb 26 18:32:11 crc kubenswrapper[5033]: I0226 18:32:11.173604 5033 generic.go:334] "Generic (PLEG): container finished" podID="89b9025d-3f12-496b-9071-aad3f48da53a" containerID="2d0c9e2c06cbadcadfb76ff1b7863cff6bb8abbc938ddc800ef37b690b19e2d6" exitCode=0 Feb 26 18:32:11 crc kubenswrapper[5033]: I0226 18:32:11.173685 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw5wh" event={"ID":"89b9025d-3f12-496b-9071-aad3f48da53a","Type":"ContainerDied","Data":"2d0c9e2c06cbadcadfb76ff1b7863cff6bb8abbc938ddc800ef37b690b19e2d6"} Feb 26 18:32:11 crc kubenswrapper[5033]: I0226 18:32:11.173927 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw5wh" event={"ID":"89b9025d-3f12-496b-9071-aad3f48da53a","Type":"ContainerStarted","Data":"6fd8236307ce62dac2148c240210f874beb4beb7e6ac02c7e59664be0bf1b9b0"} Feb 26 18:32:13 crc kubenswrapper[5033]: I0226 18:32:13.192472 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw5wh" event={"ID":"89b9025d-3f12-496b-9071-aad3f48da53a","Type":"ContainerStarted","Data":"01b1109fe370bdf789c33790b3ebe83f4e47f3ce65ccb9253f96619dbf7af509"} Feb 26 18:32:17 crc kubenswrapper[5033]: I0226 18:32:17.230129 5033 generic.go:334] "Generic (PLEG): container finished" podID="89b9025d-3f12-496b-9071-aad3f48da53a" containerID="01b1109fe370bdf789c33790b3ebe83f4e47f3ce65ccb9253f96619dbf7af509" exitCode=0 Feb 26 18:32:17 crc kubenswrapper[5033]: I0226 18:32:17.230557 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw5wh" 
event={"ID":"89b9025d-3f12-496b-9071-aad3f48da53a","Type":"ContainerDied","Data":"01b1109fe370bdf789c33790b3ebe83f4e47f3ce65ccb9253f96619dbf7af509"} Feb 26 18:32:21 crc kubenswrapper[5033]: I0226 18:32:21.271709 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw5wh" event={"ID":"89b9025d-3f12-496b-9071-aad3f48da53a","Type":"ContainerStarted","Data":"eb57e1836e7ca915e414f9a55bde3663d5e5bcad3ac55a48f1fad4f2b4ed8721"} Feb 26 18:32:21 crc kubenswrapper[5033]: I0226 18:32:21.300006 5033 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rw5wh" podStartSLOduration=2.478815081 podStartE2EDuration="12.299983593s" podCreationTimestamp="2026-02-26 18:32:09 +0000 UTC" firstStartedPulling="2026-02-26 18:32:11.176749213 +0000 UTC m=+10258.389347863" lastFinishedPulling="2026-02-26 18:32:20.997917725 +0000 UTC m=+10268.210516375" observedRunningTime="2026-02-26 18:32:21.291860456 +0000 UTC m=+10268.504459106" watchObservedRunningTime="2026-02-26 18:32:21.299983593 +0000 UTC m=+10268.512582243" Feb 26 18:32:30 crc kubenswrapper[5033]: I0226 18:32:30.038634 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:30 crc kubenswrapper[5033]: I0226 18:32:30.039181 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:30 crc kubenswrapper[5033]: I0226 18:32:30.089673 5033 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:30 crc kubenswrapper[5033]: I0226 18:32:30.405928 5033 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:30 crc kubenswrapper[5033]: I0226 18:32:30.459460 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-rw5wh"] Feb 26 18:32:32 crc kubenswrapper[5033]: I0226 18:32:32.377531 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rw5wh" podUID="89b9025d-3f12-496b-9071-aad3f48da53a" containerName="registry-server" containerID="cri-o://eb57e1836e7ca915e414f9a55bde3663d5e5bcad3ac55a48f1fad4f2b4ed8721" gracePeriod=2 Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.388490 5033 generic.go:334] "Generic (PLEG): container finished" podID="89b9025d-3f12-496b-9071-aad3f48da53a" containerID="eb57e1836e7ca915e414f9a55bde3663d5e5bcad3ac55a48f1fad4f2b4ed8721" exitCode=0 Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.388842 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw5wh" event={"ID":"89b9025d-3f12-496b-9071-aad3f48da53a","Type":"ContainerDied","Data":"eb57e1836e7ca915e414f9a55bde3663d5e5bcad3ac55a48f1fad4f2b4ed8721"} Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.388881 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rw5wh" event={"ID":"89b9025d-3f12-496b-9071-aad3f48da53a","Type":"ContainerDied","Data":"6fd8236307ce62dac2148c240210f874beb4beb7e6ac02c7e59664be0bf1b9b0"} Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.388899 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fd8236307ce62dac2148c240210f874beb4beb7e6ac02c7e59664be0bf1b9b0" Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.409707 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.515068 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9bhw\" (UniqueName: \"kubernetes.io/projected/89b9025d-3f12-496b-9071-aad3f48da53a-kube-api-access-s9bhw\") pod \"89b9025d-3f12-496b-9071-aad3f48da53a\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.515110 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-utilities\") pod \"89b9025d-3f12-496b-9071-aad3f48da53a\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.515204 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-catalog-content\") pod \"89b9025d-3f12-496b-9071-aad3f48da53a\" (UID: \"89b9025d-3f12-496b-9071-aad3f48da53a\") " Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.516120 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-utilities" (OuterVolumeSpecName: "utilities") pod "89b9025d-3f12-496b-9071-aad3f48da53a" (UID: "89b9025d-3f12-496b-9071-aad3f48da53a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.573079 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b9025d-3f12-496b-9071-aad3f48da53a-kube-api-access-s9bhw" (OuterVolumeSpecName: "kube-api-access-s9bhw") pod "89b9025d-3f12-496b-9071-aad3f48da53a" (UID: "89b9025d-3f12-496b-9071-aad3f48da53a"). InnerVolumeSpecName "kube-api-access-s9bhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.617419 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9bhw\" (UniqueName: \"kubernetes.io/projected/89b9025d-3f12-496b-9071-aad3f48da53a-kube-api-access-s9bhw\") on node \"crc\" DevicePath \"\"" Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.617486 5033 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.663353 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89b9025d-3f12-496b-9071-aad3f48da53a" (UID: "89b9025d-3f12-496b-9071-aad3f48da53a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 18:32:33 crc kubenswrapper[5033]: I0226 18:32:33.719900 5033 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b9025d-3f12-496b-9071-aad3f48da53a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 18:32:34 crc kubenswrapper[5033]: I0226 18:32:34.398946 5033 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rw5wh" Feb 26 18:32:34 crc kubenswrapper[5033]: I0226 18:32:34.437890 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rw5wh"] Feb 26 18:32:34 crc kubenswrapper[5033]: I0226 18:32:34.451032 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rw5wh"] Feb 26 18:32:35 crc kubenswrapper[5033]: I0226 18:32:35.659975 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b9025d-3f12-496b-9071-aad3f48da53a" path="/var/lib/kubelet/pods/89b9025d-3f12-496b-9071-aad3f48da53a/volumes" Feb 26 18:32:48 crc kubenswrapper[5033]: I0226 18:32:48.396495 5033 scope.go:117] "RemoveContainer" containerID="d777bf7a28fa0ad80b57eb1fc7d76e9c65f39872fd707258ad545e8aa9d330d2" Feb 26 18:33:20 crc kubenswrapper[5033]: I0226 18:33:20.189093 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:33:20 crc kubenswrapper[5033]: I0226 18:33:20.189681 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 18:33:50 crc kubenswrapper[5033]: I0226 18:33:50.189119 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 18:33:50 crc kubenswrapper[5033]: I0226 
18:33:50.189875 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.153670 5033 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535514-dv4jm"]
Feb 26 18:34:00 crc kubenswrapper[5033]: E0226 18:34:00.154865 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b9025d-3f12-496b-9071-aad3f48da53a" containerName="extract-utilities"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.154882 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b9025d-3f12-496b-9071-aad3f48da53a" containerName="extract-utilities"
Feb 26 18:34:00 crc kubenswrapper[5033]: E0226 18:34:00.154928 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b9025d-3f12-496b-9071-aad3f48da53a" containerName="extract-content"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.154937 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b9025d-3f12-496b-9071-aad3f48da53a" containerName="extract-content"
Feb 26 18:34:00 crc kubenswrapper[5033]: E0226 18:34:00.154956 5033 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b9025d-3f12-496b-9071-aad3f48da53a" containerName="registry-server"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.154963 5033 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b9025d-3f12-496b-9071-aad3f48da53a" containerName="registry-server"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.155203 5033 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b9025d-3f12-496b-9071-aad3f48da53a" containerName="registry-server"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.156160 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535514-dv4jm"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.158428 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.158707 5033 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.158873 5033 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tzgvw"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.166519 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535514-dv4jm"]
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.219044 5033 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45ljj\" (UniqueName: \"kubernetes.io/projected/2860a3c2-f55d-4f80-ac9c-03df96a960f3-kube-api-access-45ljj\") pod \"auto-csr-approver-29535514-dv4jm\" (UID: \"2860a3c2-f55d-4f80-ac9c-03df96a960f3\") " pod="openshift-infra/auto-csr-approver-29535514-dv4jm"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.322505 5033 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45ljj\" (UniqueName: \"kubernetes.io/projected/2860a3c2-f55d-4f80-ac9c-03df96a960f3-kube-api-access-45ljj\") pod \"auto-csr-approver-29535514-dv4jm\" (UID: \"2860a3c2-f55d-4f80-ac9c-03df96a960f3\") " pod="openshift-infra/auto-csr-approver-29535514-dv4jm"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.341343 5033 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45ljj\" (UniqueName: \"kubernetes.io/projected/2860a3c2-f55d-4f80-ac9c-03df96a960f3-kube-api-access-45ljj\") pod \"auto-csr-approver-29535514-dv4jm\" (UID: \"2860a3c2-f55d-4f80-ac9c-03df96a960f3\") " pod="openshift-infra/auto-csr-approver-29535514-dv4jm"
Feb 26 18:34:00 crc kubenswrapper[5033]: I0226 18:34:00.479621 5033 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535514-dv4jm"
Feb 26 18:34:01 crc kubenswrapper[5033]: I0226 18:34:01.529867 5033 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535514-dv4jm"]
Feb 26 18:34:02 crc kubenswrapper[5033]: I0226 18:34:02.311764 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535514-dv4jm" event={"ID":"2860a3c2-f55d-4f80-ac9c-03df96a960f3","Type":"ContainerStarted","Data":"a3c194fdc062eac0ed07e70281a678d86c8c3462f1663eba2b50cef4e859aa88"}
Feb 26 18:34:04 crc kubenswrapper[5033]: I0226 18:34:04.340450 5033 generic.go:334] "Generic (PLEG): container finished" podID="2860a3c2-f55d-4f80-ac9c-03df96a960f3" containerID="cb67fcdd2465e1c6ed68066259ac6a9a737fe2b04df99b56e4af65fff35a49ec" exitCode=0
Feb 26 18:34:04 crc kubenswrapper[5033]: I0226 18:34:04.340568 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535514-dv4jm" event={"ID":"2860a3c2-f55d-4f80-ac9c-03df96a960f3","Type":"ContainerDied","Data":"cb67fcdd2465e1c6ed68066259ac6a9a737fe2b04df99b56e4af65fff35a49ec"}
Feb 26 18:34:05 crc kubenswrapper[5033]: I0226 18:34:05.748543 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535514-dv4jm"
Feb 26 18:34:05 crc kubenswrapper[5033]: I0226 18:34:05.860696 5033 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45ljj\" (UniqueName: \"kubernetes.io/projected/2860a3c2-f55d-4f80-ac9c-03df96a960f3-kube-api-access-45ljj\") pod \"2860a3c2-f55d-4f80-ac9c-03df96a960f3\" (UID: \"2860a3c2-f55d-4f80-ac9c-03df96a960f3\") "
Feb 26 18:34:05 crc kubenswrapper[5033]: I0226 18:34:05.867648 5033 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2860a3c2-f55d-4f80-ac9c-03df96a960f3-kube-api-access-45ljj" (OuterVolumeSpecName: "kube-api-access-45ljj") pod "2860a3c2-f55d-4f80-ac9c-03df96a960f3" (UID: "2860a3c2-f55d-4f80-ac9c-03df96a960f3"). InnerVolumeSpecName "kube-api-access-45ljj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 18:34:05 crc kubenswrapper[5033]: I0226 18:34:05.962915 5033 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45ljj\" (UniqueName: \"kubernetes.io/projected/2860a3c2-f55d-4f80-ac9c-03df96a960f3-kube-api-access-45ljj\") on node \"crc\" DevicePath \"\""
Feb 26 18:34:06 crc kubenswrapper[5033]: I0226 18:34:06.360397 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535514-dv4jm" event={"ID":"2860a3c2-f55d-4f80-ac9c-03df96a960f3","Type":"ContainerDied","Data":"a3c194fdc062eac0ed07e70281a678d86c8c3462f1663eba2b50cef4e859aa88"}
Feb 26 18:34:06 crc kubenswrapper[5033]: I0226 18:34:06.360447 5033 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3c194fdc062eac0ed07e70281a678d86c8c3462f1663eba2b50cef4e859aa88"
Feb 26 18:34:06 crc kubenswrapper[5033]: I0226 18:34:06.360511 5033 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535514-dv4jm"
Feb 26 18:34:06 crc kubenswrapper[5033]: I0226 18:34:06.835616 5033 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535508-bv7qc"]
Feb 26 18:34:06 crc kubenswrapper[5033]: I0226 18:34:06.850149 5033 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535508-bv7qc"]
Feb 26 18:34:07 crc kubenswrapper[5033]: I0226 18:34:07.639405 5033 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa" path="/var/lib/kubelet/pods/f1e4aa7f-4a0e-47b6-8a96-9f38cbffc8fa/volumes"
Feb 26 18:34:20 crc kubenswrapper[5033]: I0226 18:34:20.188998 5033 patch_prober.go:28] interesting pod/machine-config-daemon-zxss5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 18:34:20 crc kubenswrapper[5033]: I0226 18:34:20.189602 5033 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 18:34:20 crc kubenswrapper[5033]: I0226 18:34:20.189649 5033 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxss5"
Feb 26 18:34:20 crc kubenswrapper[5033]: I0226 18:34:20.190471 5033 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0f12fead1855a694b2a5bc3fc00756962d95e4674bee7f9adb994587bd54369c"} pod="openshift-machine-config-operator/machine-config-daemon-zxss5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 18:34:20 crc kubenswrapper[5033]: I0226 18:34:20.190537 5033 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" podUID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerName="machine-config-daemon" containerID="cri-o://0f12fead1855a694b2a5bc3fc00756962d95e4674bee7f9adb994587bd54369c" gracePeriod=600
Feb 26 18:34:20 crc kubenswrapper[5033]: I0226 18:34:20.493090 5033 generic.go:334] "Generic (PLEG): container finished" podID="3b30da39-cf26-480f-82fc-dbb869e9acbd" containerID="0f12fead1855a694b2a5bc3fc00756962d95e4674bee7f9adb994587bd54369c" exitCode=0
Feb 26 18:34:20 crc kubenswrapper[5033]: I0226 18:34:20.493173 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerDied","Data":"0f12fead1855a694b2a5bc3fc00756962d95e4674bee7f9adb994587bd54369c"}
Feb 26 18:34:20 crc kubenswrapper[5033]: I0226 18:34:20.493474 5033 scope.go:117] "RemoveContainer" containerID="c8c23112e08edb87261dba591871f662b118ab41577e549b6934d2b9b6eff74f"
Feb 26 18:34:21 crc kubenswrapper[5033]: I0226 18:34:21.504381 5033 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxss5" event={"ID":"3b30da39-cf26-480f-82fc-dbb869e9acbd","Type":"ContainerStarted","Data":"de4d42ff8791a9d81ce66888a9b3597f803c988d9abdb74c0fd4848a1bfd5cae"}
Feb 26 18:34:48 crc kubenswrapper[5033]: I0226 18:34:48.500920 5033 scope.go:117] "RemoveContainer" containerID="91f7110e941e74cd7a0fc04545239df308942d242c5571ecfdac36bed3295ec7"